By Erich Clementi
In the technology world, the search is constantly on for the next big thing. People are looking to the future – trying to predict needs and trends. Start-ups strive to become the next global disruptor.
There’s a technology revolution being talked about in Europe, and this one is focused largely on turning the heritage of the past into a game-changer for the future. Europe’s manufacturing sector has always been the envy of the world in delivering high quality products. Despite weathering a generation of turmoil, premium manufacturing in Europe remains a significant asset – the fact that the world’s three best-selling luxury car brands are designed and manufactured in Europe exemplifies this. Continue Reading »
By Jeff Schick
Twenty years is a long time. But it can seem like a millennium in the world of tech, especially when you consider the myriad advances that have come along over the years and the speed with which they’ve transformed business and society.
But among the revolutions and evolutions, mobile technology is arguably one of the most interesting. Not only have device and network technologies evolved rapidly, leading to amazing scale and adoption – and transforming societies and industries along the way – but the function and use of the device continues to change. Continue Reading »
By Masaaki Tanaka
When I came to work for IBM as a designer in the Tokyo Interactive Experience’s User Centered Design lab last September, I expected to focus on enterprise computing. But, much to my surprise, the project I’m working on now for an IBM client has me imagining the digital lifestyles of a certain class of individuals – Japan’s senior citizens.
In fact, the target customer for Japan Post’s just-launched online Watch Over service is my own father. My dad is a 75-year-old retiree who lives alone in a rural area in Saga Prefecture, in the south of Japan. He has never touched a computer. He rides a bike rather than driving a car, so he’s cut off from his friends and it takes him 20 minutes to pedal to the nearest convenience store. I hate to think what would happen if he had a medical emergency. Continue Reading »
By Mark Ritter
In 1981, Nobel Prize winner Richard Feynman challenged computer scientists to develop a new breed of computers based on quantum physics. Ever since then, scientists have been grappling with the difficulty of meeting that grand challenge.
Employing quantum physics for computation is difficult in part because quantum information is very fragile, requiring the quantum elements to be cooled to near absolute zero temperature and shielded from electromagnetic radiation to minimize errors. This is so immensely different from our current approach to computation that the entire infrastructure of computing must be re-imagined and re-engineered.
Still, the challenges haven’t stopped physicists and computer scientists from trying, and an enormous amount of progress is being made. In fact, I believe we’re entering what will come to be seen as the golden age of quantum computing research.
By Anne Altman
While the National Oceanic and Atmospheric Administration (NOAA) was officially created in 1970, its roots go back more than 200 years. The agencies that came together to form NOAA represent some of the oldest federal agencies. So much history, so much research, so much science, so much data…so little time.
Every day, NOAA gathers more than 20 terabytes of data from Doppler radar systems, weather satellites, buoy networks and stations, tide gauges, real-time weather stations, ships and aircraft. That equates to creating more than twice the data contained in the United States Library of Congress – every day. Yes, data is our greatest natural resource, but like any natural resource, its power is only useful if it can be refined. Continue Reading »
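Taken at face value, the Library of Congress comparison is simple arithmetic. A minimal sketch (the roughly 10-terabyte estimate for the Library of Congress collection is a commonly cited figure, assumed here purely for illustration):

```python
# Back-of-the-envelope scale of NOAA's daily data intake.
daily_intake_tb = 20     # terabytes gathered per day (from the post)
loc_estimate_tb = 10     # assumed: commonly cited ~10 TB Library of Congress estimate

print(daily_intake_tb / loc_estimate_tb)  # 2.0 "Libraries of Congress" per day
print(daily_intake_tb * 365)              # 7300 TB, roughly 7 petabytes per year
```

At that pace, refining the data matters as much as collecting it: a year of intake is on the order of petabytes before any derived products are counted.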
By Jack Wells
Here at the Oak Ridge Leadership Computing Facility (OLCF) in East Tennessee, deploying the next top supercomputer for open science is akin to an ambitious hike in the Smoky Mountains: once one towering crest is reached, the next one appears within sight.
Just 18 months after the OLCF brought Titan—then the fastest supercomputer in the world—to full operation for users in May 2013, we announced a contract with IBM to create the next big machine: Summit.
Summit will expand on Titan’s groundbreaking hybrid architecture to deliver several times the computational power of the 27-petaflop Titan. Continue Reading »
By Maria B. Winans
When we set out to create a fun and educational program that would spotlight millennial start-ups that were founded with a social conscience to improve society, I had no idea we would encounter such intense levels of passion, commitment and clarity of thought.
But that’s exactly what we got.
This week we kick off the second phase of the New Way to Startup competition and webisode series: a five-day accelerator in which five young start-ups compete to see which one can produce the biggest breakthrough for its company using the latest social and analytics tools and leveraging expert advice from onsite business pros. Continue Reading »
By John E. Kelly III
It’s amazing for me to recall that in 1980, when I came to IBM Research out of graduate school, engineers were striving to design chips containing 100,000 transistors – those tiny electronic switches that process and store data. Today, it’s common to put five or six billion transistors on a sliver of silicon.
This remarkable achievement is the fulfillment of a prediction made in 1965 by industry pioneer Gordon Moore: that the number of components on a chip would double every year for the foreseeable future. He later amended the time period to 24 months. His predictions, codified as Moore’s Law, have come to symbolize the seemingly inevitable march of technological progress–the ability to make all sorts of electronic devices faster, smaller and more energy efficient.
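The growth described here, from 100,000 transistors in 1980 to billions today, follows from simple compound doubling. A minimal sketch (the function name and the 2015 endpoint are illustrative choices, not from the post):

```python
# Moore's Law as compound doubling: transistor counts double roughly
# every 24 months, per Gordon Moore's amended prediction.
def transistors(start_count, start_year, year, doubling_years=2):
    """Project a transistor count forward assuming steady doubling."""
    periods = (year - start_year) / doubling_years
    return start_count * 2 ** periods

projected = transistors(100_000, 1980, 2015)
print(f"{projected:,.0f}")
```

Doubling every 24 months from 100,000 transistors in 1980 projects to roughly 18 billion by 2015, noticeably above the five or six billion on actual chips, which squares with the observation that the industry is no longer clearing the bar Moore set.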
While Gordon’s prediction proved to be more prescient than he could have imagined, today, 50 years later, the chip industry is no longer able to clear the high bar he set, due largely to limits imposed by the laws of physics. To put things bluntly: Moore’s Law is hitting a wall, and that collision holds significant consequences for business and society. Unless scientists and engineers come up with bold new approaches to chip architectures and materials, technological progress will slow.
To accelerate progress, we need to invent the next switch.
By Anjul Bhambhri
It’s estimated that 2.5 quintillion bytes of data are created every day from sources such as email and collaboration tools, posts to social media sites, digital pictures and videos, and purchase transactions, to name just a few.
As the tools for making sense of Big Data become more widely – and more expertly – applied, and the types of data that are available for analysis diversify, the opportunity to use Big Data for social good intensifies. These massive datasets can be leveraged to better serve both the billions of people who generate the data, and ultimately the societies in which they live. Continue Reading »