By Harry Kolar
New York’s Lake George is a pristine, 32-mile-long lake in the Adirondack Mountains that is noted for its water quality and clarity. While the lake is very clean, it faces multiple anthropogenic threats, including road salt incursion and several invasive species.
The Jefferson Project at Lake George, a joint research collaboration involving Rensselaer Polytechnic Institute, IBM Research, and the FUND for Lake George, is focused on protecting the lake and helping address the world’s looming freshwater supply challenges.
The project involves more than 60 scientists around the world (four IBM Research labs are involved), including biologists, computer scientists, physicists, engineers and chemists. Working as a virtual team, we’re pushing the boundaries in Internet-of-Things sensors, data analytics, and modeling of complex natural systems.
By Arun Kumar
As a former entrepreneur and management consultant with deep roots in Silicon Valley, I understand from experience the importance of innovation to fostering economic growth and dynamism.
This week, I’m accompanying Secretary of Transportation Anthony Foxx on a U.S. trade mission to Africa to assist U.S. companies with doing business in Sub-Saharan Africa.
During the past week, I have seen first-hand how the spirit of entrepreneurship is thriving across the region – from South Africa, to Mozambique, to Kenya where I am today. Dozens of tech hubs are popping up in African cities.
By Steve Hamm
Chief Storyteller, IBM
Wendy Hite is a bit of a food snob. She grew up in southwest Louisiana, where food and family are all mixed up in the great gumbo of life, and, for the longest time, she couldn’t imagine how she could improve on traditional Cajun-style cooking.
Until she met Chef Watson, that is.
She used the cognitive cooking discovery program to develop a crawfish deviled egg dish that was mighty tasty: familiar, in some ways, but also new to her. “This has been fun,” she says. “It gets you to try new things and to be more creative than you normally would be.”
By Ron Ambrosio
You walk into a room at night and flip the light switch on the wall. The lights come on. You didn’t think twice about that; you were certain it would work. While we’re not at that point everywhere in the world yet, it is true of most industrialized regions that electricity is a highly reliable resource. But the reality behind that simple action of turning on a light switch is a constantly evolving list of uncertainties that utilities deal with 24/7.
Uncertainty takes many forms in the utility industry, from the health of individual devices as they age, to volatility of fuel prices, to the behavior of you, the consumer, and your use of electricity or natural gas. And uncertainty can be equated to risk — the risk of failing to achieve both operational and business objectives. That’s not a risk any business wants to take.
By Dr. Daniel Oehme
Over the millennia, our ability to utilise plants in many different ways has allowed us to flourish as a species. Most importantly, they turn our waste carbon dioxide into oxygen.
But we have also used plants to provide shelter, to publish and transmit information on paper and as a food source. In fact, developing new ways to utilise plants has even led to population explosions throughout history, such as when we first developed granaries to store grain thousands of years ago. In these modern times of climate change, global warming, ever-increasing populations and fossil fuels, plants have never been more important.
By Arvind Krishna
Chemists at Unilever, the Anglo-Dutch consumer products giant, used to spend up to three months in their laboratories creating new formulations for liquid cleaning products. Now, they can perform the same work in 45 minutes or less, thanks to a collaboration between Unilever, one of the United Kingdom’s national laboratories and IBM.
Unilever product developers use iPads to set up tests and experiments, run simulations on an IBM Blue Gene/Q supercomputer at the UK’s Hartree Centre lab, and see their results in 3D visualizations that help them explore the data and make discoveries that otherwise might elude them.
This is an example of what’s possible when government, businesses and tech companies combine forces to bring the power of supercomputing and sophisticated data analytics to bear on business problems. It’s also an example of the kind of collaboration I expect to see flourish as a result of an agreement IBM is announcing today with Britain’s Science and Technology Facilities Council.
By Mark Ritter
In 1981, Nobel Prize winner Richard Feynman challenged computer scientists to develop a new breed of computers based on quantum physics. Ever since then, scientists have been grappling with the difficulty of attaining such a grand challenge.
Employing quantum physics for computation is difficult in part because quantum information is very fragile, requiring the quantum elements to be cooled to near absolute zero and shielded from electromagnetic radiation to minimize errors. This is so immensely different from our current approach to computation that the entire infrastructure of computing must be re-imagined and re-engineered.
Still, the challenges haven’t stopped physicists and computer scientists from trying, and an enormous amount of progress is being made. In fact, I believe we’re entering what will come to be seen as the golden age of quantum computing research.
By John E. Kelly III
It’s amazing for me to recall that in 1980, when I came to IBM Research out of graduate school, engineers were striving to design chips containing 100,000 transistors, those tiny electronic switches that process and store data. Today, it’s common to put five or six billion transistors on a sliver of silicon.
This remarkable achievement is the fulfillment of a prediction made in 1965 by industry pioneer Gordon Moore: that the number of components on a chip would double every year for the foreseeable future. He later amended the time period to 24 months. His predictions, codified as Moore’s Law, have come to symbolize the seemingly inevitable march of technological progress–the ability to make all sorts of electronic devices faster, smaller and more energy efficient.
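The figures above allow a quick back-of-the-envelope check of the doubling rate. The sketch below (a rough illustration, taking the passage's "100,000 transistors in 1980" and a conservative 5 billion today, and assuming "today" means 2015, 50 years after Moore's 1965 paper) computes how many doublings that growth represents and the implied doubling period:

```python
import math

# Figures from the passage: ~100,000 transistors per chip in 1980,
# roughly 5 billion today (taken here as 2015 -- an assumption).
start_count, end_count = 1e5, 5e9
years = 2015 - 1980

# Number of times the transistor count doubled over that span.
doublings = math.log2(end_count / start_count)

# Average time per doubling, in months.
months_per_doubling = years * 12 / doublings

print(round(doublings, 1))         # about 15.6 doublings
print(round(months_per_doubling))  # about 27 months per doubling
```

The result, roughly 27 months per doubling, lands close to Moore's amended 24-month period, which is why the 1980-to-today comparison reads as a fulfillment of his prediction.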
While Gordon’s prediction proved to be more prescient than he could have imagined, today, 50 years later, the chip industry is no longer able to clear the high bar he set, due largely to limits imposed by the laws of physics. To put things bluntly: Moore’s Law is hitting a wall, and that collision holds significant consequences for business and society. Unless scientists and engineers come up with bold new approaches to chip architectures and materials, technological progress will slow.
To accelerate progress, we need to invent the next switch.
By Chris Nay
On average, IBM bestows its top technical rank of Fellow upon only five employees per year. That adds up to 257 who have earned the title over the program’s 52-year history. And those who hold it are recognized not only across IBM, but throughout the industry and around the world for leading innovation that will change the future.
Fellows program founder Gardiner Tucker recently recalled an example of the impact and scope of the program when discussing the work of Nathaniel Rochester, Fellow class of 1967.