By Guruduth Banavar
With thousands of scientists, engineers, and business leaders focused on cognitive computing across IBM Research and the IBM Watson Group, IBM is pursuing the most comprehensive effort in the tech industry to advance into the new era of computing. Nobody has more people working on it, a broader array of research and development projects, or deeper expertise in so many of the most significant fields of inquiry.
Yet we understand that to accelerate progress in cognitive computing, we can’t do this alone. That’s why IBM has been pursuing a strategy of forming deep collaborative partnerships with academic scientists who are among the leaders in their fields as well as opening Watson as a technology platform for others to build on.
Humans have long dreamed of creating machines that think. More than 100 years before the first programmable computer was built, inventors wondered whether devices made of rods and gears might become intelligent. And when Alan Turing, one of the pioneers of computing in the 1940s, set a goal for computer science, he described a test, later dubbed the Turing Test, which measured a computer’s performance against the behavior of humans.
In the early days of my academic field, artificial intelligence, scientists tackled problems that were difficult for humans but relatively easy for computers–such as large-scale mathematical calculations. In more recent years, the field has taken on tasks that are easy for people to perform but hard to describe to a machine–tasks humans solve “without thinking,” such as recognizing spoken words or faces in a crowd.
By Mukesh Khare
It’s an important moment in the history of the electronics industry. Researchers from IBM Research, SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering, and partners including GlobalFoundries and Samsung have produced advances that will enable the semiconductor industry to pack about twice as many transistors onto the chips that power everything from data-crunching servers to mobile devices.
Working together, we achieved an industry first–producing, at New York’s SUNY NanoTech Complex near Albany, working test chips whose smallest features approach 7 nanometers. As a result, the industry will be able to place more than 20 billion tiny switches on chips the size of a fingernail.
By Harry Kolar
New York’s Lake George is a pristine, 32-mile-long lake in the Adirondack Mountains that is noted for its water quality and clarity. While the lake is very clean, it faces multiple anthropogenic threats, including road salt incursion and several invasive species.
The Jefferson Project at Lake George, a joint research collaboration involving Rensselaer Polytechnic Institute, IBM Research, and the FUND for Lake George, is focused on protecting the lake and helping address the world’s looming freshwater supply challenges.
The project involves more than 60 scientists around the world (four IBM Research labs are involved), including biologists, computer scientists, physicists, engineers and chemists. Working as a virtual team, we’re pushing the boundaries in Internet-of-Things sensors, data analytics, and modeling of complex natural systems.
By Arun Kumar
As a former entrepreneur and management consultant with deep roots in Silicon Valley, I understand from experience the importance of innovation to fostering economic growth and dynamism.
This week, I’m accompanying Secretary of Transportation Anthony Foxx on a U.S. trade mission to Africa to assist U.S. companies with doing business in Sub-Saharan Africa.
During the past week, I have seen first-hand how the spirit of entrepreneurship is thriving across the region – from South Africa, to Mozambique, to Kenya, where I am today. Dozens of tech hubs are popping up in African cities.
By Steve Hamm
Chief Storyteller, IBM
Wendy Hite is a bit of a food snob. She grew up in Southwest Louisiana, where food and family are all mixed up in the great gumbo of life, and, for the longest time, she couldn’t imagine how she could improve on traditional Cajun-style cooking.
Until she met Chef Watson, that is.
She used the cognitive cooking discovery program to develop a crawfish deviled egg dish that was mighty tasty–familiar, in some ways, but also new to her. “This has been fun,” she says. “It gets you to try new things and to be more creative than you normally would be.”
By Ron Ambrosio
You walk into a room at night and flip the light switch on the wall. The lights come on. You didn’t think twice about that … you were certain it would work. While we’re not at that point everywhere in the world yet, it is true of most industrialized regions that electricity is a highly reliable resource. But the reality behind that simple action of turning on a light switch is a constantly evolving list of uncertainties that utilities deal with 24/7.
Uncertainty takes many forms in the utility industry, from the health of individual devices as they age, to volatility of fuel prices, to the behavior of you, the consumer, and your use of electricity or natural gas. And uncertainty can be equated to risk — the risk of failing to achieve both operational and business objectives. That’s not a risk any business wants to take.
By Dr. Daniel Oehme
Over the millennia, our ability to utilise plants in many different ways has allowed us to flourish as a species. Most importantly, plants turn our waste carbon dioxide into oxygen.
But we have also used plants to provide shelter, to publish and transmit information on paper, and as a food source. In fact, developing new ways to utilise plants has even led to population explosions throughout time, such as when we first developed granaries to store grain thousands of years ago. In these modern times of climate change, global warming, ever-increasing populations and fossil fuels, plants have never been more important.
By Arvind Krishna
Chemists at Unilever, the Anglo-Dutch consumer products giant, used to spend up to three months in their laboratories creating new formulations for liquid cleaning products. Now, they can perform the same work in 45 minutes or less–thanks to a collaboration between Unilever, one of the United Kingdom’s national laboratories and IBM.
Unilever product developers use iPads to set up tests and experiments, run simulations on an IBM Blue Gene/Q supercomputer at the UK’s Hartree Centre lab, and see their results in 3D visualizations that help them explore the data and make discoveries that otherwise might elude them.
This is an example of what’s possible when government, businesses and tech companies combine forces to bring the power of supercomputing and sophisticated data analytics to bear on business problems. It’s also an example of the kind of collaboration I expect to see flourish as a result of an agreement IBM is announcing today with Britain’s Science and Technology Facilities Council.
By Mark Ritter
In 1981, Nobel Prize winner Richard Feynman challenged computer scientists to develop a new breed of computers based on quantum physics. Ever since then, scientists have been grappling with the difficulty of meeting that grand challenge.
Employing quantum physics for computation is difficult in part because quantum information is very fragile, requiring the quantum elements to be cooled to temperatures near absolute zero and shielded from electromagnetic radiation to minimize errors. This is so immensely different from our current approach to computation that the entire infrastructure of computing must be re-imagined and re-engineered.
Still, the challenges haven’t stopped physicists and computer scientists from trying, and an enormous amount of progress is being made. In fact, I believe we’re entering what will come to be seen as the golden age of quantum computing research.