You can help design this world of the future—where machines learn, reason and interact with people in ways that are more natural to us. As a scientist, an engineer, a marketer, or an entrepreneur, your skills and ideas will be essential for inventing the new era. For consumers of technology, social networking gives you a seat at the table where the future is being designed. Your voices will shape the thinking of technologists and the services they offer to you. This year’s 5 in 5 predictions of innovations that will help transform your world are just a taste of what is to come.
To stimulate the conversation between technology creator and consumer, we’re calling on readers to suggest their own novel and perhaps even earth-shifting ideas for putting cognitive systems to work on everybody’s behalf. If you have an idea that gets you jazzed, please submit it as a comment at the end of this blog post. We’ll review the comments and highlight the best of them in future posts. The people with the most intriguing ideas get free Watson T-Shirts!
To learn more about the new era of computing, read Smart Machines: IBM’s Watson and the Era of Cognitive Computing.
Check your calendars for tomorrow morning, and plan on coming back to view this year’s 5 in 5: predictions of five innovations that will rock your world within five years. Chosen by IBM Research scientists, this year’s innovations are rooted not in gee-whiz visions of the future but in projects we have underway in our labs today.
Each of the predictions will explore an aspect of one of the most important changes coming to computing–the ability of machines to learn from their interactions with data and people. Such learning is part of the era of cognitive computing, which got its start with Watson’s victory on the TV quiz show Jeopardy! Increasingly, machines will learn, reason, make predictions and interact with people in ways that are more natural to us.
By Uyi Stewart
In an interview with Wired magazine, the English musician Brian Eno complained that there is not enough Africa in computers.
“How does one Africanize…or otherwise liberate a computer?” he wanted to know.
Maybe Brian would like to visit us at our new research laboratory in Nairobi, because this is more or less what we are doing. Although our focus is not to build computers, per se, we are building technology solutions for Africa, with a uniquely African flavour. Africanized solutions, if you like.
IBM Research—Africa officially opens its doors next week. It’s our 12th global research laboratory, and the first in Africa. It feels like a pivotal moment. It certainly is for me.
By Laura Haas
One year ago, when I was talking to medical researchers in Texas about potential research collaborations, I experienced one of those great aha! moments that scientists live for. I had mentioned work we were doing using text analytics on the medical literature to accelerate drug discovery. One of the researchers I was speaking to connected the dots between that project and an element of IBM’s Watson technology: its ability to generate hypotheses.
He said: “Why not combine these technologies to help predict the next promising experiment that could be undertaken in any line of scientific inquiry?” Out of that revelation came a deep collaboration between Baylor College of Medicine and IBM to accelerate the discovery of new drugs to treat and cure diseases such as cancer, Alzheimer’s and ALS.
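To make the idea concrete, here is a minimal, purely illustrative sketch in Python. It is my own toy example, not the Baylor/IBM system: the abstracts, the disease term and the candidate targets are all invented, and the ranking is a crude co-occurrence count standing in for the far richer hypothesis generation described above.

```python
# Hypothetical sketch: rank candidate targets by how often they co-occur
# with a disease term in literature abstracts. The data below is made up
# purely for illustration; this is not the Baylor/IBM system.
from collections import Counter

abstracts = [
    "p53 mutation is implicated in many cancers; kinase X inhibitors show promise.",
    "We observe p53 interaction with kinase X in cancer tissue samples.",
    "Kinase Y activity was unchanged in cancer cell lines.",
]

disease_term = "cancer"
candidate_targets = ["kinase x", "kinase y", "p53"]

# Count the abstracts in which the disease term and a candidate target co-occur.
co_mentions = Counter()
for text in abstracts:
    lowered = text.lower()
    if disease_term in lowered:
        for target in candidate_targets:
            if target in lowered:
                co_mentions[target] += 1

# Higher co-mention counts suggest which experiment might be worth trying first.
for target, count in co_mentions.most_common():
    print(f"{target}: {count} co-mentioning abstracts")
```

A real system would replace the substring matching with proper entity recognition and the counting with statistical and semantic models, but the shape of the idea is the same: mine the literature for signals, then rank candidate hypotheses for experimentalists to test.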
By Steve Hamm
IBM hosted the Cognitive Systems Colloquium at its famed Thomas J. Watson Research Center in Yorktown Heights, N.Y., on Oct. 2, 2013. The all-day event brought together leaders in science, technology and psychology to discuss the coming era of cognitive computing and to craft a shared agenda among industry, academia and government.
The following is a time-stamped stream of live updates and insights from the event, from presenters including Nobel laureate Daniel Kahneman, AI pioneer Danny Hillis, Irving Wladawsky-Berger, a visiting professor at MIT and Imperial College, and others.
By Richard Silberman, Writer/Researcher, IBM Communications
Andy Stanford-Clark built his first sensor when he was six years old to alert his mom if it started raining after she had hung the wash out to dry. His “rain detector” involved nothing more than a few copper strips on a small board that attached to the clothesline and a little box in the house that beeped, alerting her to bring in the laundry.
Already at that young age, Stanford-Clark was able to recognize a problem and devise a simple solution. Today, 40 years later, he is still doing the same thing, but on a much grander scale.
By Chris Sciacca
IBM researcher Ton Engbersen believes scientists will very soon be able to build a computer that is comparable in complexity to the human brain. But that is only half the story. He also wants to teach such a machine to learn like the brain. And that is where it gets really interesting.
Ton is referring to cognitive computing – the ability of machines to sense, reason, learn, and, in some ways, think. Learning is a key element. These computers will not be based on programs that predetermine every answer or action needed to perform a task; rather, they will be trained with algorithms and through interactions with data and humans.
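As a toy illustration of that distinction (my own sketch, not IBM’s; the sensor readings and thresholds are invented), compare a rule fixed in advance by a programmer with a rule derived from the data itself:

```python
# Illustrative contrast: a hard-coded rule versus a rule "learned" from
# example data, echoing the programmed-versus-trained distinction above.
import statistics

readings = [21.0, 21.5, 22.1, 20.8, 21.9, 35.0, 21.3]

# Programmable-era style: the threshold is fixed in advance by a programmer.
def programmed_is_anomaly(value):
    return value > 30.0

# Trained style (toy version): the threshold is derived from the data the
# system has seen, so the behavior comes from examples rather than from code.
mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

def learned_is_anomaly(value):
    return abs(value - mean) > 2 * stdev

for reading in readings:
    print(reading, programmed_is_anomaly(reading), learned_is_anomaly(reading))
```

The learned rule shifts as new readings arrive, which is the essence of the trained-rather-than-programmed behavior Engbersen describes, albeit at an almost trivially small scale.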
By Dharmendra S. Modha
Sixty years ago, in the face of tremendous skepticism, IBM engineer John Backus set out to radically change the economics of scientific computing on the IBM 704 by making programming much cheaper, faster and more reliable. The language that he and his colleagues developed—FORTRAN—became the first widely used high-level programming language. It laid the groundwork for the software industry as we know it and the waves of transformation that computing has brought to industry, science, government and society. The importance of FORTRAN is hard to overstate, as demonstrated by O’Reilly’s poster on “The History of Programming Languages.”
Today, we’re at another turning point in the history of information technology. The era that Backus and his contemporaries helped create, the programmable computing era, is being superseded by the era of cognitive computing. Increasingly, computers will gather huge quantities of data, reason over the data, and learn from their interactions with information and people. These new capabilities will help us penetrate complexity and make better decisions about everything from how to manage cities to how to solve confounding business problems.
By Chris Preimesberger
For decades, companies across all industries – from technology giants to appliance makers to widget manufacturers – have designed and delivered their products in the same way: design the product completely, test it for defects, then deploy it to the market.
Unfortunately, an approach that puts the design and testing teams in separate silos just won’t work anymore. As the pace of innovation accelerates, the vast majority of products, from automobiles to cameras, are becoming software-based. Countless hours are poured into coding these products to run smoothly, and as their complexity increases, so does the risk of error. A single misstep in thousands of lines of code, if not more, can have consequences ranging from frustrating to fatal.
By Steve Hamm
I have always loved science, though I was never that good at it in school. So it’s a major pleasure–as well as a bit ironic–for me to reveal that I’m one of fewer than one thousand people in the world who have moved an atom.
I got to do this a few weeks back in the nanoscience lab at IBM Research-Almaden in San Jose, Calif. The lab has been a hotbed of atom moving for decades, and a small team of scientists there is now pushing the boundaries of science in hopes of producing knowledge that will help people design and build quantum computers some day.