By Steve Hamm
IBM hosted the Cognitive Systems Colloquium at its famed IBM Research Center in Yorktown Heights, N.Y., on Oct. 2, 2013. The all-day event brought together leaders in science, technology and psychology to discuss the coming era of cognitive computing and to craft a shared agenda among industry, academia and government.
The following is a time-stamped stream of live updates and insights from the event's presenters, including Nobel Prize laureate Daniel Kahneman; A.I. pioneer Danny Hillis; Irving Wladawsky-Berger, visiting professor at MIT and Imperial College; and others. Continue Reading »
By Richard Silberman, Writer/Researcher, IBM Communications
Andy Stanford-Clark built his first sensor when he was six years old to alert his mom if it started raining after she had hung the wash out to dry. His “rain detector” involved nothing more than a few copper strips on a small board that attached to the clothesline and a little box in the house that beeped, alerting her to bring in the laundry.
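The detector's logic can be sketched in a few lines of Python. This is purely illustrative, modeling the circuit described above rather than any actual IBM code: dry air leaves the circuit between the copper strips open (very high resistance), while rainwater bridges them and closes it. All names and threshold values here are hypothetical.

```python
# Model of the childhood rain detector: two copper strips on the
# clothesline form a crude conductivity sensor. Water bridging the
# strips drops the resistance, which triggers the beeper indoors.

DRY_RESISTANCE_OHMS = 1_000_000   # open circuit: nothing bridging the strips
WET_THRESHOLD_OHMS = 50_000       # below this, assume water has closed the circuit

def is_raining(resistance_ohms: float) -> bool:
    """Return True when the measured resistance suggests water on the strips."""
    return resistance_ohms < WET_THRESHOLD_OHMS

def alert(resistance_ohms: float) -> str:
    """The 'little box in the house': beep only when rain is detected."""
    if is_raining(resistance_ohms):
        return "BEEP - bring in the laundry!"
    return "quiet"
```

The same threshold-on-conductivity pattern underlies many of the simple sensors now networked in the Internet of Things.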
Already at that young age, Stanford-Clark was able to recognize a problem and solve it with a simple solution. Today, 40 years later, he is still doing the same thing, but on a much grander scale. Continue Reading »
By Chris Sciacca
IBM researcher Ton Engbersen believes scientists very soon will be able to build a computer that is comparable in complexity to the human brain. But that is only half the story. He also wants to teach such a machine to learn like the brain as well. And that is where it gets really interesting.
Ton is referring to cognitive computing – the ability of machines to sense, reason, learn, and, in some ways, think. Learning is a key element. These computers will not be based on programs that predetermine every answer or action needed to perform a task; rather, they will be trained with algorithms and through interactions with data and humans. Continue Reading »
By Dharmendra S. Modha
Sixty years ago, in the face of tremendous skepticism, IBM engineer John Backus set out to radically change the economics of scientific computing on the IBM 704 by making programming much cheaper, faster, and more reliable. The language that he and his colleagues developed—FORTRAN—became the first widely used high-level programming language. It laid the groundwork for the software industry as we know it and the waves of transformation that computing has brought to industry, science, government and society. The importance of FORTRAN is hard to overstate, as demonstrated by O’Reilly’s poster on “The History of Programming Languages.”
Today, we’re at another turning point in the history of information technology. The era that Backus and his contemporaries helped create, the programmable computing era, is being superseded by the era of cognitive computing. Increasingly, computers will gather huge quantities of data, reason over the data, and learn from their interactions with information and people. These new capabilities will help us penetrate complexity and make better decisions about everything from how to manage cities to how to solve confounding business problems.
By Chris Preimesberger
For decades, companies in every industry – from technology giants to appliance makers to widget manufacturers – have designed and delivered their products in the same way: design the product completely, test it for defects, then deploy it to the market.
Unfortunately, an approach that puts the design and testing teams in separate silos just won’t work anymore. As the pace of innovation accelerates, the vast majority of products, from automobiles to cameras, are becoming software-based. Countless hours are poured into coding these products to run smoothly, and as their complexity increases, so does the risk of error. A single misstep among thousands, if not millions, of lines of code can have results that range from frustrating to fatal. Continue Reading »
By Steve Hamm
I have always loved science, though I was never that good at it in school. So it’s a major pleasure–as well as a bit ironic–for me to reveal that I’m one of fewer than one thousand people in the world who have moved an atom.
I got to do this a few weeks back in the nanoscience lab at IBM Research-Almaden in San Jose, Calif. The lab has been a hotbed of atom moving for decades, and a small team of scientists there is now pushing the boundaries of science in hopes of producing knowledge that will help people design and build quantum computers some day.
Earlier this month, IBM Research hosted the inaugural Smarter Energy Research Institute (SERI) conference, where energy and utility experts from around the globe came to share ideas and demonstrate prototype applications that shine a light on the next generation of analytics for the utility industry. Famed inventor and creator of the Segway Personal Transporter, Dean Kamen, delivered the keynote address and spoke of the need to inspire the next generation of great inventors.
The Smarter Planet blog sat down with Kamen prior to his speech to get his views on the future of the individual inventor, his pursuit of a solution to provide drinkable water to the 2 billion humans living without it, and what the utility of the future may look like. (The following is an excerpt.) Continue Reading »
IBM Research scientist Robert Dennard, who at age 80 still comes to work at the lab nearly every day, has been awarded the Kyoto Prize—one of the world’s most prestigious recognitions for personal achievement. He will receive the Advanced Technology Prize in Electronics at a November ceremony in Japan.
Dennard, an IBM Fellow, is best known for inventing the memory chip in 1967. The simplicity, low cost and low power consumption of his invention, dynamic random access memory (DRAM), opened the door to the personal computer. Today, memory chips are used in every PC, laptop computer, game console and mobile communications device. Continue Reading »
Thomas Malone, director of the MIT Center for Collective Intelligence, is one of the leading thinkers in the realm of anticipating how new technologies will transform the way work is done and leaders lead. His 2004 book, The Future of Work: How the New Order of Business Will Shape Your Organization, Your Management Style, and Your Life, helped thousands of executives and would-be executives see their organizations, and themselves, in startling new ways. As a result, many organizations are becoming more collaborative and democratic. Now, Malone is exploring how social business, data analytics and cognitive computing will transform organizations once again. Here, he talks about the revolution that is coming.
IBM: In your book The Future of Work, you talked about society being on the verge of a new world of work, a key element of which is decentralization of the organization. Since then, the social networking phenomenon has emerged and is sweeping not just popular culture but business organizations as well. How has this explosion of social networking affected your thinking? Continue Reading »
By Michael Karasick
The world is on the cusp of a new era of computing, which we call the era of cognitive systems. New computer technologies are coming that will help people and organizations penetrate complexity and make better decisions. At IBM, we believe that this coming revolution in artificial intelligence has the potential to transform the way business is done and dramatically accelerate innovation. Cognitive systems will enable humans and machines to interact together and achieve things that neither could do on their own.
The victory of IBM’s Watson on the TV game show Jeopardy! was one of the milestones in this new phase of computing. Scientists at IBM and elsewhere are pushing the boundaries of science and technology in fields ranging from neural networks to machine learning to create machines that sense, learn, reason and interact with people in new ways.
(IBM Research Director John Kelly is speaking about the future of computing today at 7 p.m. Pacific Time at the Computer History Museum in Silicon Valley. His book about the new era, Smart Machines: IBM’s Watson and the Era of Cognitive Computing, will be published in the fall by Columbia University Press. To read a free chapter now, go to the Columbia University Press web site.)