By Dharmendra S. Modha
Sixty years ago, in the face of tremendous skepticism, IBM engineer John Backus set out to radically change the economics of scientific computing on the IBM 704 by making programming much cheaper, faster, and more reliable. The language that he and his colleagues developed—FORTRAN—became the first widely used high-level programming language. It laid the groundwork for the software industry as we know it and the waves of transformation that computing has brought to industry, science, government and society. The importance of FORTRAN is hard to overestimate, as demonstrated by O’Reilly’s poster on “The History of Programming Languages.”
Today, we’re at another turning point in the history of information technology. The era that Backus and his contemporaries helped create, the programmable computing era, is being superseded by the era of cognitive computing. Increasingly, computers will gather huge quantities of data, reason over the data, and learn from their interactions with information and people. These new capabilities will help us penetrate complexity and make better decisions about everything from how to manage cities to how to solve confounding business problems.
By Chris Preimesberger
For decades, companies across all industries – from technology giants to appliance makers to widget manufacturers – have designed and delivered their products in the same way: design the product completely, test it for defects, then deploy it to the market.
Unfortunately, an approach that puts the design and testing teams in separate silos just won’t work anymore. As the pace of innovation accelerates, the vast majority of our products, from automobiles to cameras, are becoming software-based. Hours and hours are poured into coding these products to run smoothly, and as their complexity increases, so does the risk of error. A single misstep among thousands, if not millions, of lines of code can have consequences that range from frustrating to fatal.
By Steve Hamm
I have always loved science, though I was never that good at it in school. So it’s a major pleasure–as well as a bit ironic–for me to reveal that I’m one of fewer than one thousand people in the world who have moved an atom.
I got to do this a few weeks back in the nanoscience lab at IBM Research-Almaden in San Jose, Calif. The lab has been a hotbed of atom moving for decades, and a small team of scientists there is now pushing the boundaries of science in hopes of producing knowledge that will help people design and build quantum computers some day.
Earlier this month IBM Research hosted the inaugural Smarter Energy Research Institute Conference (SERI) where energy and utility experts from around the globe came to share ideas and demonstrate prototype applications that shine a light on the next generation of analytics for the utility industry. Famed inventor and creator of the Segway Personal Transporter, Dean Kamen, delivered the keynote address and spoke of the need to inspire the next generation of great inventors.
The Smarter Planet blog sat down with Kamen prior to his speech to get his views on the future of the individual inventor, his pursuit of a solution to provide drinkable water to the 2 billion humans living without it, and what the utility of the future may look like. (The following is an excerpt.)
IBM Research scientist Robert Dennard, who at age 80 still comes to work at the lab nearly every day, has been awarded the Kyoto Prize—one of the world’s most prestigious recognitions for personal achievement. He will receive the Advanced Technology Prize in Electronics at a November ceremony in Japan.
Dennard, an IBM Fellow, is best known for inventing the memory chip in 1967. The simplicity, low cost and low power consumption of his invention, dynamic random access memory (DRAM), opened the door to the personal computer. Today, memory chips are used in every PC, laptop computer, game console and mobile communications device.
Thomas Malone, director of the MIT Center for Collective Intelligence, is one of the leading thinkers in the realm of anticipating how new technologies will transform the way work is done and leaders lead. His 2004 book, The Future of Work: How the New Order of Business Will Shape Your Organization, Your Management Style, and Your Life, helped thousands of executives and would-be executives see their organizations, and themselves, in startling new ways. As a result, many organizations are becoming more collaborative and democratic. Now, Malone is exploring how social business, data analytics and cognitive computing will transform organizations once again. Here, he talks about the revolution that is coming.
IBM: In your book The Future of Work, you talked about society being on the verge of a new world of work, a key element of which is decentralization of the organization. Since then, the social networking phenomenon has emerged and is sweeping not just popular culture but business organizations as well. How has this explosion of social networking affected your thinking?
By Michael Karasick
The world is on the cusp of a new era of computing, which we call the era of cognitive systems. New computer technologies are coming that will help people and organizations penetrate complexity and make better decisions. At IBM, we believe that this coming revolution in artificial intelligence has the potential to transform the way business is done and dramatically accelerate innovation. Cognitive systems will enable humans and machines to interact together and achieve things that neither could do on their own.
The victory of IBM’s Watson on the TV game show Jeopardy! was one of the milestones in this new phase of computing. Scientists at IBM and elsewhere are pushing the boundaries of science and technology in fields ranging from neural networks to machine learning to create machines that sense, learn, reason and interact with people in new ways.
(IBM Research Director John Kelly is speaking about the future of computing today at 7 p.m. Pacific Time at the Computer History Museum in Silicon Valley. His book about the new era, Smart Machines: IBM’s Watson and the Era of Cognitive Computing, will be published in the fall by Columbia University Press. To read a free chapter now, go to the Columbia University Press web site.)
By Takreem El-Tohamy
There’s a wonderful word in Swahili that I think expresses one of the imperatives for the future of Africa. The word is “harambee.” It means pulling together, collaborating and supporting each other. I believe that one of the key factors in the ability of African countries to create sustainable and equitable economic growth will be the emergence of innovation ecosystems. Harambee perfectly captures an essential element of such ecosystems—the ability of institutions and individuals to pull together and build a mutually supportive environment.
Innovation ecosystems are complex organisms that are difficult to create yet tremendously powerful when they work. Think Silicon Valley. They require a melding of all of the capabilities of governments, businesses, financiers, universities, and individuals. Together, these organizations and individuals provide the web of support that makes it easier for startups to launch and grow quickly, and for established companies to innovate more aggressively. With that kind of support, African entrepreneurs and businesses will find it easier to produce new products and services, or even create whole new industries. You can think of an innovation ecosystem as a collective intelligence—harnessed for the good of society.
By Steve Hamm
When Brenda Dietrich joined IBM with a newly minted PhD in operations research 30 years ago, she ran into a buzz saw of ignorance about the role that math could play in business. She offered her expertise to an IBM manufacturing group in Poughkeepsie, New York, but was rebuffed. The only way they could use her math skills, they told her mockingly, was in helping to balance their checkbooks. “We’ve come a long way in the recognition of the value of math and analytics,” says Dietrich, CTO of IBM’s Business Analytics division.
Today, math and data analytics are seen as essential elements for businesses and other organizations when it comes to understanding how the world works, predicting the future and making better decisions. In this world of Big Data, the Internet of Things and social networks, organizations use math to help improve everything from operations and finances to their understanding of customers, employees and the interactions of physical and social systems. As data about all manner of things becomes more readily available and computers become ever more powerful, we are at last able to deal with complexity and uncertainty, and, as IBM Watson’s victory on the TV quiz show Jeopardy! showed, we can create machines that think.
By Andreas Heinrich
It wasn’t my idea to make the world’s smallest movie, but I’m really glad we did. I hope that the two-minute video animation, A Boy and his Atom, which was made by painstakingly moving individual atoms on a microscopic surface to create simple images of a boy and his world, will inspire young people everywhere to study science and to seek careers in science and technology. Working with artists and animators, my team at the Almaden lab put 10,000 atoms in place in a 10-day work marathon.
Here’s the Atom movie, which was made public today: