By Dharmendra S. Modha
For decades, computer scientists have been pursuing two elusive goals in parallel: engineering energy-efficient computers modeled on the human brain and designing smart computing systems that learn on their own—like humans do—and are not programmed like today’s computers. Both goals are now within reach.
And today, as we launch our ecosystem for brain-inspired computing with a TrueNorth Boot Camp for academic and government researchers, I expect the two quests will begin to converge. By the end of the intensive three-week training program, I hope these early adopters will set out to demonstrate the potential of these new technologies to transform industries and society.
The boot camp is a pivotal step in bringing brain-inspired computing to society by putting cutting-edge innovation in the hands of some of the best and brightest researchers who will begin to invent a wealth of applications and systems that we cannot even imagine today.
With the SyNAPSE Project at IBM Research, in collaboration with Cornell University, we have produced the TrueNorth neuromorphic chip, which is capable of super-efficient processing of massive amounts of sense-based data—from photographs, videos, sensors, scientific and medical imagery, sounds, speech, and touch.
At the same time, leading computer scientists are producing major breakthroughs in the domain of Deep Networks—developing systems that rely less on manual training by people and more on learning, primarily through interacting with images, sounds, text, and other kinds of information.
The TrueNorth chip provides a hardware platform optimized for delivering Deep Learning and other computing tasks related to big data and sense-based information in an energy-, volume-, and speed-efficient fashion. Our suite of software tools enables computer scientists to build applications that run efficiently on the hardware by taking into account architectural constraints, which, quite unexpectedly, suggest entirely novel algorithms. The ecosystem will enable researchers to map the entire field of neural networks to our energy-efficient platform.
If you’re a government or university researcher and you’re interested in participating in the next boot camp, contact my colleague Ben Shaw at firstname.lastname@example.org. Naturally, I’m also interested in hearing from companies that want to partner with us to exploit the commercial potential of our platform.
For my colleagues and me, this is yet another watershed moment in a hugely ambitious project that has been underway since 2008 (see Phase 0, Phase 1, Phase 2, Phase 3, and the “TrueNorth” chip for other important milestones). My team, composed of IBM scientists and academic research partners, was initially funded by the U.S. Defense Advanced Research Projects Agency (DARPA) under the SyNAPSE Program to the tune of nearly $53.5M. Our task was to develop a chip design modeled on the human brain that was highly energy efficient and that could be scaled to handle some of the largest data-processing challenges.
We accomplished that ambitious goal last year, on schedule. Each TrueNorth chip contains one million programmable neurons and 256 million configurable synapses. An individual processor consumes just 70 milliwatts of electricity—such a small amount that it could run on a smartphone battery for a week.
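The week-long battery claim is easy to sanity-check with back-of-the-envelope arithmetic. Here is a minimal sketch; the battery capacity of roughly 10 Wh is my illustrative assumption for a typical smartphone, not a figure from this article:

```python
# Sanity check: how long could a 70 mW chip run on a smartphone battery?

CHIP_POWER_W = 0.070        # 70 milliwatts, the stated TrueNorth power draw
BATTERY_CAPACITY_WH = 10.0  # assumed ~10 Wh, a typical smartphone battery

hours = BATTERY_CAPACITY_WH / CHIP_POWER_W  # energy / power = runtime
days = hours / 24

print(f"Runtime: about {hours:.0f} hours, or {days:.1f} days")
```

Roughly 143 hours, or about six days, which is consistent with the "week on a smartphone battery" figure.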
Within a few years, we’ll be able to produce computing systems containing a few dozen chips that possess the computing power of a mammalian brain. I call it a brain-in-a-box.
But first things first. My team of 30 scientists has spent the past year building a set of software tools, including a programming language and a library of pre-built components, with which developers can build applications that run on TrueNorth circuit boards—and, ultimately, on complete computer systems. To dramatically increase programmer productivity, we have built a back-end to a widely used Deep Learning framework, Caffe. A variety of neural networks and deep learning algorithms can now run on our platform.
We have already signed new research and development agreements with DARPA and the US Air Force, and their scientists will participate in the boot camp. In addition, we’re expecting researchers from some of the U.S. national laboratories (Lawrence Livermore, Lawrence Berkeley, and Argonne National Labs) and a number of elite universities (including Caltech; ETH Zurich; Johns Hopkins University; Imperial College, London; National University of Singapore; RPI; and the University of California campuses at Davis, Los Angeles, San Diego, and Santa Cruz).
At the boot camp sessions, we will walk participants through the step-by-step process of building applications on a new platform with new development tools. We’re giving them the Lego building blocks, and I’m looking forward to seeing what they create. The possibilities are endless because permutations and combinations of the underlying building blocks are limited only by human imagination and creativity.
I can imagine a rainbow of multi-sensory applications based on this new style of computing opening up a whole new frontier. Embedded processors will enable real-time contextual understanding in automobiles, robots, and cameras. Synaptic supercomputers will crunch data drawn from a wide variety of sensors and sources. Cloud centers will process massive streams of medical and scientific data for real-time analysis.
One of the most exciting areas is smartphones and wearables. For starters, the energy-conscious chips will enable smartphone makers to incorporate facial recognition so you never have to type in a password. Your handheld devices will be able to understand what you like, where you are, and what you’re doing, so they can offer you convenient services, such as suggestions for a lunch spot you are sure to like, or vital alerts, such as warnings of health or safety concerns.
Sixty years ago, IBM Fellow John Backus set out to radically change the economics of computing by making programming much cheaper, faster, and more reliable. The language that he and his colleagues developed—FORTRAN—became the first widely used high-level programming language. It laid the groundwork for the software industry as we know it and the waves of transformation that computing has brought to industry, science, government and society.
A similar shift is underway today. The era that Backus and his contemporaries helped create, the programmable computing era, is being superseded by the era of cognitive computing. Increasingly, computers will gather huge quantities of different kinds of data, reason over the data, and, using techniques such as Deep Learning, learn from their interactions with information and people. These new capabilities will help us penetrate complexity and make better decisions about everything from how to manage cities to how to solve confounding business problems.
However, in order for cognitive computing to take hold, scientists will have to re-architect nearly every aspect of computing—spanning silicon, systems, development tools, storage, and software. My team is doing its part with the IBM Neurosynaptic System and development environment. These are the beginning steps in an effort to build a large, open, and collaborative ecosystem of organizations and individuals who are committed to inventing the next era of computing.
To learn more about the new era of computing, read Smart Machines: IBM’s Watson and the Era of Cognitive Computing.