One of the most intriguing research projects at the Almaden lab over the past decade has been the development of a neurosynaptic microchip modeled on the workings of the brain. Funded since 2008 by the U.S. Defense Advanced Research Projects Agency’s SyNAPSE initiative, a team at Almaden led by Dharmendra S. Modha created not only a radically new chip architecture but a new approach to creating software applications.
Tomorrow, their work begins the transition from a science research project to a technology that’s on its way into the commercial marketplace.
At an all-day IBM Research Cognitive Systems Colloquium focused on brain-inspired computing, we’ll lay out our technology vision and invite academics, entrepreneurs and established companies to participate in a science and business ecosystem that we hope will ultimately help transform the landscape of computing.
Colloquium speakers and panelists will include Karlheinz Meier, co-director of Europe’s Human Brain Project; Fei-Fei Li, director of Stanford’s A.I. lab; and Horst Simon, deputy director of Lawrence Berkeley National Laboratory.
To find out what’s happening at the colloquium in near real time, tune in to a live blog from the Almaden lab and join the Twitter conversation at #CognitiveComputing.
Most of today’s computers are based on the architecture established by John von Neumann in the 1940s. They’re good at math and text processing, but they’re not very efficient when it comes to interpreting the physical universe.
In contrast, our brain-inspired technology addresses a wide range of sense-based inputs, including vision, hearing, motion, and temperature. It’s designed to be especially effective in situations where computing is constrained by power and space and yet must deliver real-time sense and response. Think robots, self-parking cars, sensor networks on gas pipelines and wind farms, and public safety monitoring applications—all those Internet of Things scenarios. And think about your smartphone being used as a mobile sensing device. This will open up a whole new realm of access to information that we didn’t have in the past.
In addition, the technology can be used in combination with other cognitive computing technologies such as IBM Watson to create systems that learn, reason and help humans make better decisions. Metaphorically, it’s the right-brain counterpart to Watson’s left-brain intelligence. It will be Watson’s eyes and ears.
Each TrueNorth neurosynaptic chip contains one million programmable neurons and 256 million synapses. What’s more, by tiling dozens or hundreds of such chips on circuit boards, you can rapidly scale to supercomputing-class capabilities. Imagine being able to analyze all the data being received from outer space via every radio telescope in the world—and in real time.
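As a rough sense of how tiling multiplies capacity, here is a back-of-envelope sketch. The per-chip figures come from the text; the 64-chip board size is an illustrative assumption, not a product specification:

```python
# Back-of-envelope scaling for tiled neurosynaptic chips.
# Per-chip figures are from the article; board size is illustrative.
NEURONS_PER_CHIP = 1_000_000
SYNAPSES_PER_CHIP = 256_000_000

def board_totals(chips: int) -> tuple[int, int]:
    """Total programmable neurons and synapses for a board tiling `chips` chips."""
    return chips * NEURONS_PER_CHIP, chips * SYNAPSES_PER_CHIP

neurons, synapses = board_totals(64)  # a hypothetical 64-chip board
print(f"{neurons:,} neurons, {synapses:,} synapses")
# → 64,000,000 neurons, 16,384,000,000 synapses
```

Even a modest board of this kind reaches tens of billions of synapses, which is where the supercomputing-class comparison comes from.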
The point isn’t to copy or match human intelligence; it’s to use the brain as a model to develop more capable computers.
Just as important as the data-processing capability of the chip is its energy efficiency. An individual neurosynaptic processor consumes just 70 milliwatts of electricity. That’s comparable to the power supplied by a hearing-aid battery. Computers based on the chip will be many orders of magnitude more energy-efficient than conventional systems for sensory-based uses.
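At 70 milliwatts per chip, even large tiled boards stay within a very small power budget. A minimal sketch of that arithmetic (the chip counts are illustrative assumptions):

```python
# Power draw for boards of tiled chips at the stated 70 mW per chip.
# Chip counts are illustrative assumptions, not product specifications.
MILLIWATTS_PER_CHIP = 70

def board_power_watts(chips: int) -> float:
    """Aggregate power draw in watts for `chips` chips."""
    return chips * MILLIWATTS_PER_CHIP / 1000.0

for n in (1, 16, 64):
    print(f"{n} chip(s): {board_power_watts(n):.2f} W")
```

A hypothetical 64-chip board would draw under 5 watts, less than a typical night-light, which is what makes the technology plausible for power-constrained, sensor-heavy deployments.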
Over the coming months and years, IBM will continue to advance the technology. We’ll produce new generations of chips, better algorithms, better programming tools, and entirely new applications. This will increase the processing power and energy efficiency; increase scale and density; reduce costs; and increase productivity—for IBM and for anybody else who wants to use the technology.
In addition, we’re developing an ecosystem that will help propel the technology forward and provide a platform upon which IBM and other companies can create world-changing applications. Think of the ecosystem Apple built around its iPhone, iPad and iOS technology.
For starters, we’re:
–Providing chips, circuit boards and software to select academic researchers so they can develop complementary technologies.
–Offering access to application tools, programming interfaces and a simulator via a cloud service to companies interested in developing applications.
–Forming partnerships with companies to jointly develop applications in vertical industries.
–Offering online training courses for technologists and business leaders so they understand the potential of TrueNorth.
We have set up a website to serve as the virtual headquarters for the ecosystem. We welcome anybody who is interested in participating, including academic and government researchers, students, entrepreneurs, and business leaders, as well as IBMers. It will require a global community of individuals and organizations to fulfill the great promise of brain-inspired computing.
When I look into the future, I foresee dozens of uses for this new wave of technology—everything from glasses that will enable blind people to “see” to security systems that protect all of us from terrorist threats.
Personally, the application that’s most appealing to me is the self-driving car. I’m not talking about today’s early experiments. Those machines aren’t capable of thinking and learning. They operate by combining elaborate rules of the road with a host of visual sensors. But I can imagine a future where we combine sophisticated sensor networks with our chip’s pattern-recognition capabilities and IBM Watson’s ability to learn from experience. The result will be cars that anticipate problems before they happen and respond perfectly to surprises.
I live on a mountaintop in California’s Bay Area. When I walk my dog, I take in a panoramic view that stretches from San Francisco’s gleaming skyline to San Jose’s golden hills. But there’s a tradeoff: I have to commute 60 miles to work by car every day. I would love to slip into a self-driving car and be carried to work by a smart system that would allow me to safely take in the scenery, do email or participate in a video conference. That’s my idea of a great commute.
Brain-inspired computing is so new that we haven’t even scratched the surface when it comes to envisioning the uses we’ll be able to put it to in the years ahead. (Did scientists in 1946 working on ENIAC foresee the smartphone revolution? Probably not.) I look forward to working with a host of creative people and organizations to help invent computing’s future and to make the world a better place.
To learn more about the new era of computing, read Smart Machines: IBM’s Watson and the Era of Cognitive Computing.