No computing paradigm lasts forever, so new approaches must be found to support the next phase of computing: learning systems. IBM Research’s colloquium, Frontiers of IT, being held today at the Watson Lab in Yorktown Heights, N.Y., explores disruptive innovations that could utterly transform the industry over the next two decades.
To tweet or follow the commentary on Twitter, use #ITfrontiers.
4:15 pm – 5:15 pm Panel: The Next Grand Challenges in Computing
What is it? How should it be designed and conducted?
Jim Schatz, Johns Hopkins
Applied Information Sciences
Former director, R&D, National Security Agency
My challenge for IBM Research is to put together a complete and formal proof of Fermat’s Last Theorem, one of the major theorems of mathematics.
(Fermat’s Last Theorem states that no three positive integers a, b, and c can satisfy the equation aⁿ + bⁿ = cⁿ for any integer value of n greater than two.)
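A toy search makes the statement concrete. It is no substitute for a proof (which is exactly the point of the challenge), but it confirms there is no small counterexample; the search bounds below are arbitrary illustrative choices:

```python
# Brute-force search for a counterexample to Fermat's Last Theorem:
# positive integers a, b, c with a^n + b^n = c^n and n > 2.
# A finite search proves nothing in general -- that is precisely why a
# formal, machine-checked proof is the grand challenge.

def find_counterexample(limit=50, max_n=10):
    """Try all 1 <= a <= b <= limit and exponents n in [3, max_n]."""
    for n in range(3, max_n + 1):
        # Since a^n + b^n <= 2 * limit^n <= (2 * limit)^n, any matching c
        # must lie within this table of n-th powers.
        nth_powers = {c ** n: c for c in range(1, 2 * limit + 1)}
        for a in range(1, limit + 1):
            for b in range(a, limit + 1):
                c = nth_powers.get(a ** n + b ** n)
                if c is not None:
                    return (a, b, c, n)
    return None

print(find_counterexample())  # None -- no counterexample in this range
```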
We need computers to generate and prove new theorems. “Where we’re going is the fully automated mathematician.”
“It’s the next generation of mathematics, and the implications for humanity will be profound.”
He’s asked: The implications for business and society?
“Can’t say. But every bit of mathematics that has been invented has been applied. The mathematics that comes out of this will get past the limitations we see in computing. You’ll have to trust me on this one: If we build it, they will come.”
Flybridge Venture Capital
IBM is the eco-friendly company, applying IT to smarter cities and a smarter world. The challenge is for IBM to build a high-performance, data-class server that operates with a carbon-neutral footprint. Also, make the process for building it eco-safe and carbon-neutral, and the parts should be recyclable. And it should be built in the United States.
Marketing strategy, future consumer demands
We’re on the brink of a new society. We need to rethink things. Thanks to computing and mobile technology we have empowered people. We have digital natives who are coming out into the adult world. They have different expectations and a different mindset. They expect to be connected. That will act as a change catalyst, set new rules, and help define our next grand challenge.
“The grand challenge is that we can use the smarter, networked society not to play chess or Jeopardy!, but to address climate change, aging populations, scarce resources, and poverty. These are the global challenges.”
“We have to do it on a global scale, and we have to have a system where we include everyone.”
His startup provides information for tracking the pharmaceutical industry supply chain.
He suggests that IBM develop a system for feeding back consumer input to the producers of products and services. It’s “reverse logistics.”
You could empower people to use their cell phones to send feedback. Leverage crowdsourcing. You add an electronic record to cash-based transactions.
Moderator: Irving Wladawsky-Berger
Chairman Emeritus, IBM Academy of Technology
We are living in an increasingly connected, complex and unpredictable world. If you need to find out what’s going on and make decisions, how do you do it in this kind of world? You can take a more statistical approach.
The grand challenge is this: Is there something IBM can do to formalize the transition from the traditional management world to one where people use these sophisticated new tools to improve the way they make decisions?
10:20 a.m. William LaFontaine, vice president, strategy for IBM Research, explains that today’s colloquium is part of a series of centennial colloquia aimed at convening scientists and business leaders around the world to help define the future of computing. By the time the series concludes next month, 1,500 people will have attended events at 10 IBM Research labs.
10:30 a.m. John Kelly, director of research, says this program isn’t about IBM telling the world what the future of computing will be. It’s an invitation to a conversation. IBM wants to hear from scientists, business leaders and others—so they’ll help shape our agenda.
The information technology industry is driving exponential curves, he says. Exponential growth in our industry has been achieved by continual improvement and disruptive innovation. If you tune this right, it can result in tremendous success. “Exponential curves will either put you ahead of the competition or kill you. It’s one or the other.”
He focuses on four areas of improvement that he believes will drive computing:
Nanotechnology: Billion-transistor chips will become trillion-device chips within the next decade. How do you get three orders of magnitude? We have to transition away from the silicon device, which has only about another decade. “We’ll have to shift from silicon to other, carbon-based materials.”
In our labs we’re working on nano-materials at the same scale as biological molecules. We have a nano-material that can penetrate the cell wall of staph bacteria and kill them.
Exascale computing: To get there we need substantial improvement in power consumption per device, in processor cores, and in memory architectures. We need to leverage photonics, using light instead of electrons. And we need to stack chips.
A key partner for IBM in this area has been the US Dept. of Energy, working with the national labs. We’re building some of the largest systems in the world. “It represents the kind of partnership we can have between government, business and academia to drive progress in the United States.”
Big data: We’re not only getting more data, including from sources like social networks, but also data is coming at us faster. “We’re awash in data but we’re having difficulty extracting information from it.”
IBM’s Watson is a big data machine. It had to understand the Jeopardy questions and come up with the answer in less than 3 seconds. All of that had to be done in memory. There wasn’t time to go to disk. So this is a big challenge for the way we build analytical systems.
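A rough back-of-envelope calculation shows why the 3-second budget forces everything into memory. The latency figures and lookup count below are typical order-of-magnitude assumptions for illustration, not measurements from the actual system:

```python
# Why answering in ~3 seconds rules out going to disk: latency arithmetic.
# The latencies and lookup count are order-of-magnitude assumptions,
# not numbers from Watson itself.

DISK_NS = 5_000_000  # ~5 ms per random read on a spinning disk
RAM_NS = 100         # ~100 ns per main-memory access

lookups = 100_000    # hypothetical evidence lookups per clue

print(lookups * DISK_NS / 1e9)  # 500.0 seconds -- far over a 3 s budget
print(lookups * RAM_NS / 1e9)   # 0.01 seconds -- easily within budget
```

Even if the real lookup count were ten times smaller, random disk access would still blow the budget by an order of magnitude, which is why the corpus had to be resident in RAM.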
Systems that learn: We need to move from programmable systems, calculators, to systems that understand natural language and learn. It took 85,000 watts of power to beat two guys with 20 watts of power each in their heads. “Biologically oriented systems are something we’re very interested in. We have to move to things that are more like neurons and synapses.”
10:45 a.m. John Kelly, director of research, says we are beginning to be able to simulate systems that will be able to learn.
Look back at the beginning of systems. They were tabulating machines. They counted. The second generation was programmable systems. We’re on the verge of a third enormous step: systems that will be capable of dealing with large volumes of information, that will be vastly more powerful than the systems we have today, and that will be made out of new materials. “It’s an entirely new era.”
Here’s John Kelly’s IBM Centennial lecture, Pioneering the Science of Information, which he delivered at the University of Melbourne on Oct. 13.
11:15 a.m. David Ferrucci, an IBM Fellow and the company’s “rock star” by dint of his leadership of the Watson grand challenge, says Watson is only scratching the surface of what’s possible.
He talks about artificial intelligence, where it has been and where it’s going.
“In the beginning, there was a lot of science fiction around it. We were underestimating the difficulties.” But now the field is more mature, and more is possible. There’s also more demand. We have more and more complex systems we need to understand, and we won’t be able to understand them unless we have much more powerful computing systems: cognitive systems.
Today, we’re creating large amounts of data and complex systems. We have to make systems smarter at analyzing large amounts of data and finding meaning in it. But it’s difficult, because much of the information is made by humans, for humans.
“We don’t expect machines to originate meaning. We need them to detect meaning and help us make decisions.”
11:30 a.m. David Ferrucci, an IBM Fellow, talks about the future of the Watson technology beyond Jeopardy!
When we apply the Watson technology to the real world, it’s not just about getting an answer. We need to deliver meaningful evidence behind the answers so people can make better decisions.
We need systems that can accomplish end-to-end tasks. They’ll combine learning, understanding and interacting with humans, in a multimodal world—where they’re able to assess not just text but visual imagery and sounds.
In the health care space, for instance, Watson can get the evidence but it can also see what’s not present. If a symptom is not present, it can be as meaningful as when a symptom is present. That could be vital in helping a doctor make a diagnosis.
He calls DNA the original software. DNA code is information. It is the program. If we can understand our biology we can do a lot more about improving healthcare.
“It blows me away, and I love to be blown away!”
Here’s a video from IBM that explains how Watson works.
12:20 p.m. Matthew Tirrell, director, Institute for Molecular Engineering, U. of Chicago, talks about bio-inspired nanoelectronics and systems.
The institute focuses on incorporating synthetic building blocks to create functional systems. “Science is about discovery. Engineering is about design and invention. Engineering is the path from science to society.”
There are several key areas where progress is being made.
Synthetic biology: It’s the modern age of genetic engineering. We’re talking about using synthetic biology to make things or to do things—creating organisms that are useful in healthcare, for instance. An example is the development of synthetic artemisinin, used for treating malaria, for a fraction of the cost of extracting the molecules from plants. We need to make this repeatable, systematized—so it can be done like manufacturing. We need to assemble functional systems from off-the-shelf parts.
My quip: Eli Whitney meets Watson and Crick.
Recapitulating and regenerating biology: People are creating and installing artificial cells, but he’s talking about reconstituting the machinery of biology, for instance the functionality of synapses in synthetic systems.
He’s particularly interested in recreating the structure of proteins. You build molecular recognition into them—sort of an information technology. You can create diagnostic peptides. You can use these to detect pathologies within the body.
12:45 p.m. Matthew Tirrell, director, Institute for Molecular Engineering, U. of Chicago, continues to talk about advances in bio-inspired nanoelectronics.
Understanding noise in biology: Noise comes from a lot of sources: the processes of DNA making RNA making proteins, for example. We’ll be making synthetic systems that don’t necessarily have deterministic characteristics, just like biological systems, so it’s important to understand noise in biology.
Mapping the connectome: We’re beginning to map the connections in the human brain. NIH is working on it. They’re starting with mapping the mouse brain; the human brain will scale that up by many orders of magnitude. We need to understand structure and function. So we need new devices that allow us to visualize and measure far better than we can now. This mapping project requires the convergence of nanoscience and neuroscience.
In closing, he says information in biology comes from observations and measurements. We have to measure molecules, the interactions between them and the networks that connect them. “Measurement is more than detection. We have to create meaning out of numbers.”
The Context: Tirrell talks about the aims of the Institute for Molecular Engineering. One goal is to make small particles that circulate in the human body, look for evidence of diseases, report back and, potentially, treat heart disease or cancer.
2:15 p.m. Steven Koonin, undersecretary for science, US Dept. of Energy, talks about advanced computing systems—exascale and beyond.
We need more computing power to deal with the problems we’re taking on—helping us see inside big complex systems. So we need to be able to do high performance simulation.
But he says sheer processing power is not enough. “The machine is just the golf cart. The swing also matters.”
The energy department is a great pioneer of simulation, because of its work to replace underground nuclear weapons testing with simulations, and now it’s moving that knowledge out into industry.
Uses for simulation in the energy sphere: optimizing design, optimizing operations and scaling new systems more rapidly.
He talks about US energy challenges.
Energy security: It’s almost entirely about oil. The US is dependent on the world market, so prices are volatile and pretty high.
US competitiveness: We’re lagging. Lithium-ion technology was invented and refined in the US, but we’re now manufacturing only about 1% of the batteries. Ditto photovoltaics, wind turbines, and energy-efficient autos.
Environmental impacts: We have issues with carbon dioxide and its effects on the planet; and water, since it’s so closely related to the electrical grid.
We’re inefficient in using energy—only about 40% at best. “We have to get better.” We have very little information about how energy is actually used, and if we’re going to do better we have to know more.
2:15 p.m. Steven Koonin, undersecretary for science, US Dept. of Energy, talks about how to address the US energy problems.
He has worked up six strategies. Each involves a mix of technology, economics and regulation. “You have to make them all work together.”
(Sorry, I’m sitting in the back of the room and the type on his slide is too small for me to see. I’m sure you can find the strategies on the DOE web site, energy.gov)
Here are some opportunities for technology:
Vehicles: We need to make engines more efficient, and we have to shift as rapidly as we can to full electric vehicles. One interesting technology opportunity: You can use high performance computing to model enzymatic biomass conversion.
Buildings: 40% of US energy use is in buildings. “Buildings are a system.” We need to use computing to predict the properties of building systems and to improve operations in real time.
The electric grid: It’s such a complex system, both technologically and in terms of the numbers of organizations and regulatory bodies that are involved. Because you can’t store energy easily, you have to gain by managing the sources of energy. You can model the system and come up with better ways of turning on and off sources of energy—in as little as 10 minute intervals.
But all of this requires much more computing power. We need exascale computing: one thousand petaflops. We’re on a trajectory to reach this level by 2020. But there are challenges in getting there—power and reliability, for instance.
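To put the exascale target in perspective, here is the arithmetic behind the prefixes. The SI prefixes are the only hard facts; the workload size is an illustrative assumption, not a figure from the talk:

```python
# Back-of-envelope arithmetic for exascale computing.
# The SI prefixes are facts; the workload size is an illustrative assumption.

PETA = 10 ** 15  # 1 petaflop/s = 10^15 floating-point operations per second
EXA = 10 ** 18   # 1 exaflop/s  = 10^18 flops

print(EXA // PETA)  # 1000 -- one exaflop is a thousand petaflops

# Hypothetical workload: 10^21 floating-point operations total.
ops = 10 ** 21
print(ops // EXA)          # 1000 seconds (~17 minutes) on an exascale machine
print(ops // (10 * PETA))  # 100000 seconds (~28 hours) at 10 petaflops
```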
The Obama administration has asked for $126 million in the FY 2012 budget for developing exascale computing capabilities, so it’s possible that there’s money available. He expects Congress to support it, because of concerns about national competitiveness. He hopes that the US will achieve exascale computing within 10 years.
Koonin talks about science, energy and the environment in an event sponsored by the American Academy of Arts and Sciences.
3:15 p.m. Lori Beer, EVP of Enterprise Business Services at WellPoint, which is working with IBM to bring the Watson technology to bear in healthcare, says, “We think the Watson technology could be a huge game changer in the industry.”
Healthcare costs in the US are 17% of GDP and rising. It’s unsustainable.
WellPoint wants to help address the issues by using the vast amount of data that it gathers to help improve the quality and efficiency of health care.
They have data on 34 million members, and on more than 100 million former members. They have 5.4 million providers. “Lots of big data. It’s a big challenge. But it also creates big opportunities.”
“How do you take big data and turn it into actionable insights?”
With Watson, one of the initial goals is to use the technology to help WellPoint’s nurses with pre-authorization, via a utilization management assistant; and to help physicians come up with the best diagnoses and most effective treatments, via an evidence-based evaluation system.
Also, they’re building out a longitudinal patient record that gets matched up against the Watson knowledge base. The insights are provided to the internal nurses and the external physicians.
A key element is training Watson so the information the machine provides is accurate and up to date. Step 1 is Watson in medical school, digesting the medical literature and WellPoint’s treatment guidelines. Step 2 is its residency—they’re feeding in real case records and known outcomes. Step 3 is Dr. Watson. As they partner with physicians and understand treatment plans, they continuously feed in the data and use it to improve treatment of that patient and others with similar health problems.
But she vows: “There will always be a physician making the final decision.”
Here’s an LA Times article about how WellPoint plans on using IBM’s Watson technology.
4:00 p.m. Neil Gershenfeld, director of The Center for Bits and Atoms at MIT, says there are cracks in the fiction that software lives in a virtual world and hardware lives in the physical world. He blames computing pioneers Alan Turing and John von Neumann, and their early concepts for computing, for the limitations on today’s systems.
“How do we tap the rest of the universe?”
(I can’t follow much of what he’s saying. He talks fast and at a high level. He’s like a jazz musician riffing through a bunch of melodies and rhythms. Or, it’s sometimes like listening to chirping birds. My problem; not his.)
“Computer science is one of the worst things that ever happened to computers or science. It segregated things. For me, it’s one thing.”
A big question he contemplates: “If we’re going to fill the world with computing, how do we program it?”
He coined the phrase “Internet of Things.” He and colleagues developed ways to do internetworking through physics.
He says the Internet of Things and the Smart Grid have come to mean the opposite of what they should mean.
4:45 p.m. Neil Gershenfeld, director of The Center for Bits and Atoms at MIT, says “In the future, we’ll buy computers by the pound.”
He’s designing robots that work like proteins–based on how the ribosome works.
He talks about “digital fabrication.” Society is evolving from computers connected to machines, to machines that make machines, to the Star Trek replicator. You can make anything you want using just atoms.
Instead of sending products around the world you can send the information to make them.
His class, “How to Make Anything,” evolved into the notion of Fab Labs, places where people build what they want—what they can’t buy at WalMart.
They have about 100 Fab Labs worldwide. They’re like community centers with computers and power tools. They spread virally.
“There’s one of these labs in Barcelona picking up where Gaudi (The Catalan architect) left off.”
“MIT is like a mainframe to process people. We set up something like an Internet of innovation.”
They’re creating an educational degree using this network of labs as the learning places.
“Big factories don’t go away. They make boring things, like nuts and bolts. You make the things you really want.”
“We can make computation that’s aligned with nature. The computing can leave the box and come out where we live.”
Here’s a lecture by Gershenfeld about his Fab Labs, which are used by people in developing nations to design things they need.