This has happened to you. You’re motoring down the highway when you hear a bulletin from one of those all-news-all-the-time radio stations telling you there’s trouble ahead. A tractor trailer has flipped and is blocking two lanes. So you hop off at the next exit and … come to a grinding halt in gridlock traffic. Everybody else had the same idea you did. When you finally arrive at your destination, late, a colleague tells you they made it on time using the highway. The accident had been cleared. %@#+*!
Timely traffic information. It’s a promise that is often made but rarely fulfilled. The reason: Most systems for monitoring traffic and alerting people about problems have latency issues–a lag of as much as 20 minutes. Even the traffic information services on the iPhone and other GPS-enabled devices aren’t always up to date.
A big idea that IBM scientist Nagui Halim had back in 2003 is finally about to make traffic information a truly up-to-the-minute phenomenon. More about Halim in a minute. First, today’s news:
Scientists from IBM and KTH Royal Institute of Technology in Sweden are collaborating to bring real-time traffic info to Stockholm–which likely will make it the first city in the world to possess such a capability. Over the past year, IBM has been working with the city to monitor traffic flow during peak hours. The congestion management system has reduced traffic by 20 percent and cut average travel times by almost 50 percent. Now we’re putting some of our newest analytics technology, called InfoSphere Streams, to work there, too. The plan is to gather information germane to traffic congestion from a wide variety of sources–including sensors in taxi cabs and delivery trucks, on-time performance updates from transit systems, and weather information–and then make it readily available to travelers so they can make the best decisions about driving routes, travel times, and transit alternatives. “This is the first application of real-time analytics to traffic,” says Halim.
Picture this: A resident could send a text message to the traffic monitoring system listing their location and destination. The system would instantly spit back a recommendation.
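In spirit, that query-and-reply step could be as simple as looking up the current travel-time estimates for each route between the two points and texting back the fastest one. Here is a hypothetical sketch–the place names, routes, and minute counts are invented for illustration, and this is not the actual system’s logic:

```python
# Hypothetical: the system's current, continuously updated picture of
# travel times between points. Routes and minutes are made up.
CURRENT_TRAVEL_MINUTES = {
    ("Kista", "Slussen"): {"E4 highway": 45, "local roads": 30, "metro": 25},
}

def recommend(origin, destination):
    """Reply to a resident's text: pick the fastest option right now."""
    options = CURRENT_TRAVEL_MINUTES[(origin, destination)]
    best = min(options, key=options.get)
    return f"Fastest now: {best} ({options[best]} min)"

print(recommend("Kista", "Slussen"))  # → Fastest now: metro (25 min)
```

The hard part, of course, is not the lookup but keeping that table of travel times fresh–which is where the streaming analytics come in.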
Back to Halim. He was working at IBM Research back in 2003 when he saw the need for technology that could monitor multiple streams of data in real time and then mash them up to create actionable knowledge. At the time, most so-called real-time systems weren’t real time at all–or they were highly specialized systems. He saw the opportunity to create an approach that could be applied to any number of purposes.
It took a while. There were glitches and dead ends. Some of Halim’s colleagues thought he was crazy. But now it’s here. IBM last year began working with clients to build applications for the technology in health care, financial services, telecommunications, manufacturing, water management, radio astronomy, and particle physics. In February, we formally launched InfoSphere Streams as a product–in a new version with substantially improved performance.
Here’s how it works: Data comes into the computing system from the network. The system can handle thousands of streams of information concurrently. It breaks the flow into a series of small steps, recognizing the kind of data that’s coming in and quickly sending each chunk to the microprocessor that is best able to deal with it. Then, through a method called “sensor fusion,” the system weaves the strands of processed data into usable information. “It’s all about gathering and making sense,” says Halim.
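The classify-route-fuse flow described above can be sketched in a few lines. This is a toy illustration of the pattern, not the InfoSphere Streams API–every name, handler, and data shape below is an assumption made up for the example:

```python
def classify(item):
    """Recognize what kind of data an incoming tuple carries."""
    return item["kind"]

def handle_gps(item):
    # e.g. turn a taxi's GPS reading into a road-segment speed
    return {"segment": item["segment"], "speed_kmh": item["speed_kmh"]}

def handle_weather(item):
    # e.g. reduce a weather report to a simple slowdown factor
    return {"slow_factor": 0.8 if item["raining"] else 1.0}

# Route each chunk to the handler best able to deal with it.
HANDLERS = {"gps": handle_gps, "weather": handle_weather}

def fuse(processed):
    """'Sensor fusion': weave the processed strands into one estimate."""
    speeds = [p["speed_kmh"] for p in processed if "speed_kmh" in p]
    factor = 1.0
    for p in processed:
        factor *= p.get("slow_factor", 1.0)
    avg = sum(speeds) / len(speeds) if speeds else 0.0
    return {"estimated_speed_kmh": avg * factor}

def run(stream):
    processed = [HANDLERS[classify(item)](item) for item in stream]
    return fuse(processed)

stream = [
    {"kind": "gps", "segment": "E4-north", "speed_kmh": 60},
    {"kind": "gps", "segment": "E4-north", "speed_kmh": 40},
    {"kind": "weather", "raining": True},
]
print(run(stream))  # a rain-adjusted speed estimate for the segment
```

The real system does this continuously, at scale, across thousands of streams at once; the sketch just shows the shape of the idea.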
Depending on what you want to do, you can run a stream computing application on a supercomputer, a blade server, or even a laptop. Something relatively simple, like analyzing a flow of tweets, will run fine on your laptop.
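To make the laptop-scale case concrete, here is a toy example of that kind of lightweight stream analysis–counting topic keywords as tweets flow past. The keywords and tweets are invented, and this is ordinary Python, not the product:

```python
from collections import Counter

# Hypothetical keywords a traffic-watcher might track in a tweet stream.
KEYWORDS = {"traffic", "accident", "gridlock"}

def keyword_counts(tweets):
    """Tally keyword mentions across a flow of tweets."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            if word.strip(".,!?") in KEYWORDS:
                counts[word.strip(".,!?")] += 1
    return counts

tweets = [
    "Accident on the highway, traffic at a standstill",
    "Gridlock again. Traffic everywhere.",
]
print(keyword_counts(tweets))  # → Counter({'traffic': 2, 'accident': 1, 'gridlock': 1})
```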
How big could stream computing be? Halim, who is now director of the InfoSphere Streams product group in IBM software, won’t put a big number on it. But he points out that there are potentially game-changing uses for the technology in one industry after another.
By the way, that IBM TV commercial you’ve seen of a baby blanketed in colorful strands representing the data from monitoring its vital signs? That’s stream computing. But that’s another story.