Instrumented. Interconnected. Intelligent.

Colin Parris, GM IBM Power Systems


Westside Produce, a harvester and distributor of fresh melons in California’s Central Valley, probably isn’t the kind of company that comes to mind when you think about cutting-edge computing technologies. Yet this outfit, with just a few hundred employees, uses sophisticated technology to predict how many melons will be ready for harvest on any given day and to trace the movement of its produce—down to the case level—all the way from the field to grocery shelves.

Westside Produce is emblematic of a major shift that’s coming—a new era of computing that will deliver the power of big data analytics to organizations of all sizes and to all sorts of people within them.

You remember Watson, the IBM computer that beat two former grand champions on the TV quiz show Jeopardy!. That kind of data-crunching power is coming to the masses.


The combination of massive amounts of information and the tools to make sense of it has huge implications for businesses and society. Today, computers are everywhere—thanks, in large part, to the revolution in communications that has brought us all manner of smart phones and digital tablets. Now, data analytics is on its way to becoming pervasive, as well.

Big Data isn’t the sole province of big companies. Organizations of all sizes are challenged to make sense of huge amounts of data from mobile devices, video cameras, sensors and social networks. A medium-sized fashion retailer in South Africa needs access to big data insights just as much as a giant rail freight hauler based in the United States.

What will make pervasive analytics possible? Two things: a better way of designing computers combined with a new way of delivering insights to people wherever they may be.

First, a change is coming in the way engineers think about computer design. It’s called data-centric computing. Up until now, most computers have been engineered with the idea that the microprocessor is at the center of things. All of the data is shipped to a central processing unit. But in the world of big data, moving oceans of data around costs a lot of money and burns up a lot of energy. We think it will be better, in many situations, to do some of the data processing closer to where the information is stored.
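The idea of moving the computation to the data, rather than the data to the computation, can be sketched in a few lines. This is a minimal, hypothetical illustration, not an actual IBM design: the "nodes" and the max-reading query are invented for the example.

```python
# Hedged sketch: "ship all the data to the processor" vs. "do some
# processing near where the data lives." The node layout and the
# query are hypothetical illustrations.

# Three storage "nodes", each holding a shard of sensor readings.
nodes = [
    [3, 9, 4, 7],
    [8, 2, 6],
    [5, 1, 10],
]

# Processor-centric: move every reading to a central processor, then compute.
def central_max(shards):
    all_readings = [r for shard in shards for r in shard]  # moves all 10 values
    return max(all_readings)

# Data-centric: each node reduces its own shard locally; only the tiny
# per-node results travel to the central processor.
def distributed_max(shards):
    local_results = [max(shard) for shard in shards]  # computed "near" the data
    return max(local_results)  # the central step sees 3 values, not 10

assert central_max(nodes) == distributed_max(nodes) == 10
```

Both approaches give the same answer; the difference is how much data crosses the network to get it, which is where the cost and energy savings come from.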

At the same time, the way we use computer memory will change. Memory chips, where data is cached temporarily during the computation process, should be more tightly integrated with the processors. In that way, the data won’t have to be moved as much and there won’t be much delay in fetching information that’s needed quickly.

Watson, which is a jazzed-up version of one of IBM’s commercial server models, provides a peek at how the data-centric computers of the future will be designed. On Jeopardy, the machine was able to search a vast database containing millions of pages of information and come up with answers to questions in about three seconds. That’s partly because large-capacity memory chips were positioned right next to the processors within the individual server computers that made up the Watson system.


Today, most computers are not engineered to be data centric. But as the big data phenomenon takes root, we believe this will become the standard approach to computer design. When that happens, data-centric computing will become more affordable to businesses of all sizes. (In fact, in a step along the path, IBM today launched a new generation of servers that makes high-performance analytics more affordable for small companies like Westside Produce.)

The second crucial element for making data analytics pervasive is the ability to deliver information and insights wherever people need it. We believe that the method of choice for doing so will be cloud computing—the practice of using a network of remote servers on the Internet to store, manage and process data, rather than using a nearby server. The benefits of cloud computing are being recognized by more organizations every day because it makes data analytics and other capabilities easy to buy and use.

One last thing: ease of use will be critical in the new era of computing. Today, many computers are too difficult to set up and to use. For data analytics to reach the masses, engineers will have to rethink their assumptions and begin designing for people who are not computer professionals or experts at analytics.

There’s much to be done by the computer industry, but I’m confident that before too long, analytics will be at everybody’s fingertips. Individuals and organizations will have the insights they need, when they need them. What’s happening is profoundly important. We’re on a journey from the information age to the insight age.
