
A Critical Role for Hardware in the Era of Cognitive Business


In the last few months, I’ve witnessed the beginning of a sea change in the way people at the forefront of computer science think about the future of our field. Faculty members at universities are showing a keen interest in cognitive systems. And I’m not talking just about algorithms and software. They want to discuss the processor and system technologies that will support a new generation of applications and a new era of computing.

In my view, this is a key step toward taking cognitive technologies mainstream—a shift I expect to accelerate this year. Academia is a cauldron of experimentation on the leading edge of science and technology. Think about how open source software took hold. Students embraced it and carried it out into the world when they graduated. The same thing will happen now with cognitive technologies.

Computer systems technologies will be critical in this new era. Back when the world shifted from the horse and buggy to the automobile, paved roads were needed before people could enjoy the full benefits of the internal combustion engine. So roads were paved and, eventually, highway systems were built. The same will be true in today's transition from conventional computing to cognitive computing: we need new infrastructures designed for big data and smart machines.

Let’s start with microprocessors. They have driven much of the progress in computing since the 1970s—thanks to the tech industry’s ability to fulfill the promise of Moore’s Law by packing ever more transistors on fingernail-size chips. In recent years, advances in low-power and reduced-instruction-set processors have enabled individual systems and networks of servers to take on increasingly data-heavy tasks.

IBM’s latest Power processors were designed specifically for big data, and they’re well suited for cognitive computing because of their ability to deal with large volumes of unstructured data. They perform 2.5x better than conventional processors when attacking the most demanding computing jobs. That’s primarily because of their superior throughput for transferring data back and forth between memory and the processor, and because of their ability to execute more computing processes concurrently.

Many of the changes coming to computing systems are embodied in the next generation of supercomputers for the U.S. Department of Energy. Under the CORAL project, Oak Ridge National Lab's "Summit" and Lawrence Livermore National Lab's "Sierra" will run on IBM's Power processors and are expected to perform five to seven times faster than the top supercomputers in the United States today. Delivery is expected in 2017.

When the DoE announced these two CORAL computers, its leaders embraced an approach to designing computers and data centers that IBM engineers had been advocating for several years: data-centric computing. The idea rests on the recognition that in the era of big data and cognitive systems, it's too costly in time and money to move all the data that has to be processed to central processing units. Instead, with data-centric computing, processing is spread throughout the system's data and storage hierarchy, so computation happens close to where the data lives.
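To make the idea concrete, here is a toy sketch in Python. The partitions and the word_count function are hypothetical, invented purely for illustration; the point is simply that in the data-centric approach only small partial results move across the system, not the raw data.

```python
# Illustrative sketch only: a toy contrast between moving data to a central
# processor and moving a small computation out to where the data lives.
# "partitions" and "word_count" are hypothetical names, not an IBM API.

partitions = [
    ["sensor log line one", "sensor log line two"],
    ["archive record alpha", "archive record beta"],
]

def word_count(lines):
    """The computation we want to run over the data."""
    return sum(len(line.split()) for line in lines)

# Data-to-compute: gather every partition centrally, then process it.
centralized = [line for part in partitions for line in part]
total_central = word_count(centralized)

# Compute-to-data: run the function next to each partition and combine
# only the small partial results -- the data-centric idea.
total_distributed = sum(word_count(part) for part in partitions)

assert total_central == total_distributed
```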

In CORAL, the DoE supported another major IBM initiative, which I believe will also emerge as an essential piece of the infrastructure that supports cognitive computing: the OpenPOWER ecosystem. IBM opened up technology surrounding Power, including processor specifications and firmware, so that other tech companies can design and build servers and components based on a common architecture. For the CORAL machines, we're collaborating with NVIDIA and Mellanox, two OpenPOWER participants, to incorporate NVIDIA's GPU accelerators and Mellanox's high-speed interconnect technology.

Accelerators are emerging as key elements of the computing systems of the future. GPUs such as NVIDIA's have their roots in PC gaming but have grown up to become important server technologies, and today they support compute-intensive machine learning applications. Another kind of accelerator, the field-programmable gate array, or FPGA, can be reprogrammed after it is installed in a system, which makes it a good match for improving the performance and energy efficiency of a wide range of data-intensive applications.
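For a rough flavor of GPU offload, here is a minimal Python sketch. It assumes an NVIDIA GPU and the open source CuPy library, which exposes a NumPy-like interface over CUDA; neither is part of the CORAL stack described above, they just illustrate the general accelerator pattern of shipping a large, data-parallel computation to the GPU and bringing back only a compact result.

```python
# Minimal sketch of GPU offload for a data-parallel step, assuming an
# NVIDIA GPU and the CuPy library. This illustrates the accelerator idea,
# not IBM's or NVIDIA's specific CORAL software stack.
import numpy as np
import cupy as cp

# A large batch of feature vectors, as a machine learning workload might produce.
features = np.random.rand(1_000_000, 64).astype(np.float32)

# Move the data to GPU memory, run the compute-heavy step there,
# then bring only the small result back to the host.
gpu_features = cp.asarray(features)
gpu_norms = cp.linalg.norm(gpu_features, axis=1)
norms = cp.asnumpy(gpu_norms)

print(norms[:5])
```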

Back to CORAL again. The DoE computers will also adopt a new approach called software-defined storage. These technologies enable computing systems to use a combination of disk, flash and in-memory storage most efficiently for data-intensive computing tasks and to tap into storage resources at multiple data centers on the fly when more capacity is needed.
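Here is a heavily simplified sketch of the tiering decision at the heart of that idea. The thresholds and the DataSet class are hypothetical and exist only to illustrate the policy; real software-defined storage products make this choice automatically and at much finer granularity.

```python
# Toy sketch of a storage-tiering policy: place data on in-memory, flash,
# or disk storage based on how frequently it is accessed. The class and
# thresholds are hypothetical, not an IBM Spectrum API.
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    accesses_per_hour: int

def choose_tier(ds: DataSet) -> str:
    """Pick a storage tier from access frequency; the policy is illustrative."""
    if ds.accesses_per_hour > 10_000:
        return "in-memory"
    if ds.accesses_per_hour > 100:
        return "flash"
    return "disk"

for ds in [DataSet("hot-index", 50_000),
           DataSet("warm-cache", 500),
           DataSet("cold-archive", 3)]:
    print(ds.name, "->", choose_tier(ds))
```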

These same technologies, available as Power Systems and Spectrum Storage offerings, are driving the next generation of enterprise IT infrastructure. They're increasing the processing performance of Watson cognitive services by 10x, accelerating genomics applications by 25x, and reducing infrastructure costs for NoSQL stores for unstructured data by 3x.

The current generation of z Systems has also been designed to bring cognitive capabilities to core enterprise data. Circuit designers working on the processor at the heart of the z13 dramatically improved the mainframe's data-crunching ability, making it an excellent platform for cognitive workloads. Enterprise developers can leverage Apache Spark, Hadoop and cognitive services to integrate insights from new data sources with insights from core enterprise data, without slowing transactions, moving the data off the platform or increasing data security risks.
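For a sense of what that looks like to a developer, here is a minimal PySpark sketch. The file paths, column names and aggregations are hypothetical; the pattern it shows is simply joining insight derived from new, less structured sources with core transactional records inside Spark, rather than exporting the enterprise data elsewhere.

```python
# Minimal PySpark sketch of combining new data sources with core enterprise
# records. Paths and column names are hypothetical; this illustrates the
# Spark pattern, not the specific z Systems integration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cognitive-join-sketch").getOrCreate()

# Core enterprise data, e.g. transactions already on the platform.
transactions = spark.read.parquet("/data/core/transactions")

# Newer, less structured data, e.g. sentiment scores derived from call logs.
sentiment = spark.read.json("/data/new/call_sentiment")

# Combine the two without copying the core data off the platform.
enriched = (
    transactions.join(sentiment, on="customer_id", how="left")
                .groupBy("customer_id")
                .agg(F.sum("amount").alias("total_spend"),
                     F.avg("sentiment_score").alias("avg_sentiment"))
)
enriched.show(5)
```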

So you can see how today’s system technologies, which were designed with big data and Internet-scale data centers in mind, are being adapted to deal with the even greater demands of cognitive computing. I expect advances in processors, system design, accelerators and storage to come in waves over the next few years.

In addition, the tech industry will need to invent radically new architectures for cognitive computing. One promising area is processors designed to support neural networks, which are built to extract patterns from unstructured data extremely quickly. Another is quantum computing. Scientists at IBM Research, at other tech companies and in academia are producing advances in the field that could result in functioning quantum computers within a decade, machines that process data exponentially faster than today's computers.

This is an incredibly exciting time in the computer industry. And for someone like me, who got his start in chips and chip manufacturing and then moved into systems, it's a great time to be in the hardware end of the business. I can't wait to see, and play a role in, what happens next.
