By Jeffrey Welser
One of the watershed moments in the history of computing took place on Dec. 9, 1968. Douglas Engelbart and his team at Stanford Research Institute presented a technology demonstration that included the first public showings of the computer mouse, hypertext, dynamic file linking and shared-screen collaboration over a network. Those advances turned out to be essential building blocks for personal computing and the Internet, and the event came to be called “The Mother of All Demos.”
While only history will say for sure, I think we saw the glimmer of a similar new beginning last week at IBM Research – Almaden, in Silicon Valley. The IBM Cognitive Systems Colloquium signaled a shift from a singular focus on the von Neumann computing architecture, which has dominated computer science and the computer industry since the mid-1940s, to new architectures modeled on the human brain.
By Dr. John E. Kelly III
The microprocessor was one of the most important inventions of the 20th century. Those chips of silicon and copper have come to play such a vital role that they’re frequently referred to as the “brains” of the computer. Today’s computer designs put the processor at the center.
But the needs of businesses and society are changing rapidly, so the computer industry must respond with a new approach to computer design—which we at IBM call data-centric computing. In the future, much of the processing will move to where the data resides, whether that’s within a single computer, in a network or out on the cloud. Microprocessors will still be vitally important, but their work will be divided up.
This shift is necessary because of the explosion of big data. Every day, society generates an estimated 2.5 billion gigabytes of data—everything from corporate ledgers to individual health records to personal tweets.
One of the most intriguing elements of the new era of cognitive computing is the development of brain-inspired technologies. Those are technologies that mimic the functioning of the neurons, axons and synapses in the mammalian brain with the goal of interpreting the physical world and processing sensory data: sight, sound, touch and smell. Today’s IBM Research Cognitive Systems Colloquium at IBM Research – Almaden is focusing on this realm of the cognitive computing world. Please come back for frequent reports and updates, and join the conversation at #cognitivecomputing.
One of the most intriguing research projects at the Almaden lab over the past decade has been the development of a neurosynaptic microchip modeled on the workings of the brain. Funded since 2008 by the U.S. Defense Advanced Research Projects Agency’s SyNAPSE initiative, a team at Almaden led by Dharmendra S. Modha created not only a radically new chip architecture but a new approach to creating software applications.
Tomorrow, their work begins the transition from a science research project to a technology that’s on its way into the commercial marketplace.
By Charity Wayua, Ph.D.
I come from a family of educators. So when it came to choosing a career, it was natural for me to go into education. My vocation, though, is research. I study educational systems so that I can help re-imagine what they can be.
Few places can benefit more from this kind of research than Africa, where I grew up and now work as a scientist at IBM’s new Research lab in Nairobi, Kenya. Africa is a paradox. It has seen tremendous growth during the past decade.
And yet half of the children in Africa will reach adolescence unable to read, write or do basic math. Two-thirds of those who don’t receive schooling are girls, because many of them have to stay home and take care of their younger brothers or sisters.
The world is in the early stages of a major shift—from the programmable computing era to the era of cognitive systems. Today at IBM Research, we’re convening our second-annual Cognitive Systems Colloquium. We’ll be hearing from some of the smartest people in the tech industry. Please return throughout the day for frequent updates. And join the discussion at #CognitiveComputing.
9:10 Zach Lemnios, vice president of research strategy and worldwide operations:
We’re here to bring together researchers, clients, students, young entrepreneurs. We want to highlight the work of the past year and look at the challenges before us, and help to build an ecosystem to drive innovations in cognitive computing. How do we scale up this enterprise? How do we make these systems easy for people to use?
By Tom Rosamilia
IBM has always taken the long view of its business strategy, continuously reinventing itself – from divesting its PC business to, more recently, its x86 business.
Today’s announcement that GLOBALFOUNDRIES plans to acquire IBM’s global commercial semiconductor technology business is one more step in the company’s reinvention. The agreement reinforces IBM’s clear path, commitment and vision for systems and hardware.
IBM’s proven model for success is to focus on the high-value segments of our systems portfolio, powered by the unique innovation that only IBM can bring. GLOBALFOUNDRIES’ business model is to innovate through high-volume semiconductor manufacturing, which is enhanced by economies of scale.
If you’ve been following IBM’s hardware business closely, you’ve heard us talk about the need to continuously transform our business. OpenPOWER, Software-Defined Storage, Flash memory, connecting mobile and the mainframe and the sale of our x86 business to Lenovo are a few of the most recent examples.
By Duncan Johnston-Watt
The revolutionary potential of cloud is a topic that’s much discussed today, with many drawing comparisons between the emergence of cloud and the advent of the Internet age.
And with good reason: there are striking similarities in the way both of these innovations are transforming the way organizations collaborate, communicate and create.
And much like the beginnings of the Internet age, we see some companies taking the plunge, while others are adopting a more conservative approach. It should come as no surprise that the “born on the web” companies have been early adopters, while enterprises have been somewhat more reserved in their exploitation of cloud.
By Michael Rhodin
It’s hard to believe it’s only been 10 short months since the IBM Watson Group was announced. We talked of bringing together a unique group of people – incredibly talented professionals from across IBM – into a new unit.
This included the single largest movement of IBM Research personnel in our history, along with 10 to 12 startups’ worth of new cognitive technologies that would help define the Watson team. Individuals and core capabilities from our software business would join the fray.
A new approach to engaging the market would be created from talent across IBM’s sales, marketing, services and consulting organizations. A new cloud delivery organization would be formed out of our services teams to serve this market – all brought together with a single purpose: to usher in a new era of computing.
By Chris Sciacca
If you believe the press, you may think that passwords are antiquated. And who could blame you? With major breaches being reported at popular websites such as LinkedIn, Adobe, Yahoo!, and Twitter, passwords may sound like a vestige of past security solutions.
Well, not so fast. IBM scientists have developed a three-pronged approach that can secure all of your passwords for social media, email, cloud files or shopping websites with one practically hack-proof password.
And this password is secured by something they like to refer to as the “Memento Protocol.” In the 2000 film “Memento” by Christopher Nolan, the protagonist suffers from short-term memory loss. Throughout the film he meets several so-called friends, but due to his condition he never really knows if they are trustworthy or if they are trying to steal something from him.
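The post doesn’t spell out the protocol’s mechanics, but the underlying intuition – that a secret can stay safe even among untrustworthy “friends” if no single party ever holds enough to reconstruct it – is commonly illustrated with threshold secret sharing. The sketch below is purely illustrative of that general idea, not IBM’s actual scheme; the function names and parameters are our own.

```python
# Illustrative sketch of Shamir secret sharing: a secret (e.g. a key derived
# from a master password) is split into n shares so that any k of them
# reconstruct it, while fewer than k reveal nothing about it.
import random

PRIME = 2**127 - 1  # field modulus; all arithmetic is mod this prime

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k shares reconstruct it."""
    # Random polynomial of degree k-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Split a secret among 5 "friends"; any 3 can help recover it,
# but any 2 colluding friends learn nothing.
shares = make_shares(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```

The design point matching the “Memento” analogy: each friend holds one share, and the protagonist only needs to find any k honest ones, so a single compromised server (or friend) cannot steal the secret on its own.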