By Mike Rhodin
The second in a series on the IBM Watson Platform, this blog explores the future of how computers relate to us.
For decades, moviemakers and TV producers have featured talking computers as futuristic props—whether it was Captain Kirk barking commands on the Starship Enterprise, Michael Knight talking to his car, K.I.T.T., on Knight Rider, or Theodore cooing to his smartphone operating system, Samantha, in the recent movie Her.
Yet, even though the way we interact with computers has come a long way since the days of punch cards, we are still largely forced to deal with them on their terms—and hampered by their limitations.
Not for much longer.
An essential part of the third era of computing—cognitive computing—will be our ability to interact with smart machines in ways that are more natural to us. Making them conversational is an important part of that effort.
I’m not talking about just giving the computer a simple command or asking a simple question. That’s yesterday’s technology. I’m talking about more realistic conversations—everything from friendly chitchat to intense debate. As cognitive conversation capabilities advance, we become more than impersonal button pushers.
In the cognitive era of computing, machines will better understand us and relate to us in more human ways.
At IBM, we’re working on a host of technologies—a set of conversational services—aimed at enriching the relationship between you and the system. These conversational services range from helping systems understand us as individuals to selecting the words and responses that are most meaningful to each of us.
These services are part of our Watson Platform, which will become a library for cognitive technologies. Platform components are grouped in four categories: Perceiving, Reasoning, Relating and Learning. These conversational services will be core to our Relating portfolio, as they will fundamentally change the way we relate to technology and information.
We’ll integrate conversational services from a variety of sources. Some are the natural evolutions of our original Watson technology that understood language. Others are coming out of our advanced Research work that has become part of our Watson Group. And still others will come from great innovators in the industry as part of our ecosystem initiative and collaboration with entrepreneurs.
In addition to the conversational services we have been developing within IBM, we have also acquired the startup Cognea, which offers virtual assistants that relate to people using a wide variety of personalities—from suit-and-tie formal to kid-next-door friendly. We believe this focus on creating depth of personality, when combined with an understanding of users’ personalities, will create a new level of interaction that goes far beyond today’s “talking” smartphones. We welcome co-founders Liesl Capper and John Zakos, and the rest of the Cognea team, to IBM.
And we’re not going to keep these advances to ourselves. We’re going to make Watson conversational services available to all of the members of our ecosystem—business partners, entrepreneurs, universities and enterprises. They’ll be able to tap into these services on the Watson Platform through the Watson Developer Cloud.
Using these services, the designers of cognitive applications will be able to select from a variety of capabilities to make the experience of interacting with computers more natural and valuable.
Ultimately, we plan to offer technologies that make it possible for you to carry on a highly intellectual debate with a computer. My colleague John Kelly, who heads IBM Research, gave a taste of what is coming (tune in at 45:30) in a presentation three weeks ago at the Milken Institute Global Conference.
The Cognea acquisition represents an extension of our effort to rapidly expand the Watson ecosystem. We have committed to investing $100 million in venture capital in startups that are building apps and services on top of the Watson Platform. We have already announced two investments: one in Welltok, whose digital platform helps insurers guide consumers toward their health goals and reward healthy behaviors, and one in Fluid, which is building a cognitive shopping assistant. Many more will come.
You can imagine a multitude of uses for conversational services. Smart machines will serve as virtual personal assistants, health coaches, companions for elderly people, investment advisors, tutors, travel agents, customer care agents and shopping advisors. Increasingly, you’ll converse with your tax preparation software at home, your accounting system at work, and your calendar when you’re on the road. In each scenario, they’ll converse with you in the ways that will be most effective—based on who you are and what you want to accomplish.
I think about all of the people I come into contact with in the course of my day-to-day life: my wife and kids, the kids’ teachers, my parents, my friends, my professional colleagues, the auto mechanic, doctors, strangers I meet at parties, etc. I adopt different ways of speaking in each situation—different vocabularies and styles. And so do they. It’s obvious that conversations between humans and computers will play out in the same way.
So, welcome to the era of cognitive computing. The future is here today.
If you want to learn more about the new era of computing, read Smart Machines: IBM’s Watson and the Era of Cognitive Computing.