Instrumented. Interconnected. Intelligent.

A screen shot from the Secret Squirrel project

By Jeffrey Coveyduc and Emily McManus

Imagine being able to ask a panel of TED speakers: Will having more money make me happy? Will new innovations give me a longer life? A new technology from IBM Watson is set to help people explore the ideas inside TED Talks videos by asking the questions that matter to them, in natural language.

Users will be able to search the entire TED Talks library by asking questions. Then they’ll be offered segments from a variety of videos where those concepts are discussed. Below each clip is a timeline that shows more concepts that Watson found within the talk, so that users can “tunnel sideways” to view material that’s contextually related, allowing a kind of serendipitous exploration.
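The flow described above can be sketched in a few lines. This is purely illustrative: the data model, the concept annotations, and the matching logic are assumptions for the sake of the example, not the actual Watson implementation.

```python
# Toy model of the search-and-"tunnel sideways" flow: each talk carries
# concept annotations with start times; a query's concepts are matched
# against them to select clips, and each clip also reports the talk's
# other concepts, which would populate the timeline below the clip.

TALKS = {
    "talk-001": [("happiness", 120), ("money", 340), ("work", 510)],
    "talk-002": [("longevity", 60), ("innovation", 300)],
    "talk-003": [("money", 90), ("happiness", 400)],
}

def find_clips(query_concepts):
    """Return (talk_id, start_seconds, related_concepts) for every
    annotated segment whose concept matches the query."""
    results = []
    for talk_id, annotations in TALKS.items():
        for concept, start in annotations:
            if concept in query_concepts:
                related = [c for c, _ in annotations if c not in query_concepts]
                results.append((talk_id, start, related))
    return results

clips = find_clips({"money", "happiness"})
```

A real system would derive the query concepts with natural language processing rather than take them as a set, but the shape of the result — clips plus contextually related concepts — is the same.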

Today, IBM and TED are showing a demo of the technology at World of Watson, an IBM symposium in Brooklyn, New York, aimed at expanding the role of cognitive computing in society.

Jeffrey Coveyduc, director, IBM Watson

Emily McManus, editor, TED.com

The new project came out of a brief meeting at a TED event last winter, and was developed by an ad hoc team in about eight weeks. It’s a great example of how quickly developers can build new applications on the open Watson platform.

Here, IBM project leads Jeffrey Coveyduc and TED.com editor Emily McManus talk about how a simple idea has turned into a prototype of a new way to find insights in videos.

Emily: During a TED Institute conference in Berlin, I was watching IBM Research’s Dario Gil talk about Watson and cognitive computing, and I loved his vision of how computers and humans could work together to answer hard questions using natural language queries. I started thinking: what kinds of new questions could we answer from TED Talks if we fed them into Watson? We have a set of almost 2,000 videos covering a broad swath of human knowledge, and sometimes it’s hard to know what’s in there. So, backstage after Dario’s talk, I asked him: What could Watson do with our videos and transcripts?

Jeff: I remember when Dario came excitedly to me to discuss what we could do with this data. As fans of TED Talks, we both knew there was a tremendous amount of compelling information and insight contained in those videos. But it wasn’t just the content — it’s the fact that TED’s mission is about spreading ideas, and here at IBM we’re using Watson to help extract knowledge that’s relevant to you and was previously difficult to access. Right away it seemed like a great match.

Emily: TED Talks has an API to help developers access videos and transcripts, but to speed up the delivery, since we were both in New York City, we threw all 1,900 videos, transcripts and metadata on a hard drive. It felt very Mission: Impossible. I told the team at IBM, “I’ll be at the front desk and hand you the package as if we’re both spies.”

Jeff: We referred to the hard drive as “the football,” and kept the project under wraps, nicknaming it “Secret Squirrel.” We went into the effort without an ending in mind, not knowing where the adventure would take us. This was important, because the team — about a dozen experts in natural language processing, visualization, video analysis, speech recognition, along with media artists — needed the freedom to be creative, to experiment, to make mistakes as they explored the art of the possible.

Secret Squirrel team members at work

The team used a number of Watson services from our developer cloud, such as Concept Insights, Personality Insights, and AlchemyAPI’s Concept Tagging and Entity Identification, along with some of Watson’s core capabilities, like natural language processing and its ability to understand context in questions.

At the end of our first flurry of activity, Watson had assembled all of the TED videos and the ideas within them into a visualization of the TED universe. Within this universe, ideas began to group themselves into clusters of related concepts — and we found that the clusters themselves offered us new insights into the TED Talks as a whole. For example, Music talks are located close to those focused on Time and Mind.
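The clustering effect described above — talks grouping themselves by shared concepts — can be illustrated with a minimal sketch. The talk data and the similarity threshold here are invented for the example; the team’s actual visualization pipeline is not described in this post.

```python
# Toy illustration of concept-based grouping: represent each talk as a
# set of extracted concepts and pair up talks whose concept overlap
# (Jaccard similarity) exceeds a threshold. Talks about music, time,
# and mind end up near each other, as in the "TED universe" view.

from itertools import combinations

TALK_CONCEPTS = {
    "talk-a": {"music", "time", "mind"},
    "talk-b": {"music", "mind"},
    "talk-c": {"economics", "money"},
}

def jaccard(a, b):
    """Overlap of two concept sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def nearest_pairs(talks, threshold=0.4):
    """Pairs of talks whose concept overlap exceeds the threshold."""
    return [
        (x, y)
        for x, y in combinations(sorted(talks), 2)
        if jaccard(talks[x], talks[y]) > threshold
    ]

pairs = nearest_pairs(TALK_CONCEPTS)
```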

Emily: Four weeks ago, I got a call from Dario. He said he had something exciting to tell my team and asked if we could come over the very next day.

He demonstrated Secret Squirrel and how Watson had in essence indexed all of our TED talks and was now able to show an amazing universe of concepts extracted from the TED videos, including some new topics that were outside our current metadata. Who knew we had so many talks that mentioned World War II?

Then they showed us how a user could ask Watson a question in natural language and get a string of short clips from many talks — building a nuanced and multidimensional answer to a big question. We know this is how our users want to talk about talks: when someone’s trying to remember a video they once saw, they don’t remember keywords; they remember ideas.
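The question-to-clips behavior can be sketched with a deliberately crude stand-in for Watson’s language understanding: score transcript segments by word overlap with the question and return the best-matching clips across talks. The segment data and scoring are illustrative assumptions only.

```python
# Minimal stand-in for natural-language clip retrieval: rank transcript
# segments by how many words they share with the question, then return
# the top clips as (talk_id, start_seconds) pairs.

SEGMENTS = [
    ("talk-1", 45, "does having more money actually make people happy"),
    ("talk-2", 300, "new medical innovations may extend the human lifespan"),
    ("talk-1", 600, "the history of the printing press"),
]

def answer(question, top_k=2):
    """Return up to top_k (talk_id, start_seconds) clips that best
    overlap with the question's words."""
    q = set(question.lower().split())
    scored = []
    for talk_id, start, text in SEGMENTS:
        overlap = len(q & set(text.split()))
        if overlap:
            scored.append((overlap, talk_id, start))
    scored.sort(reverse=True)
    return [(talk_id, start) for _, talk_id, start in scored[:top_k]]

clips = answer("will having more money make me happy")
```

Watson matches on concepts rather than literal word overlap — which is exactly why it can answer questions the users phrase in their own words — but the output shape, a ranked string of clips drawn from many talks, is the same.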

A screen shot from the Secret Squirrel project

Jeff: Applying Watson in this way has shown us how we can make it much easier to explore all the diverse expertise and viewpoints in video content. For example, think of all the TV journalism and university MOOC content that is produced.

On the TED site today, visitors can search TED Talks by keywords and other metadata. Now, together, we’ve shown that by adding Watson, it’s possible to go beyond keywords and instead extract the essence of each talk, identifying the ideas and concepts within it. Doing so will enable people to discover information they care about that would otherwise have been very difficult to find on their own.

This project is focused on TED videos, but the broader idea of video analysis is a big deal because more than 95% of the world’s digital material is video. Unlike traditional text search, it’s really difficult and time-consuming to find particular slices of information in video content.

We know that some of the great advances produced by humans come at the intersection of disciplines and through the collision of ideas. We think this project is helping show that the vast storehouse of video that society is producing, once it’s unlocked, could become a shared source of creativity and innovation.

If you want to learn more about the new era of computing, read Smart Machines: IBM’s Watson and the Era of Cognitive Computing.

 
