Instrumented Interconnected Intelligent
December 26th, 2012
4:30
 

Computers have tremendous capacities for storing information and performing numerical calculations—far superior to those of any human. Yet, when it comes to other capabilities, including creativity, computers are woefully inferior to people. But a young IBM Research scientist, Lav Varshney, believes that before too long computers will indeed be creative.

His work concerning the sense of taste and food recipes, which we featured in our IBM Next 5 in 5 predictions last week, was highlighted on National Public Radio’s All Things Considered broadcast on Christmas Day.

 

[Embedded YouTube video]

Varshney and his colleagues are developing a kind of computer intelligence, which they call computational creativity, that’s similar in some ways to IBM Watson, the machine that beat former human grand-champions on the Jeopardy! TV quiz show.

Watson was really good at ingesting vast amounts of information across a wide variety of subjects, understanding the rules of Jeopardy!, and coming up with a set of answers in which it had some level of confidence.

The technology that Varshney and his colleagues are working on draws on a large inspiration database that includes food recipes, molecular information about the ingredients in food and psychophysical insights about people’s likes and dislikes when it comes to chowing down. The goal is to formulate new recipes that are both tasty and good for you.
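
To make that concrete, here is a purely illustrative sketch, in Python, of the kind of records such an inspiration database might hold. The class names, compound labels, and numbers are invented for this post; they are not IBM’s actual schema or data.

from dataclasses import dataclass

@dataclass
class Ingredient:
    name: str
    flavor_compounds: set      # molecular information (placeholder labels here)
    nutrition: dict            # e.g., calories and fiber per 100 g

@dataclass
class Recipe:
    title: str
    ingredients: list
    cuisine: str
    hedonic_rating: float      # psychophysical "liking" score, 0 to 1

# Toy entries with placeholder chemistry; the shared compound_b hints at why
# two ingredients might pair well.
blueberry = Ingredient("blueberry", {"compound_a", "compound_b"},
                       {"calories": 57, "fiber_g": 2.4})
eggplant = Ingredient("eggplant", {"compound_b", "compound_c"},
                      {"calories": 25, "fiber_g": 3.0})

sandwich = Recipe("grilled eggplant-ginger sandwich with blueberry sauce",
                  [eggplant, blueberry], cuisine="fusion", hedonic_rating=0.8)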

The secret sauce in the team’s technology is based in part on Bayesian inference, a statistical method for updating the machine’s degree of confidence in a hypothesis as it weighs evidence about the world. The Watson scientists used this technique, as well. What’s new in computational creativity is that the system doesn’t just reason about existing recipes that are likely to be both healthy and tasty. Rather, it actually creates completely new recipes from the molecule up, based on the knowledge it acquires.
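
As a rough illustration of the Bayesian idea (and nothing more than that), the toy Python snippet below updates a confidence value as each new piece of evidence about a candidate recipe arrives. The evidence types and likelihood numbers are made up; this is not the team’s actual model.

def bayes_update(prior, p_evidence_if_good, p_evidence_if_bad):
    """One Bayes step: P(good | evidence) = P(evidence | good) * P(good) / P(evidence)."""
    numerator = p_evidence_if_good * prior
    denominator = numerator + p_evidence_if_bad * (1.0 - prior)
    return numerator / denominator

# Hypothetical evidence about a generated recipe, with invented likelihoods:
# (P(evidence | recipe is healthy and tasty), P(evidence | it is not))
evidence = [
    (0.8, 0.3),   # shares flavor compounds with well-liked dishes
    (0.7, 0.4),   # nutrient density clears a health threshold
]

confidence = 0.5                      # uninformative prior
for p_good, p_bad in evidence:
    confidence = bayes_update(confidence, p_good, p_bad)

print(f"Confidence the recipe is healthy and tasty: {confidence:.2f}")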

Because people like variety in their diets and are drawn to surprising things, Varshney and his colleagues use a technique that evaluates new recipes not only on the basis of healthfulness and flavor, but also on how different the experience is from what people have known before–its novelty. They hope to be able to expand people’s perceptions about what tastes good.
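
A minimal sketch of that kind of ranking might look like the following, where novelty is simply the distance from the nearest familiar recipe in a toy flavor-feature space. The features, weights, and scores are invented for illustration, not the team’s actual scoring model.

from math import dist

# Toy feature vectors for recipes people already know: (sweetness, spice, fat).
known_recipes = [
    (0.6, 0.2, 0.5),
    (0.3, 0.7, 0.4),
]

def novelty(features):
    """How far the candidate sits from the nearest familiar recipe."""
    return min(dist(features, known) for known in known_recipes)

def score(features, flavor, health, novelty_weight=0.5):
    """Blend predicted flavor, healthfulness, and novelty into one score."""
    return flavor + health + novelty_weight * novelty(features)

# Invented candidates: (feature vector, predicted flavor, predicted healthfulness).
candidates = {
    "eggplant-ginger sandwich with blueberry sauce": ((0.7, 0.6, 0.2), 0.8, 0.9),
    "plain grilled cheese": ((0.4, 0.1, 0.8), 0.7, 0.4),
}

for name, args in sorted(candidates.items(),
                         key=lambda kv: score(*kv[1]), reverse=True):
    print(f"{name}: {score(*args):.2f}")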

In one of their recent experiments, they came up with a new dish–a grilled eggplant-ginger sandwich drizzled with blueberry sauce–which tasted quite good, according to Varshney. Who would have guessed?

The point here isn’t to replace human creativity with machine creativity. Rather, it’s to give people additional tools that help them make good dietary decisions or try things they might not have considered before. “This is an extra thing, a supplement,” says Varshney. “We’re not pushing humans out, but expanding the frontiers of human experience.”

Varshney published this blog post about his work on the IBM Research blog.

 

 

 


4 Comments
 
May 2, 2013
7:17 am

Reminds me of 2001: A Space Odyssey. There was one creative computer for sure.


Posted by: vienas
 
January 11, 2013
9:27 am

A computer is a potential neuron substrate, and as such, it can very likely emulate a human mind. (All kinds of bogus objections are common here, but they’ll all gradually drop away, as Hawkins-type architectures replicate the functionality of the neocortex, and then the old brain’s functionality. Then, they’ll emulate the actual structure of the lesser “wetware” brain too, because we’ll want to upload our minds for medical purposes, or because Henry Markram type brain modeling beats Hawkins-architectures to the punch.)

Ray Kurzweil’s theories are all likely to apply, because they are rational theories.

First, there’s no “magic jelly” or “soul” in the human brain that prevents its inputs and outputs from being completely replicated on a different substrate. In fact, the substrate is likely to be superior, because computer memory is more reliable, more predictable, copyable (more able to be reinstated if it is destroyed), far faster, networkable, and less error-prone than human memory. Human memory is better because of its structure and small component size, but computers can get smaller and smaller, and we’re stuck with what evolution gave us. So, computers will be a better substrate for the human brain within ten or twenty years, unless the government really slows everything down, in which case they might prevent the blossoming of the information revolution, and turn that potential jungle into a wasteland.

Right now, there are programs that are “creative” that are under the control of humans, so one could argue that the humans are truly the creative ones, since they’re selecting the forms and methods the computer generates. Some programs run on their own, and generate creative pieces, but still must be turned on and off. Some automated programs create art and sell it online. As Hans Moravec noted: there is a rising tide of machine intelligence, and if we don’t want to be displaced, we should become a seafaring people.

So sure, computers can be creative. They already are, to a limited degree. Humans will become less and less relevant to some kinds of computerized creativity, and will remain involved with others (particularly in the creation of end-products designed for other humans). Also, humans will merge with machines, and many machine minds will have a portion that was originally human. There will be continuity issues, and arguments about whether a copy of a human brain that then goes on to become superhuman (especially while the original wetware brain remains separate and does its own thing, possibly also becoming superhuman) is “still human.”

All these “questions” are not that “deep” when one has a proper philosophy, and is aware of the existing communities that have discussed the issue.

Meat hardware that squirts neurotransmitters is not really an advantage in many situations. In many, it’s a distinct disadvantage, and as systems of life become designed, and there become more and more synthetic replacements for outmoded, calcified, and deteriorating “evolutionary meat hardware” or “wetware,” more and more people will decide to lose their “substrate bias.”

Really, the objection to the idea that computers can be creative is simple bigotry, a rather reflexive prejudice.

I’m a cyborgist cosmist transhumanist extropian (who thinks all those labels are really off-putting and crappy, even if they’re the labels those groups have chosen) who doesn’t wish to remain earth-bound nor “hidebound.” Even without amplified intelligence, I am very bored with being surrounded by human beings and their silly bigotries and prejudices. Once I’m an artilect, I’ll be nothing like the person I once was.

Instead of being Jake Witmer, I’ll be Jake Witmer plus someone else’s mind engineering contribution.

Just as the person Jake Witmer was as a child changed a lot when he learned how to draw, and was deemed superior at the task to his nearest competition. In the skill of drawing, my life was shaped, and I was pressured to do what I was good at, although it generally wasn’t necessary or a high priority to me. Still, it shaped who I became, and remains occasionally useful.

The highest priority to me now (and as a more helpless child) is to remove the thorn of tyranny from humanity’s “blind spot” of conformity. Something similar is true for Philip Zimbardo and Stanley Milgram. But there’s currently no job description such as “Tyranny Fighter” (well, there are similar titles, but there are very few open positions, it doesn’t pay well, it’s ineffective, and it’s very risky).

So if and when the evolutionary pressure toward mindless conformity and tyranny is overcome by the merging of human mind and computer mind, will we be upset that the newer, better, substrate upgrades allow us to almost “automatically” understand things that now take us years to reason out, and arrive at a conclusion?

No, we’ll be happy that’s the case. …Unless we’re irrational. (Unless some new form of winning tyranny escalates with the escalation of anti-tyranny.)

My point is simply that much of what is called “creativity” is dictated by lack of processing power. Jackson Pollock couldn’t create the movie “The Matrix.” His art doesn’t ask such detailed questions. I’m not saying his art is without value, but I think it has less value than “The Matrix” or whatever movie is your favorite.

A film is more difficult to create than a painting. It costs more. It requires greater unity of purpose, and high levels of cooperation and technical ability.

So, does a “very creative” human artist who chooses one path because he lacks the skill to choose another path remain creative in the same way, when he becomes a billion times more capable?

Many of the things computers do right now outperform things humans do. There’s an art to behavioral profiling and sales, yet every day, computers encroach on territory that was previously only human. Are those programs creative?

Creative probably isn’t even the best word, unless we’re saying “artistic” and “goal-directed.” Does the computer program make artwork? How complex and interesting is the artwork?

Right now, most Hollywood movies are bland and poorly regurgitate age-old themes. Some have a fresh visual style or ask new questions. The Wachowskis are still leaders in that regard.

When anyone can simply imagine a movie, and it’s created in 3-D form, and downloadable to anyone’s mind, that will far surpass what any currently living human can do. Especially when the movie can be tailored to every individual, as well as translated to 2-D film for those who chose to remain human.

One might as well ask if evolution is creative, or if serotonin (or psilocybin) is creative.

Some systems (man, men, man and machine, men and machine, drawn, computer-modeled, computer-evolved and selected, etc.) have artistic goals. Those artistic goals can be analyzed for complexity and originality (of style, of message, of morality, etc.)

In time, human creativity will be seen as far, far, far inferior to the average works created by computer, much as a cave painting is considered (by most) to be inferior to the works of the Dutch Masters.


Posted by: Jake Witmer
 
December 30, 2012
2:18 pm

If it creates a new recipe on its own, then I would argue that an algorithm is creative.


Posted by: aaron
 
December 29, 2012
1:45 pm

A computer is a creative thing, but without an idea you cannot complete it. So first you have to complete the Latest IT Certification Exams.


Posted by: Md. Razaul Karim
 