By Ying Li
In the same way ingredients connote flavors, colors and images can indicate moods and send messages.
This concept is driving new research here at IBM to better understand color relationships and their potential impact on everything from product design to classroom layouts.
Machines have been able to render different colors since the first color monitors. With a mix of code numbers for red, green, and blue, a computer knows that “0, 0, 0” equals black, that “255, 255, 0” is yellow, and so on. Other codes represent hue, saturation, and brightness of a color as well.
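These standard representations are easy to see in code. A minimal sketch, using only Python's standard library, of the RGB triples and the hue/saturation/brightness encoding mentioned above (HSV here, one common variant):

```python
import colorsys

def describe(rgb):
    """Return the hex code and (hue, saturation, value) for an (r, g, b) triple in 0-255."""
    r, g, b = rgb
    # colorsys works on 0.0-1.0 channels; hue is scaled to degrees for readability
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return "#{:02X}{:02X}{:02X}".format(r, g, b), (round(h * 360), round(s, 2), round(v, 2))

print(describe((0, 0, 0)))      # black  -> ('#000000', (0, 0, 0.0))
print(describe((255, 255, 0)))  # yellow -> ('#FFFF00', (60, 1.0, 1.0))
```

The same triple thus has several equivalent encodings; which one a system uses depends on whether it cares more about device output (RGB) or perceptual qualities like hue (HSV).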
We are adding machine learning algorithms on top of these standard color representations to open up an entirely new way for a computer to see and understand the world. For example, a photo of a nature scene would come to mean more than combinations of green; it might also convey "organic" or perhaps "serene."
The meanings we humans attach to color and imagery have been studied for years. People usually feel one way when they see an image of a grassy field, but completely different when presented with an image of a dollar bill, even though both are green.
Using Watson, we are building a system that learns about these associations by mining thousands of images cataloged in a database. We are also feeding it prior research on colors and their psychological effects, as well as information from ongoing interviews we're conducting. The system steadily improves at identifying the colors in the images it analyzes, and at understanding what those colors mean in the context of the image.
We translated the system’s artistic and academic knowledge into design-message combinations such as “red represents confidence,” “yellow means innovation,” and “gray equates to futurism,” among many others. The system then applies the principles of color psychology, analyzes academic studies on computational aesthetics, and explores the interactions between different messages using natural language processing techniques to produce a set of novel, visually appealing color palettes that communicate a chosen message.
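The core idea of mapping messages to base colors and then expanding them into a palette can be sketched simply. This is an illustrative toy, not IBM's actual model: the `MESSAGE_COLORS` table and the tint-based palette logic are hypothetical placeholders standing in for the learned associations described above.

```python
# Hypothetical message-to-color associations, echoing the examples in the text.
MESSAGE_COLORS = {
    "confidence": (255, 0, 0),     # "red represents confidence"
    "innovation": (255, 255, 0),   # "yellow means innovation"
    "futurism":   (128, 128, 128), # "gray equates to futurism"
}

def tints(rgb, steps=3):
    """Expand a base color into progressively lighter tints (toward white)."""
    return [tuple(round(c + (255 - c) * i / steps) for c in rgb)
            for i in range(steps)]

def palette_for(message):
    """Return a small palette for a known message, or an empty list otherwise."""
    base = MESSAGE_COLORS.get(message)
    return tints(base) if base else []

print(palette_for("confidence"))  # red plus two lighter tints of red
```

A real system would learn these associations from data and score candidate palettes for aesthetics, but the lookup-then-expand structure is a reasonable mental model for how a chosen message becomes a set of coordinated colors.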
Artist Stephen Holding used the system to create the recent World of Watson mural. The resulting palette of color combinations was based on the meaning he wanted to convey in his design. As you can see in the photo above, he might have tuned “creativity” to its highest level.
Our next goal is to give the system the ability to offer design suggestions, too. We think that training the system to understand other visual elements such as shape, texture, form, space, depth, and line will expand creative and artistic capabilities.
A perfumery recently reached out to us for help redesigning the bottle of a poorly selling cologne: it smelled good, but no one liked the packaging.
We’re also exploring with retailers ways to put this technology into the hands of shoppers. Think of it as a way to assist you in not just choosing styles, but the image you want to portray. From professional to casual, or even geographical region, the system can recommend garments and accessories with specific color combinations that match your message. That’s the kind of creative power of color we hope this technology unleashes.