Plan "Bee" — my personal quest to cross-pollinate smarter planet thinking inside and outside IBM — recently led me to reconnect with Gary Zamchick, a colleague from my early Internet days at Time Inc. New Media.
Gary had seen my post about Smarter Planet on the Time Inc. Alumni Group on LinkedIn. A technology-savvy, pioneering designer of interactive services and physical spaces (with Rockwell Group and AT&T, among others), Gary intuitively appreciated the implications of Smarter Planet, as well as the give-and-take approach of Plan Bee.
One characteristic of a Smarter Planet will be more intelligent ways for us to see and visualize information and data. Gary and I agreed that visual thinking is so central to human nature and so universal that it is bound to inform not just the next iteration of the World Wide Web, but its evolution into the "Web Wide World," i.e. the embedding of all things digital into the physical world and human lifestyles.
One of Gary’s projects — WordsEye, a semantic innovation that enables people to create 3D scenes or pictures simply by entering text descriptions — spoke directly to this visual aspect of Smarter Planet. Imagine being able to build a 3D or virtual environment, or telling a story visually, just by describing it.
Natural language processing for graphical output is the kind of smarter planet I want to live on. If it can work across different languages, it could even be the basis for a kind of visual translation service, or enable those who can't physically speak to express themselves in entirely new ways.
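To make the idea concrete, here is a toy sketch of the text-to-scene concept behind a service like WordsEye: pull simple "size color object" phrases out of a sentence and emit a scene description a renderer could place. The grammar, vocabulary and output format here are all hypothetical, chosen just for illustration — WordsEye's actual semantic pipeline is far richer.

```python
import re

# Hypothetical mini-vocabulary for the sketch (not WordsEye's own).
SIZES = {"small": 0.5, "large": 2.0}

def parse_scene(text):
    """Turn a description like 'a large red cube next to a small blue
    sphere' into a list of object dicts a 3D renderer could place."""
    objects = []
    # Match an optional size word, an optional color word, then a noun.
    pattern = r"\b(?:(small|large)\s+)?(?:(red|blue|green|yellow)\s+)?(cube|sphere|table|tree)\b"
    for size, color, noun in re.findall(pattern, text.lower()):
        objects.append({
            "shape": noun,
            "color": color or "gray",       # default color when unspecified
            "scale": SIZES.get(size, 1.0),  # default scale when unspecified
        })
    return objects

scene = parse_scene("A large red cube next to a small blue sphere.")
```

Each parsed object carries just enough attributes (shape, color, scale) for a renderer to position it; a real system would also resolve spatial relations like "next to" and "on."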
Gary used the service to do something along these lines, turning my shorthand description of Smarter Planet into a 3D image: that it is about the "3V" changes in data, the greater volume, variety and velocity of data that will often require real-time processing power.
While Gary and his co-founders are looking for investors and partners to advance WordsEye, I offered to connect him to some of the amazing IBMers pioneering 3D, virtual and data visualization innovations.
One of them is Many Eyes, which the site describes as "a bet on the power of human visual intelligence to find patterns. Our goal is to 'democratize' visualization and to enable a new social kind of data analysis."
IBM and the New York Times recently started collaborating on Many Eyes via the Times’ Visualization Lab.
Enter the vConference
Our buzzing around visualization and collaboration led the discussion toward what might be considered the killer business app for 3D: a seamless and elegant virtual meeting solution. In today's environment of global teams and endless teleconferences, what's missing from that work experience is the value of face-to-face, highly human interaction.
Webcam-based video conferencing (see Tokbox, ooVoo and the new Google video chat for Gtalk) is going to be one way to humanize the disembodied, attention-deficit experience of audio-only conference calls, but the ultimately richer (and smarter) solution is going to be even more immersive: a one-click, highly automated way of bringing teams together in a virtual meeting.
Not only would this offer globally dispersed knowledge workers most of the social and human benefits of F2F, with less opportunity to be distracted, but it would also enable real collaborative work to get done, à la whiteboards and other tools. What's more, because the environment is not just immersive but also digital, everything can be captured and shared.
To this end, IBM has been discussing the idea of Sametime 3D, a way to transition from IBM's instant messaging environment (Sametime is already arguably one of the most pervasive collaborative tools IBMers use) to a virtual meeting. The key, IMHO, will be for the experience to be dead simple: natural and intuitive controls, automated generation of avatars, and sophisticated voice chat.
Are You Experienced?
Our back-and-forth moved on to other fronts, such as Gary's experience designing physical environments for generating innovation and creative interaction.
That reminded me of IBM’s Innovation Discovery program, which brings together diverse teams of experts and executives from IBM and a client for a "collaboratory", an intensive brainstorming session designed to solicit big ideas for new ways to work together. Of course, IBM also has briefing centers and industry labs around the world to facilitate the same kind of knowledge exchange that Gary and I were engaging in.
Lastly, all the talk about making physical spaces smarter and more embedded with digital sentience led us to discuss the important, if awkwardly coined, idea of "augmented reality," or AR. Basically, AR is about overlaying, projecting or embedding digital, 3D or Web-based content in the physical world, usually via GPS-based location awareness. It could take the form of a heads-up display that makes driving directions appear to be actually on the road, rather than on the navigation screen in the dashboard.
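A small sketch of the GPS math underneath that kind of overlay: given the device's position and compass heading, compute the bearing to a point of interest, then the signed angle between where the camera points and that target, which a renderer can map to a horizontal screen position. The coordinates below are illustrative; the bearing calculation is the standard great-circle initial-bearing formula.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2
    (standard great-circle formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def overlay_offset(device_heading, target_bearing):
    """Signed angle in (-180, 180] between the camera's heading and the
    target; an AR renderer maps this to where the label is drawn."""
    return (target_bearing - device_heading + 180.0) % 360.0 - 180.0

# Illustrative positions: device facing due north, target due east of it.
b = bearing_deg(40.7580, -73.9855, 40.7580, -73.9680)
offset = overlay_offset(0.0, b)  # roughly +90 degrees, i.e. to the right
```

An on-road directions display would add altitude and camera field-of-view on top of this, but the core location-awareness step is just this bearing-and-offset calculation.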
Let’s Get Digiphysical
To get to this "everyware" world of ubiquitous computing and "digiphysical" convergence, one prerequisite will be a kind of universal broadband wireless. On that front, Gary and I agreed that something like the new "white spaces" spectrum recently approved by the FCC could be the beginning of the pervasive "wifi on steroids" that a smarter planet is going to need. This spectrum is made up of the unused frequencies between TV channel broadcast signals. Broadcasters had opposed opening it up because they feared interference problems.
But as the United States wrestles with how to ramp up its broadband infrastructure more rapidly, this could be one path toward a smarter wireless revolution.
Your Friendly Neighborhood Smart Planeteer,
IBM Global Business Services