Instrumented. Interconnected. Intelligent.
Archive for January, 2011

The White House launched the Startup America Partnership today: a public-private effort aimed at accelerating high-growth entrepreneurship in the United States. As one of the companies in the partnership, IBM committed to invest $150 million in 2011 to fund Continue Reading »


By Harriet Pearson

Today’s society is built on the fast flow and analysis of bits and bytes of information. The strides we make in gathering, routing, and analyzing this torrent of data holds the promise of an ever-brighter future. Still, behind these data are real people, real organizations, and real concerns, so we need to reconcile the competing goals of free information flow and individual privacy.

To support the ongoing discussion of this critical issue, IBM is joining other leading global businesses, non-profits, individuals and governments in celebrating international Data Privacy Day.

Digital privacy can seem elusive. If you’re a consumer, it can be difficult to figure out what information companies have about you, where they’re getting it, or how they might use it. If you’re a business or government leader, it can be hard to figure out how to responsibly use personal data to do things such as conserve energy, reduce traffic congestion and suppress crime.

What’s at stake? Plenty. Getting data privacy “right” is an economic and social imperative. Trust and confidence in the security and privacy of the critical systems of our planet – especially the digital version of its central nervous system, the Internet – are foundational to individuals’ continued engagement with and reliance on such things as online commerce, e-health and smart grids. If individual consumers don’t feel that their privacy and security are protected, they will not support modernization efforts, even though the capabilities of technology advancements are proven and the potential benefits to society are extensive.

Here’s an example of the tensions we face: The ability of smart grids to conserve resources relies on the ability of, and commitment from, consumers to monitor and modify their individual usage. An individual using a smart meter understands the difference in the cost of using electricity at peak versus non-peak hours and could opt to lower their usage during more costly time periods. At the same time, data from the meters can reveal sensitive information such as work habits, shower schedules, use of medical devices such as dialysis, and whether or not a house is occupied.
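The kind of inference that makes meter data sensitive is easy to sketch. The toy example below (hypothetical readings and a made-up baseline, not any utility's actual algorithm) shows how a simple threshold over hourly usage can suggest when a home is occupied:

```python
# Illustrative sketch only: inferring likely occupancy from hourly
# smart-meter readings with a simple standby-load threshold.
hourly_kwh = [0.3, 0.3, 0.2, 0.3, 0.4, 1.1, 2.4, 1.8,  # midnight-8am
              0.4, 0.3, 0.3, 0.4, 0.3, 0.4, 0.3, 0.4,  # 8am-4pm
              1.9, 2.6, 3.1, 2.2, 1.4, 0.9, 0.5, 0.4]  # 4pm-midnight

baseline = 0.5  # assumed always-on load (fridge, routers) in kWh

# Hours where usage exceeds the standby baseline suggest someone is home.
occupied_hours = [h for h, kwh in enumerate(hourly_kwh) if kwh > baseline]
print(occupied_hours)  # → [5, 6, 7, 16, 17, 18, 19, 20, 21]
```

Even this crude version reveals a morning routine and an evening at home; real interval data at finer resolution reveals far more.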

So, how does society move ahead with smart grids and other technology advances that rely upon individual or personal data, while addressing consumer privacy?

Continue Reading »


Because we live our lives digitally, there’s no turning back the clock to simpler times when personal data was locked up in file cabinets and bank safe deposit boxes.  We spew information about ourselves into the cybersphere via Facebook, LinkedIn, Twitter, Web sites requiring registration, personal identity cards and other kinds of smart cards. But is there a way that we can dole out information in small, controllable pieces–just enough to get things done but not a byte more?

The answer is yes, and a European research consortium is leading the way to delivering this capability on a mass scale.

The consortium, called ABC4Trust, is building safeguarding systems based on privacy-protecting technologies from IBM and Microsoft.  It plans to test the systems at a university in Greece and a secondary school in Sweden. The technologies, called Attribute-Based Credentials (which is where the ABC in the name comes from), make it possible to build Web services and electronic ID systems that gather just enough information to authenticate people’s identities, qualifications and permissions–but no more.
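The cryptography behind attribute-based credentials relies on zero-knowledge proofs and is far beyond a few lines of code, but the data-minimization idea can be sketched. In this toy Python model (the wallet, attributes and predicates are all illustrative inventions, not the ABC4Trust or Identity Mixer API), a service learns only the yes/no answer to a question, never the raw attributes:

```python
# Toy sketch of the data-minimization idea behind attribute-based
# credentials. Real systems prove predicates cryptographically; here a
# user-held "wallet" simply answers them without revealing raw data.
from datetime import date

class CredentialWallet:
    def __init__(self, attributes):
        self._attributes = attributes  # stays with the user, never sent raw

    def prove(self, predicate):
        # The relying service receives only the boolean answer.
        return predicate(self._attributes)

wallet = CredentialWallet({"birthdate": date(1990, 5, 1), "student": True})

# A university portal needs "is a student" -- nothing else.
assert wallet.prove(lambda a: a["student"])

# An age-restricted service needs "over 18" -- not the birthdate itself.
is_adult = wallet.prove(
    lambda a: (date(2011, 1, 1) - a["birthdate"]).days > 18 * 365)
print(is_adult)  # → True
```

The point of the real technology is that the "yes" is cryptographically verifiable without the wallet being trusted; this sketch only shows what the service does and does not get to see.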

Continue Reading »


By Stephen Baker

Picture the smart, unassuming person at a meeting, who says, “Well, I’m no expert, but once I saw this case where….” That person is doing something that until recently was uniquely human: Soft-pedaling an idea.

Human beings can soft-pedal because we know what we know (or at least think we do). We also know what we don’t know. And then there’s this entire domain of knowledge in which we know a thing or two. That gray area in the middle is important, because that’s where we can dabble. We can come up with insights and discuss them with people who are better informed. This process widens a discussion beyond the cloistered world of experts. It can lead to insights, the generation of hypotheses, and innovation.

Steve Baker


One of the very special aspects of Watson, IBM’s Jeopardy! computer, is that it “knows” what it doesn’t know–or, more precisely, simulates this knowledge through statistical analysis. Looking beyond Jeopardy! to Watson’s career in business, this gauge of its confidence is one of its most valuable features.

Say Watson is working in a hospital emergency room. A person comes in with a combination of symptoms that no one has seen before. Someone lists them for Watson. (The machine doesn’t have voice-recognition now, but that could be engineered in a matter of days.) Watson scours its base of documents and research papers, finds various combinations of these symptoms, and lists possible diseases or disorders that the person might have. Each one is accompanied by a confidence gauge. It turns out in this case Watson has only 14% confidence in, say, lupus.

That 14% amounts to a big shrug of Watson’s electronic shoulders. It does not know and is admitting as much. And maybe the doctors have done tests and know that it’s not lupus. But maybe below lupus, with only an 8% confidence rank, is some other disease that they hadn’t considered. It may be wrong. It may be idiotic. But it may also lead to a thought, a connection. After all, that 8% came from some combination of the symptoms that Watson found in its research. In effect, Watson–even in its ignorance–has come up with a list of hypotheses (along with pointers to its sources). If even one of these hypotheses nudges doctors toward a correct diagnosis, the machine has provided a service–even without “knowing” the answer.
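Watson's actual scoring pipeline is far more elaborate, but the way a confidence gauge separates answers from mere leads can be sketched in a few lines (the diseases, scores and thresholds below are all made up for illustration):

```python
# Hedged sketch, not Watson's real pipeline: rank candidate hypotheses
# by confidence, answer only above a threshold, and still surface weak
# hypotheses as leads worth a human look.
candidates = {"lupus": 0.14, "lyme disease": 0.08, "fibromyalgia": 0.05}

ANSWER_THRESHOLD = 0.50   # assumed cutoff for "confident enough to assert"
LEAD_THRESHOLD = 0.05     # weak hypotheses still shown, with their scores

ranked = sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)
answers = [(d, c) for d, c in ranked if c >= ANSWER_THRESHOLD]
leads = [(d, c) for d, c in ranked if LEAD_THRESHOLD <= c < ANSWER_THRESHOLD]

print(answers)  # → [] -- the electronic shrug: nothing it would assert
print(leads)    # the 14% and 8% hypotheses, handed to the doctors
```

The value is in the second list: an empty `answers` is the machine admitting ignorance, while `leads` is the set of hypotheses that might nudge a doctor toward a connection.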

Continue Reading »


Editor’s note: This is a guest post by Anna Fredricks, IBM Healthcare Industry market manager.

IBM’s Watson will analyze natural language questions, posed as Jeopardy! clues, on U.S. television, February 14-16.  What else would be possible in a Watson-enabled future?  What if Watson could help physicians find evidence and scientific support for diagnosis and treatment decisions? What if Watson could act as a physician’s assistant during patient consults, crunching data from similar cases as the physician takes voice notes?

While medical centers don’t have a Watson to interact with, as Jeopardy! host Alex Trebek will, the same Smart Analytics technologies that Watson relies on to answer the quiz show clues are available today and are being applied to solve healthcare problems.

Creating value from health data

While poor information still hinders the ability of healthcare providers to consistently and accurately diagnose patient conditions, today’s health analytics solutions are giving care providers, administrators and insurers new ways to understand what works, what does not work and why.

For example, the University of Ontario Institute of Technology saw a need to better detect subtle warning signs of complications in neonatal babies – but doctors needed greater insight into the moment-by-moment condition of those babies.


IBM health analytics solutions (in this case, a first-of-a-kind stream-computing platform) provide the insights that give caregivers the ability to proactively intervene in the case of complications, such as detecting infections in premature infants up to 24 hours before they exhibit symptoms. This capability provides an unprecedented ability to glean insights from vast amounts of data, to spot trends in real-time and to take action.
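The stream-computing idea can be sketched simply. This is not the actual IBM/UOIT algorithm; it is a toy monitor (with invented readings and thresholds) that flags a sustained drift in a vital sign as the data arrives, rather than after the fact:

```python
# Illustrative sketch of streaming early warning: flag the moment a
# vital sign has deviated from its baseline for several consecutive
# readings. Thresholds and data are hypothetical.
from collections import deque

def early_warning(readings, baseline=140.0, tolerance=15.0, window=5):
    """Yield each index at which `window` consecutive readings all
    deviate from `baseline` by more than `tolerance`."""
    recent = deque(maxlen=window)
    for i, reading in enumerate(readings):
        recent.append(abs(reading - baseline) > tolerance)
        if len(recent) == window and all(recent):
            yield i

# Simulated per-minute heart rate of a premature infant (made-up data).
stream = [142, 138, 141, 160, 162, 165, 161, 163, 158, 166]
alerts = list(early_warning(stream))
print(alerts)  # → [7, 8, 9]: sustained drift detected as it happens
```

A production stream platform processes thousands of such signals in parallel and correlates them, but the essential shift is the same: the analysis runs on the data in motion, not on a report compiled later.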

According to the April 2010 Harvard Business Review, doctors inaccurately or incompletely diagnose patient illness an estimated 15 percent of the time.

In another example, the University of North Carolina Health Care System decided to connect the complex and diverse information from across its operations.


Sources of UNC’s information range from patient admissions data, to lab results, to radiology images. For this information to be useful across all of its operations, UNC needed to first synchronize these disparate sources into a “single version of the truth” – in the form of a data warehouse later called the Carolina Data Warehouse for Health (CDWH).

“Our key challenge is that every part of the UNC system – the academic medical center, the school of medicine and schools within the university – as well as outside institutions, wants the data that the data warehouse offers,” said Dr. Donald Spencer, Associate Director of Medical Informatics for the UNC Health Care System.

Now, everyone from physicians to researchers can feed data, from patient records to X-ray images, into the CDWH. The data warehouse’s single point of access and healthcare informatics capabilities allow researchers, clinicians and administrators from other areas of UNC to analyze the data – leading to improved patient care.

A medical Watson

And in the future, doctors may be able to couple IBM’s currently available health analytics solutions with the Natural Language Processing (NLP) technology in Watson to discuss a patient’s exam results. A medical Watson could dissect information from similar cases, producing diagnosis suggestions, in real time, backed by evidence for the doctor’s review.

Health analytics solutions that exist today could benefit from advances in analytics that come as Watson’s team of researchers push the boundaries of the technology in this Grand Challenge.  While I’ll be watching Watson compete on Jeopardy! on February 14 and cheering for the team to win, I’ll actually be cheering for the progress this challenge will provide in terms of improved physician assistance and patient care.


Scientific breakthroughs can’t be scheduled. Sometimes things happen fairly quickly, like when IBM Zurich researchers Gerd Binnig and Heinrich Rohrer invented the scanning tunneling microscope, which for the first time allowed scientists to “see” individual atoms. The two physicists came up with the idea in 1981, demonstrated their invention two years later and won the Nobel Prize in Physics in 1986. In contrast, racetrack memory, a revolutionary new method for storing digital data, has come along more slowly. Its path from concept to reality demonstrates the patience and the passion that’s required of researchers who aim to produce world-shaking inventions.

Earlier this month, the scientists at IBM Almaden Labs who are working on the racetrack memory project reported an advance that gives hope that their efforts could pay off in the not-too-distant future.

Racetrack memory is a method for storing data on three-dimensional microchips. It promises to pack 100 times more data into the same space as either a hard disk drive or a traditional memory chip, at a much lower cost than either–while using much less energy. If and when racetrack memory is commercialized, it could represent a major advance affecting every aspect of IBM’s Smarter Planet agenda, including sensors, networks, mobile devices, and data analytics. For instance, it could make it possible to store every movie made in a year on a single portable device.

IBM Researcher Stuart Parkin


Almaden researcher Stuart Parkin had been thinking about alternatives to traditional memory and storage devices for many years, but the racetrack memory idea came to him about eight years ago when IBM sold its hard disk drive business. It occurred to him that the basic technology for the hard disk drive had been around for decades. The same was true for the technology behind memory chips. He saw that the most significant limitation to both methods was that they were two dimensional, limiting the amount of data that could be stored in a small space. “I asked, ‘Is there a better way of designing a disk drive without moving parts?’” he recalls.

Parkin came up with the concept of using a dense web of tiny magnetically charged nanowires attached to the surface of a silicon semiconductor for storing data. He combined that with the idea of reading and writing data via stationary sensors embedded in the silicon. The data is stored in so-called domain walls on the nanowires. Imagine racetrack memory as a tiny roller-coaster and the bits of data as people climbing on the cars (domain walls) and taking a wild ride up and down the nanowires.
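The roller-coaster analogy can be made concrete with a toy model: unlike a disk drive, where a head moves over the data, a racetrack shifts the data past a stationary head. This sketch is purely conceptual (real racetrack memory moves domain walls along the nanowire with current pulses; nothing here models the physics):

```python
# Conceptual sketch of racetrack memory as a shift register: the
# read/write sensor stays put, and the stored bits (domain walls)
# are shifted along the wire past it.
class Racetrack:
    def __init__(self, bits):
        self.bits = list(bits)   # domain-wall pattern along the nanowire
        self.head = 0            # the sensor is stationary

    def shift(self):
        # A current pulse nudges every domain wall one position along.
        self.bits = self.bits[1:] + self.bits[:1]

    def read(self):
        return self.bits[self.head]

track = Racetrack([1, 0, 1, 1, 0, 0, 1, 0])
out = []
for _ in range(len(track.bits)):
    out.append(track.read())   # read whatever is under the fixed head
    track.shift()              # then move the data, not the head
print(out)  # → [1, 0, 1, 1, 0, 0, 1, 0]: the stored pattern, in order
```

Eliminating the moving head is the point: with no mechanical parts, access is fast and the wires can be packed vertically into the third dimension.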

Because the invention was so novel, Parkin received his first patent quickly–in 2004. As he and his colleagues received additional patents, they began talking and writing about their inventions publicly–including in two papers in Science magazine in 2008. Their newest missive, also in Science, explains how they established that domain walls can be moved along the nanowires precisely enough to be used to store bits of data.

At this point, Parkin says, the racetrack memory concepts have been proven in experiments.  The next step is to create a prototype racetrack memory device  that could lead to mass production and commercialization. “We know there is no scientific roadblock in the way that could impede racetrack,” says Parkin. “The question is time and money.” He says it would be possible to produce a prototype in three years and commercialize the technology in five.

Will IBM come up with the funds to put racetrack memory on the fast track? Will it seek partners to help fund the project and spread the risk? These are questions I can’t answer.

But what’s clear already is that a novel technology has emerged into the light, and, if and when it comes to market, it could revolutionize computing and consumer electronics.

This story also illustrates one of the eternal truths of scientific exploration: Scientists stand on the shoulders of other scientists. Parkin and his research colleagues couldn’t have produced their breakthrough without the nanotechnology tools that resulted from Binnig and Rohrer’s invention of the scanning tunneling microscope.


Editor’s note: This is a guest post from John Lucas, Director of Park Operations for the Cincinnati Zoo and Botanical Gardens.

Sustainability is a critical issue throughout the world today, and will continue to be for many generations to come. According to Wikipedia, sustainability is loosely defined as the capacity to endure – and this can be interpreted in many different ways, as sustainability takes on social, economic and environmental dimensions. The conservation of animal and plant life remains at the forefront of sustainability efforts, and it is a mission and core focus of the many zoos around the globe that are responsible for the lives and well-being of animals.

At the Cincinnati Zoo, the main vision is inspiring passion for nature and saving wildlife for future generations. Our commitment to conservation is showcased through our global science and wildlife conservation programs, our sustainable approach to the management of our facilities, and our dedication to education and public engagement in science and conservation at our exhibits. The Cincinnati Zoo features more than 500 animal and 3,000 plant species, making it one of the largest zoo collections in the country, and we continue to set the standard for conservation, education and preservation of wild animals and wild spaces.
In order to successfully carry out our mission and overall vision, we need to make sure we can provide the best possible care to the animals, maintain and open new exhibits, and keep our visitors satisfied and returning on a regular basis to generate new revenue streams for an ongoing business. About a year ago, the Cincinnati Zoo faced an opportunity. More than 1.2 million people a year visit our exhibits, and that number was increasing on a consistent basis. Good news, but we had a bigger challenge longer term: how could we maximize the recent increase in attendance and raise guest spending? Additional revenue would allow our management to provide additional care for zoo animals and add new exhibits to keep up with growing demand.

To keep our own facility running in a sustainable fashion, we needed to make sure we were using our resources properly. We concluded that technology was the missing piece, and that is when we turned to what some might think of as a non-traditional helping hand for a zoo – business analytics. Almost immediately after going live with IBM analytics software, the growing mounds of information were turned into knowledge our staff could use to improve operations. We were able to increase in-park spending by as much as 25% by utilizing 360-degree customer views. We turned that information into customized offers and perks that keep our visitors happy and coming back, and we are now able to arm our managers with real-time data that allows them to react to a dynamic and fluid business driven by seasonal weather patterns.

The results? Business analytics has allowed us to integrate our operations, which means we are running a more sustainable business ourselves. This has helped free up our staff’s time so they can focus on day-to-day operations in a more meaningful way, while also focusing on the larger picture of ensuring our animals continue to receive the best care. Further, our revenue has increased by $350,000 per year, which enables us to dedicate more resources to the well-being of the zoo animals. In the end, everyone wins: our visitors get a more enjoyable experience, and we run a more efficient business that allows us to better promote our overall mission of protecting wildlife and promoting its education and conservation.

You can learn more about our story here.


Posted by Brandi Boatner

As the unofficial theme song of New York City (Jay-Z and Alicia Keys’ “Empire State of Mind”) plays, it sets the stage for the 100th National Retail Federation (NRF) Big Show, in an Expo Hall filled with thousands of people.

Jill Puleri, Global Retail Leader, IBM Global Business Services, kicked off the opening Super Session of the 2011 NRF conference. Puleri addressed the crowd of retailers on the importance of understanding and capitalizing on the smarter consumer.

Puleri’s opening remarks began with a history lesson on retail and IBM’s role in it. She discussed how the consumer has changed over the last 100 years and how IBM has fundamentally changed retail with calculating and tabulating machines and the creation of the bar code. The point of the history lesson: “The consumer is smarter today than ever before,” said Puleri.

In understanding the smarter consumer, technology is a given. Today’s consumers are leveraging mobile devices, Web 2.0, and sharing information on multiple social networks. Consumers are more connected and vocal about their needs and wants. Content for consumers is viral, immediate and highly influential.

According to Puleri, the smarter consumer does not want to be sold to; they want to be served. Sales associates must become service associates, and retailers must empower the consumer, placing them at the center of all retail operations.

Smarter retail means leveraging customer data. Retailers can use advanced analytics to better understand the consumer and what influences and motivates them to purchase goods.

As the NRF celebrates 100 years of excellence, retailers need to go back to the fundamental root of retail: service. Service is the foundation of the personalized customer experience, and IBM can help retailers offer that experience.

January 5th, 2011

Bruce Anderson

Editor’s note: The following is a guest post from Bruce Anderson, General Manager, IBM Global Electronics Industry

As I head out to the annual Consumer Electronics Show in Las Vegas, it occurs to me that a more apt name for this mega-event would be the Consumer “Experience” Show.

In the midst of commoditization pressures, heightened consumer expectations and ongoing digital convergence, Consumer Electronics (CE) companies need to take a long look in the mirror and ask themselves what business they are really in.

After all, today it’s not just about selling cameras; it’s about the business of creating and sharing memories. And it’s not just about selling mp3 players; it’s about a personalized music experience.

And while TVs of all sorts will be on display at CES, it’s no longer just about selling TVs. It’s about providing a personalized entertainment and shopping experience — TVs that can display not only Internet content but also tailored, targeted advertising, with the real-time ability to purchase items.

And this “experience” theme isn’t limited to the entertainment realm. It’s making its way into other areas such as the laundry room where the focus isn’t just on washing machines that can run when energy prices are lowest, it’s on communicating how much consumers are able to save. And it’s not just about dryers that can call for service before they break down, it’s about making sure that the serviceman shows up at the house to fix the dryer. It’s about giving consumers a way to be eco-friendly and simplify their lives.

CE manufacturers realize they need to be part of a new ecosystem wrapping a great experience around their products, and they need to find ways to monetize the consumer experience, not just the device.  Consequently, they are moving away from a business model focused on hardware to one based on complete solutions and services, and from engineering-driven product development to development that draws on user insights and analytics to drive product innovation.

We’re seeing the move from “point” products with simple functionality to products that are connected to large ecosystems, using sophisticated, embedded and secure software. We’re seeing shifts away from vertically integrated supply chains, and from a winner-takes-all approach to an ecosystem based on mutual sharing of risk and reward.

This is our new reality…and at the heart of this new reality is the shift from selling pure products to selling products and services to create new value.  What business are you in?


Ever since Harvard sociologist Daniel Bell published his book, The Coming of Post-Industrial Society, in 1973, there has been a strong sense of inevitability about the rise and dominance of services in the world’s advanced economies. And, in general, people have concluded that this is a good thing. But there’s danger lurking in services. At this point in their evolution, they’re less efficient and productive than modern manufacturing and farming. Also, while manufacturing took over 200 years before its “quality revolution,” services have only been dominant for a few decades and have yet to figure out quality. These issues  could mean big trouble not just for developed countries but for the entire global economy.

Some of today’s top thinkers about services are sounding alarms. Robert Morris, head of service research at IBM Research, says that unless services become more scientific and technical, economic growth could stagnate. Henry Chesbrough, the UC Berkeley professor who coined the term “open innovation,” says this is a major issue facing the world economy long term. He calls it the “commodity services trap.”

Underpinning their thinking is an economic theory called Baumol’s disease. The idea is that as services become an ever larger piece of the economy, they consume an ever larger share of the human and capital resources–but don’t create enough value, in return. Think of an electricity generation plant that consumes more energy than it produces. “Productivity and quality of services isn’t growing comparably to other sectors, including manufacturing and agriculture, so the danger is that it swamps the economy–employment, the share of GDP, and what people have to pay for,” says Morris. “The world economy could stall.”
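Baumol's disease is easy to see in a toy model with made-up numbers: let one sector's productivity grow while the other's stagnates, let wages track the productive sector, and watch the stagnant sector's share of total cost climb:

```python
# Toy illustration of Baumol's cost disease. All numbers are invented;
# the point is only the mechanism, not a forecast.
mfg_productivity, svc_productivity = 1.0, 1.0
mfg_growth, svc_growth = 0.03, 0.00   # assumed annual productivity growth
wage = 1.0

for year in range(30):
    mfg_productivity *= 1 + mfg_growth
    svc_productivity *= 1 + svc_growth
    wage *= 1 + mfg_growth            # wages follow the productive sector

# Unit cost = wage / output-per-worker, for one unit of each good.
mfg_cost = wage / mfg_productivity    # stays flat: productivity keeps pace
svc_cost = wage / svc_productivity    # rises with the wage
svc_share = svc_cost / (svc_cost + mfg_cost)
print(round(mfg_cost, 2), round(svc_cost, 2), round(svc_share, 2))
# → 1.0 2.43 0.71: after 30 years services eat ~71% of total cost
```

Haircuts and string quartets are the classic examples: they take as much labor as they did a century ago, yet their price tracks wages set in sectors that got vastly more productive. That is the sense in which stagnant services can "swamp" an economy.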

Developed nations are particularly vulnerable to Baumol’s disease. In Europe and the United States, a lot of progress has been made in the past decade in improving the efficiency of IT services, but other service industries are frightfully inefficient and ineffective: think government, health care and education.

So while adding jobs is vitally important to countries still reeling from the economic meltdown, if the jobs added are commodity service jobs, they will, in the long term, add to the inefficiency of the economy. That’s why governments need to invest aggressively in science, education and technology to improve services, in spite of their budget deficits.

One area that deserves investment is service science. It’s the academic discipline that IBM (with help from Chesbrough) began promoting in 2002. A multidisciplinary approach, service science addresses Baumol’s disease head on by using the ideas and skills of computer science, engineering, social science and business management to improve productivity, quality and innovation in services. Many of the techniques already developed in the computer, mathematical and information sciences can be directly applied to services. But new breakthroughs and better interactions with the behavioral and business sciences are also essential, because services are, and always will be, people-centric businesses.

Today, more than 450 universities worldwide offer some sort of service science program. But much more can and should be done to avoid falling into the commodity services trap. Otherwise, the post-industrial society could take on a post-apocalyptic tinge.
