Archive for June, 2010

LA has a reputation, deserved or not, for some of the worst traffic jams. Well, the pressure’s off. A first-ever global survey of motorists in 20 large cities conducted by IBM shows that when it comes to traffic, LA is practically commuter nirvana compared to some of the world’s other metropolises. On a scale of 0 to 100, taking into account such variables as commuting time, stuck-in-traffic time, and driving-caused stress, Beijing, Mexico City, and Johannesburg were practically off-the-charts painful, with scores of 99, 99, and 97. Meanwhile, LA scored 25, just six points higher than New York City. The best places were Stockholm, with a score of 15, and Melbourne, with 17.
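
Neither the survey’s exact variables nor its weighting are spelled out here, but the basic mechanics of a composite index like this are easy to sketch. The toy example below, with invented inputs, caps, and equal weights, shows how a handful of commute measures might be folded into a single 0-to-100 score; it is purely illustrative, not IBM’s methodology.

```python
# Toy sketch of a composite commuter-pain score on a 0-100 scale.
# The inputs, caps, and equal weighting are assumptions made for illustration;
# they are not IBM's actual survey methodology.

def pain_score(commute_minutes, stuck_minutes, stress_rating,
               max_commute=120, max_stuck=60):
    """Fold commute time, time stuck in traffic, and self-reported
    stress (0-10) into one 0-100 score."""
    # Normalize each component to 0-1, capping at an assumed maximum.
    commute = min(commute_minutes / max_commute, 1.0)
    stuck = min(stuck_minutes / max_stuck, 1.0)
    stress = min(stress_rating / 10.0, 1.0)
    # Equal weights are an arbitrary choice for this sketch.
    return round(100 * (commute + stuck + stress) / 3)

# A long, congested, stressful commute lands near the top of the scale;
# a short, smooth one lands near the bottom.
print(pain_score(commute_minutes=90, stuck_minutes=50, stress_rating=9))  # 83
print(pain_score(commute_minutes=30, stuck_minutes=10, stress_rating=3))  # 24
```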

The results of the survey point to a growing global need: Better management of transportation systems to get people where they want to go faster. “In the mega cities in fast-developing countries, they need to address these issues with a high level of urgency or their transportation systems will break down completely. Every street could become a parking lot,” says Naveen Lamba, the industry lead for intelligent transportation in IBM’s Global Business Services division.

Commuter Pain Index

The detailed results of the survey show that many of the efforts to take the pressure off highways aren’t taking hold. For instance, carpooling gets only low-single-digit participation in most of the cities. New Delhi, with 11%, and Johannesburg, with 8%, are a couple of the relative bright spots. More typical are Buenos Aires’ 4% and Houston’s 3%. In the United States, neither the establishment of HOV lanes nor commuter parking lots has made much of a difference. The ranks of telecommuters are sparse all over, too. Just 4% of those in Johannesburg work at home–the highest rate. It’s zero in Madrid, Moscow, Beijing, and Mexico City.

Indications are that the situations in some burgeoning cities will only get worse. Right now only 39% of commuters in Beijing drive their own cars, compared to 92% in LA. But the situation is changing fast. The number of new cars registered in Beijing in the first four months of 2010 rose 23.8% to 248,000, according to the Beijing municipal taxation office. Clearly, as more people in Beijing own cars, the authorities will have to add even more ring roads to the ever-growing network of highways encircling the city.

Fortunately, Beijing authorities aren’t counting on highway projects alone to address their exploding transportation needs. Beijing’s total investments in its subway system are projected to be nearly $50 billion through 2015 as the city more than doubles the system’s current reach, according to Beijing Infrastructure Investment Co., Ltd.

Across the globe, relief will come only when cities and metropolitan regions consolidate authority over all or most transportation modes in a single agency–or a small handful of agencies. Lamba says they need to coordinate the operations of everything from roads and bridges to ferries, trains, and subways. That way, they can put together a package of incentives and disincentives that redistribute commuters to different modes of transportation–with the primary goal of removing many one-person cars from the roads at peak travel times. Such an approach is working in Singapore and London, and is beginning to work in Dubai.

I saw one bright spot in the survey results that gave me a little bit of hope for the future: a handful of cities where large numbers of people bicycle or walk to work. For instance, 23% of Amsterdam’s commuters use bicycles as a primary mode of transportation; and 10% of the people in Buenos Aires walk. Unfortunately, in many cities, the places where people work and live have been divorced from each other, so there’s little hope of changing the situation in any meaningful way. Or, maybe that’s too bleak a conclusion. What if cities set up something like those airport people conveyors on sidewalks or streets? Weather’s a factor, sure, but maybe there’s a way it could be done.

[Video: Mexico City’s congestion problems]

June 25th, 2010 | Posted by Steve Hamm

Check out the very cool graphics the New York Times is doing on the World Cup. They demonstrate the power of combining data and visualization.


Editor’s Note: Following is an essay co-authored by Bob Sutor, vice president of open source and Linux for IBM, and Jean Staten Healy, director of cross-IBM Linux strategy for IBM. It describes the central place Linux plays in building a smarter planet, and builds on a presentation about the role of Linux in Smarter Systems, which the two IBM executives gave at the recent Red Hat Summit.

What do you think about when you read or hear the word “smart” applied to computers? How about a supercomputer? If any machine is smart, a supercomputer is, right? According to a study released by the University of California at Berkeley in May 2010, 470 of the 500 fastest supercomputers in the world run Linux, the open source operating system. That’s 91%. Evidently the people who decided to use Linux for these computers were pretty smart too.

As we think about all the ways we can work together to create a Smarter Planet, Linux has a very natural role. First, Linux runs on more kinds of hardware than any other operating system. So if we are talking about tying together disparate systems to deliver better, more accurate, and more predictive health care, Linux can power the hardware and software to maintain the information repositories, do the data mining, and perform the analytics. That is, Linux can help provide the intelligence we will need and expect in our complex and sophisticated 21st century systems.

Linux runs on the smallest devices all the way up to the fastest supercomputers, as noted above. Linux today powers smart phones, Netbooks, laptops, desktops, and servers in datacenters, but also home automation and many embedded systems. Linux will be at the heart of smart electrical grids that allow utilities to reduce waste, remotely manage and monitor use, and help reduce costs to consumers. Linux will increasingly be part of the instrumentation that provides the data we will use to tune and optimize not just our electrical grids, but also our water systems, supply chains, and factories, to name a few examples.

As the data is collected from the sensors, Linux can help ensure that it goes where it needs to go to do the most good. To reduce pollution, for example, cars need to be inspected and kept off the roads until they comply with emission standards. Linux can power websites where citizens pay fees and schedule inspection appointments in a low-friction manner. Once the inspections are complete, Linux systems can push the data to local and regional authorities, as well as to repositories and software that not only measure compliance but also perform data analysis. This will yield important information to further improve the system and reduce pollution even more. Our systems need to be more interconnected, and Linux can help them be so.
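
To make the analysis step above a bit more concrete, here is a minimal sketch of the kind of summary an authority might run once inspection results land in a repository; the record format and field names are invented for the example.

```python
# Minimal sketch: summarize emission-inspection results by region.
# The record structure and field names are invented for illustration.
from collections import defaultdict

inspections = [
    {"region": "North", "vehicle_id": "A123", "passed": True},
    {"region": "North", "vehicle_id": "B456", "passed": False},
    {"region": "South", "vehicle_id": "C789", "passed": True},
    {"region": "South", "vehicle_id": "D012", "passed": True},
]

def compliance_by_region(records):
    """Return the share of inspected vehicles that met emission standards, per region."""
    totals = defaultdict(int)
    passed = defaultdict(int)
    for rec in records:
        totals[rec["region"]] += 1
        passed[rec["region"]] += rec["passed"]   # True counts as 1
    return {region: passed[region] / totals[region] for region in totals}

print(compliance_by_region(inspections))  # {'North': 0.5, 'South': 1.0}
```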

Linux is global and supports many languages and locales. The tools needed to create a Smarter Planet must run in the heterogeneous environments that we have today. Linux is a big part of how we instrument, interconnect, and derive intelligence from the information around us. As we optimize the systems we have today and develop entirely new ones to solve problems in better ways, don’t be surprised to see Linux inside.

Dr. Robert S. Sutor: Vice President, Open Source and Linux, IBM Software Group
Jean Staten Healy: Director, Cross-IBM Linux Strategy, IBM Systems and Technology Group


“Prototypes were finished and tested; by late winter the design and development was complete; and in April, everything was up and running.”

Here’s another true story from IBM’s First-of-a-Kind (FOAK) program, which pairs IBM researchers with clients to bring incredible discoveries and possibilities into view.

[Video]

June 24th, 2010

[Video: IBM and BCBSMA Deliver New IT Solution]

The healthcare industry looks dramatically different than it did a year ago, giving rise to new opportunities for improvement all around—from how doctors capture and share medical information to how health insurance companies manage their costs. IBM is at the epicenter of this transformation, working with many of the major hospital systems and a large number of health insurance providers to navigate a new landscape and deliver on the promise of “Smarter Healthcare”.

Today, IBM is announcing it has been tapped by another major health plan organization, Blue Cross Blue Shield of Massachusetts (BCBSMA), to revamp its entire information technology (IT) infrastructure, encompassing everything from data center management to overseeing the applications portfolio. Central to the project is the primary member Web site, bluecrossma.com, which is designed to provide BCBSMA’s nearly 3 million members fast and secure online access to claims and general healthcare information. The new agreement is expected to generate approximately $16 million annually in savings for BCBSMA.

“The big driver for choosing IBM was the breadth and depth of capabilities IBM brings to the table,” said Bill Fandrich, chief information officer, Blue Cross Blue Shield of Massachusetts. “The IBM team proved over and over again their ability to bring different products and integrate them onto one platform. IBM provides a lot of flexibility for us to leverage IBM’s partners, whether it’s for development or whether it’s for other product capabilities. It’s very important in this day and age when we’re trying to bring solutions to the market that we have an IT provider that’s looking at things from the lens of what we need, not just at what they can sell us.”

BCBSMA is among several major health plan organizations that have recently turned to IBM. Kaiser Permanente last year tapped IBM to deploy smart systems and activate its highly sophisticated global delivery network to provide patients, members and physicians real-time access to medical data and tools whenever and wherever they need it. Similarly, IBM is working with National Account Service Company LLC (NASCO) to manage its entire IT system as well as work with IBM Research to modernize its claim processing system. IBM is working with CIGNA in a multi-year strategy to place an entirely new focus on improving customers’ experience, including tapping customer information to make interactions with the company more personalized. More recently, IBM signed a deal with athenahealth Inc., a leading provider of Internet-based business services for physician practices. IBM will enable athenahealth to focus its resources on simplifying and improving administrative and reimbursement processes while reducing staff workload, allowing physicians to focus on delivering higher quality care to their patients.

The work that IBM is doing is hardly limited to healthcare in the US. For example, in China, by integrating data from health records that combine Eastern and Western medicine and applying sophisticated analytics, doctors and nurses at Guang Dong Hospital of Traditional Chinese Medicine can figure out which treatment plans and techniques from each approach work best for specific diseases and medical conditions. In Italy, the Rizzoli Institute is transforming treatment of hereditary bone disease by using analytics, while Europe’s non-profit research alliance EuResist Network GEIE is working with IBM to improve treatment for HIV patients using a prediction engine that simulates the intervention of HIV treatment drugs within the human body.

While other companies are making claims about helping shape the future of healthcare, IBM is actually doing it. IBM is making systems “smarter” for healthcare payers and providers as they are faced with new pressures to improve efficiency and create new business models to serve their clients while holding the line on costs. IBM is helping companies transition from outdated IT environments to more modernized systems capable of analyzing data and predicting errors, as well as integrating data so doctors, patients and insurers can share information seamlessly, securely and efficiently. IBM is creating a smarter, more connected healthcare system that delivers better care for everyone involved.


How do you involve thousands of people in shaping how a city should be run? In a word, Jam.

Coventry City Council in the UK is running the first city-based Jam, taking the conversation beyond city leaders to its citizens. Why? Because none of us have all the answers. Pulling from a wider pool of people, experiences, backgrounds and expertise will give Coventry an edge in finding out what it needs to provide to its people and businesses.

[Image: Coventry, England, via Wikipedia]

Coventry CC is calling their event the CovJam, and it will be taking place on 29 and 30 June and 1 July.

It looks to be a great event for the people in and around Coventry (or even from Coventry) to shape the way they live. A real step on the way to building a smarter city in the heart of the UK.

If you would like to be one of those taking part, please e-mail communications@coventry.gov.uk with the subject line “CovJam”.

What’s a Jam?

A Jam is an online discussion (think brainstorming on an epic scale) around a group of pre-selected themes that an organisation wants to find innovative answers to. Within the themes there are many discussions happening at once. The event is driven by specially invited subject matter experts, stakeholders and hosts, who help highlight interesting and valuable contributions from people like you and me taking part in the Jam.

As you may know, we have had lots of Jams at IBM; they have become part of the culture. This one is a mini-Jam, essentially a more focused Jam with fewer themes.

This Jam will be covering the following themes:

  • The rebirth of Coventry: The urban design for a future city. What do we do to the centre of Coventry to make people want to live here, work here, shop here, socialise here?
  • Sent to Coventry: Be inventive. What does Coventry really want to be known for?
  • Aspiring Coventry: Yes we can! Aiming high and fulfilling our potential. How can the people of Coventry believe in themselves and their city?
  • Community Cohesion: Getting on together and celebrating diversity. As the city continues to grow and change, will it remain relaxed and at peace with itself, with its citizens feeling a strong sense of place and able to get along with each other?
  • Citizens in the driving seat: The relationship between the state and the individual

Good luck to everyone taking part – Jams are usually a blast.

More details on the Coventry CC web site.


June 21st, 2010

[Photo: Peter Hirshberg in the Gray Area Gallery]

IBM isn’t the only organization that thinks cities could be a lot “smarter.” For six years, students and faculty members at MIT’s SENSEable City Lab have been investigating the potential for digital technologies to improve the experience of living in cities. They’ve completed dozens of projects around the world. The Gray Area Foundation for the Arts, a non-profit in San Francisco focused on the intersection of digital art and social progress, recently opened a show, senseable cities, highlighting 15 of the Lab’s projects. Very cool stuff.

The concept behind these projects is simple: gather data about city life from a wide variety of sources, crunch it, and display it in visual forms–so it has maximum impact. One of the projects, Trash Track, uses cellular GPS tags attached to various kinds of refuse to follow each item’s path from the dumpster to its ultimate resting place. Another, Copenhagen Wheel, captures traffic and pollution data gathered from bicycles. A third, Real Time Rome, uses mobile phone usage patterns to show the movement of people after sporting events in the city.
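
The gather-crunch-display pattern behind these projects can be sketched in a few lines. The example below, with randomly generated location pings standing in for real sensor feeds, bins points into a grid and renders a density map with matplotlib; the data, coordinates, and grid size are all made up for illustration.

```python
# Sketch of the gather -> crunch -> display pattern: bin location pings
# into a coarse grid and plot the result as a density map.
# The data here is randomly generated; real projects ingest live sensor feeds.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# "Gather": simulated (latitude, longitude) pings clustered around two hot spots.
pings = np.vstack([
    rng.normal(loc=[41.89, 12.49], scale=0.01, size=(500, 2)),  # city centre
    rng.normal(loc=[41.93, 12.45], scale=0.02, size=(300, 2)),  # stadium area
])

# "Crunch": count pings per grid cell.
density, lat_edges, lon_edges = np.histogram2d(pings[:, 0], pings[:, 1], bins=40)

# "Display": render the counts so hot spots are immediately visible.
plt.imshow(density.T, origin="lower", cmap="hot",
           extent=[lat_edges[0], lat_edges[-1], lon_edges[0], lon_edges[-1]])
plt.colorbar(label="pings per cell")
plt.title("Simulated activity density (illustrative only)")
plt.show()
```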

I saw the show last week along with a handful of IBMers and creative agency colleagues. Our guide was Peter Hirshberg, a former Apple executive and serial entrepreneur who is on the foundation’s board of directors. He told us that bringing together data, analysis, and visualization “puts the science back in social science. You can begin acting on this stuff.”

One of the most empowering aspects of the projects is that citizens don’t just see their world mapped out in new ways, they participate in the mapping–which increases their commitment to making change happen in their communities.

The Gray Area gallery is located in San Francisco’s Tenderloin neighborhood, the city’s longtime red-light district–a gritty area with a high poverty rate and a lot of homeless people. In fact, the gallery is housed in a former porn video parlor, and still has the funky “Arts Theatres” marquee out front.

One of the foundation’s goals is to improve the neighborhood. In connection with the senseable cities show, it teamed with a local public television station (KQED) and other community organizations to sponsor an event called CITYCENTERED, a symposium, workshops, and neighborhood walk aimed at getting people engaged in the community. One piece, for example, Urban Remix, is a participatory media project that uses mobile phones as a platform for capturing the sounds and images of city neighborhoods–useful for documenting noise pollution and other obnoxious messes.

For people like me, who love cities but wish they were a bit more livable, this stuff is exciting. If you want to learn more or get involved, the show runs until Aug. 11 at the Gray Area gallery, 55 Taylor Street. But if you can’t see the show there, it will be traveling to San Jose and Amsterdam later this year and to New York, Tokyo, and other places next year.


June 18th, 2010 | Posted by Guest

Following is a guest post from David Miller, DePaul University.

In preparing for our launch of the new DePaul University Center for Data Mining and Predictive Analytics, I did a simple Google search and typed in the word “data.” Thanks to Google’s suggestion capability, I didn’t even have to finish typing before the first suggestion popped up as “Data Mining.”

You see, data mining is incredibly important; you don’t have to take my word for it. And, whether we like it or not, analytics is in our future.

This year alone, there will be 1,200 exabytes of data generated from sensors, electronic forms, audio and video clips, e-mail, blogs, social networks, web searches and financial transactions. Data is streaming at us, and from us, in all directions. Businesses and governments alike are grappling with the challenge of making sense of this data deluge to turn it into new opportunities, increased performance and faster, better decision-making.

The power of analytics is transforming this information into a strategic asset. But having the best, most complete and up-to-date information is useless if you can’t make sense of it. I’ve always said that data unanalyzed is data wasted. Therefore, businesses and governments need two very important things to make this happen: the right technology and employees with the right expertise and skillsets.

To help organizations tackle these challenges and give the next generation of knowledge workers the competencies they need, DePaul University, in collaboration with IBM, has launched a new Center for Data Mining and Predictive Analytics, as well as a Masters in Predictive Analytics program. Opening in September 2010, this applied research center will train future leaders in data mining and predictive analytics, and meet increasing demand for experts who can apply this technology to problems such as traffic management, energy management, public health planning and city services.

The Center represents a central point of contact between industry and academia, preparing students for future jobs, enabling collaboration between researchers and spreading the gospel, so to speak, about the value and benefit of predictive analytics.

DePaul University is on the cutting edge of creating a needed supply of incredibly intelligent professionals fluent in computational and analytical skills, with the business knowledge necessary to enhance business processes — from customer acquisition in a marketing department to fraud detection in an insurance company, among the myriad business problems to solve.

Today’s young people who master this area will be in high demand. In fact, they already are. And, thanks to participation from leading technology companies such as IBM, which is donating resources in the form of guest lecturers, this collaboration is poised to give students a highly marketable skill and the rare opportunity to acquire real-world knowledge that should benefit them — and society — for years to come.

David Miller, Dean of the College of Computing and Digital Media at DePaul University.



It turns out that at least one aspect of researching DNA, atoms, chemicals and minerals is surprisingly old fashioned. More to the point, scientists looking to produce pharmaceuticals, electronics and novel materials have been dependent on rather manual, time-consuming techniques to perform their work. But a volunteer network of about 1.5 million PCs, all chugging away in tandem, may be changing all of that.

I should probably explain.

When scientists research the characteristics of microscopic matter, they need to turn a soupy brew into more solid 3D crystals for easier examination by x-ray.  But doing this on a large scale is remarkably tedious and very hit-or-miss.

Then actually confirming that the samples have properly crystallized is a throwback to the nineteenth century, as researchers need to examine each one individually.

Now, if this doesn’t seem like the best use of scientists’ time, you’re right. But until recently, computers weren’t smart enough to verify that crystallization had occurred.  There the scientists and technicians sat — staring intently at pipettes and test tubes to see if their samples were ready for further analysis.

So, scientists at the Ontario Cancer Institute, working on a project called “Help Conquer Cancer,” trained a computer system to automate and speed up the process. They’ve gotten it to the point where it’s able to accurately recognize 80% of the crystallized samples it scrutinizes. This lets them look at six times as many images per protein, and in dramatically less time.
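
The article doesn’t describe the classifier itself, but the general technique, training a supervised model on labeled image-feature vectors and then scoring new samples, can be sketched as follows. The data here is synthetic, and scikit-learn’s random forest is just a stand-in, not the model the Ontario Cancer Institute team actually used.

```python
# Hedged sketch of the general technique: train a classifier on labeled
# image-feature vectors to flag likely crystallized samples.
# Synthetic data; RandomForest is a stand-in, not the researchers' actual model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Pretend each image has been reduced to 50 numeric features (edges, textures, etc.).
n_samples, n_features = 2000, 50
X = rng.normal(size=(n_samples, n_features))
# Synthetic labels: 1 = crystallized, 0 = not, loosely tied to a few features.
y = (X[:, :5].sum(axis=1) + rng.normal(scale=1.5, size=n_samples) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Accuracy on held-out samples plays the role of the "80% recognized" figure above.
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```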

It’s kind of a big deal. An article about the breakthrough, authored by the lead researchers of the Help Conquer Cancer project, was recently published in the Journal of Structural and Functional Genomics.

In earlier stages of automating the process of verifying crystallization, computers were accurate about 70% of the time, at best. And they could examine only about 850 features of a protein, compared to about 15,000 now capable of being analyzed.

Even if a person tried assessing one image per second, it would take 1,333 days to examine all 100 million images of 12,500 proteins captured in the course of 19.2 million experiments involved in the project. Human evaluations can also be inconsistent. And they get cranky.
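
That day count is easy to sanity-check with simple arithmetic. If the 19.2 million experiments yielded roughly six images apiece, an assumption on my part, the total comes out a bit above the rounded 100 million figure, and at one image per second around the clock the numbers work out to about 1,333 days:

```python
# Back-of-the-envelope check of the "1,333 days" figure.
# Assumption (not stated in the article): ~6 images per experiment,
# which puts the image total a bit above the rounded "100 million".
experiments = 19_200_000
images_per_experiment = 6                     # assumed for the estimate
images = experiments * images_per_experiment  # ~115.2 million images

seconds_per_day = 24 * 60 * 60                # assessing one image per second, non-stop
days = images / seconds_per_day
print(f"{images:,} images -> about {days:,.0f} days")  # about 1,333 days
```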

But here’s the kicker: the advance could potentially be applied to other research projects involving crystallization. And for those projects that use other means to determine crystallization, this system could help validate their accuracy. One such effort is another World Community Grid project to develop hardier, healthier rice.

Believe it or not, a traditional supercomputer wasn’t used to create and power this verification system. Instead, researchers used the World Community Grid. For the uninitiated, the Grid aggregates the unused cycle time of 1.5 million personal computers donated by hundreds of thousands of volunteers in more than 80 countries. It’s actually the world’s largest public humanitarian grid, equivalent in strength to one of the world’s most powerful supercomputers.

IBM runs the Grid with contributions of software, hardware and know-how. As with open source software, scientists agree to freely share results and techniques developed from grid projects.

Other projects running on World Community Grid include: Discovering Dengue Drugs Together; Help Fight Childhood Cancer; FightAIDS@Home; Nutritious Rice for the World; and an effort to help cure muscular dystrophy and other neuromuscular diseases.

If you want to get in on the action, and donate the surplus processing power you never knew you had, just register at World Community Grid’s Web site. You can also become a fan of the Grid on Facebook. You’ll be installing a free, unobtrusive and secure application on your PC running either Linux, Microsoft Windows or Mac OS. You won’t ever know that it’s quietly doing its thing.

Solving the world’s problems while you surf the Web. How’s that for multi-tasking?


Last July 15, John E. Kelly III, director of IBM Research, conducted a daylong meeting of directors and department heads at lab headquarters in Yorktown Heights, New York.  At one point, a presenter showed a map of the globe with dots over most of it indicating the locations of research collaboration projects. One glaring exception was Latin America. Kelly sat near the front of the room with his legs crossed and his chin resting on his hand. “Look, guys. South America. Nothing yet,” he said, gesturing at the map. “You’ve got to get started.”

Well, get started they did. Today, IBM announced its newest laboratory location, in Brazil–our ninth global research center. This is a rare occurrence. The last new lab was established 12 years ago, in India. (Other research facilities are in the United States, Switzerland, Israel, China, and Japan.) “A research lab is something special,” says Robert Morris, a vice president at IBM Research. “We have about 400,000 employees worldwide, of which about 3,000 are in research. This is the tip of the arrow.”

Kelly and his colleagues chose Brazil because of its large and high-caliber talent pool, its excellent universities, and the size and growth rate of the IT market. Brazil’s technology priorities also align closely with our Smarter Planet agenda. The lab will focus initially on three areas of research: 1) improving natural resource discovery, exploration, and logistics; 2) developing smarter devices for use in sensor networks; and 3) producing technologies for managing large-scale events–including huge gatherings of people such as the World Cup and Olympic Games, which Brazil will host in 2014 and 2016, respectively. Brazil’s central government is co-investing along with IBM.

The natural resource project is particularly intriguing–especially at a time when the world is closely watching the response to the massive oil spill in the Gulf of Mexico. “We’ll need massive amounts of sophisticated logistics and simulation technologies to help prevent the accidents of the future,” Morris said.

The new lab has people working in existing IBM offices in Sao Paulo and Rio de Janeiro while we work with governments to choose a permanent location. Eventually we plan to have a staff of more than 100.
