The dog days of summer are upon us, and while for some of us that means more carefree days of fun in the sun, many of us associate the summer months with severe and sometimes unusual weather events. These events pose increasing economic and societal impacts, and the challenges can leave us wondering what other weather-related surprises could be around the corner.
According to a recent report released by NOAA, the 12-month period from July 2011 to June 2012 was the warmest on record for the contiguous United States (since record keeping began in 1895). To make matters worse, these blistering heat waves are expected to increase due to a changing climate. In a recent publication, a group of Stanford University scientists predicted that certain regions of the world could start to see “permanently” hotter summers in just a couple of decades.
It is important to note that a warmer atmosphere contains more energy and can hold more moisture, which can lead to greater volatility and intensity of extreme weather events. Hence, it may be no coincidence that each day seems to bring a new headline about another weather-related disturbance. These events are to blame for countless fatalities and billions of dollars annually in property damage and loss.
Deep Thunder, IBM’s high resolution weather modeling technology, provides the granularity needed to improve preparedness. Using both historical and near real-time data, sophisticated analytics software and ever more powerful supercomputers, we can get extremely accurate weather forecasts and the impacts of severe events for specific locations (less than a mile) up to three days in advance.
Just recently, a rare type of severe windstorm for the eastern half of the US called a “derecho” blew through the Midwest and Mid-Atlantic areas. Not only was it unexpected by local authorities, but it also left millions to deal with sweltering temperatures without power or air conditioning.
Of course, such high temperatures over long periods have been linked to drought. This year alone, drought has probably cost the US economy tens of billions of dollars. Such heat waves also harm the health of individuals and increase the likelihood of wildfires. So far in 2012, fires have burned about 2.4 million acres, according to the National Interagency Fire Center, and the outlook for the remainder of the summer is grim.
And while we haven’t been hit with any major hurricanes so far this year, we are just six weeks into a six-month hurricane season. Only twice before since such records have been kept (1887 and 1908) have two tropical storms formed before the official start of the season on June 1. Further, 2012 is the first year in which four named storms had formed by June 23, and this is the highest level of tropical storm activity, based upon strength and duration, since 1968. Therefore, we could have some significant events later this season.
In an ideal world, we would be better prepared for the potential impact of extreme weather events. Traditional weather forecasts provide a look into broad weather patterns, but given the technology that is available to us today, we can do even better.
Deep Thunder can provide longer advance notice of adverse weather conditions, allowing more time for disaster prevention. Rather than monitor a storm, we can stage resources at the right place and time prior to an event to minimize the impact and save lives. For example, we could have provided a detailed, 18-hour warning for the derecho that impacted Maryland, Virginia and the District of Columbia late last month.
The practical applications for such specific and timely forecasts are nearly limitless. Detailed weather forecasts could help utilities better prepare for demands on the grid, such as pockets of high load during a heat wave, and anticipate conditions that could cause power outages so they can proactively deploy repair crews. Firefighters can anticipate the direction of a wildfire to prevent further spreading. Cities can plan more effective responses to heat waves to protect their citizens from extreme conditions such as loss of power, or mitigate the impact of flooding. Highway patrols can even anticipate buildup and redirect traffic in the case of evacuations. In all of these cases, Deep Thunder uses coupled modeling: an advanced weather model focused on a specific area of importance is connected to sophisticated analytical techniques to predict and visualize the impacts of weather on businesses and citizens.
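The coupled-modeling idea can be sketched in miniature: a fine-grained weather forecast feeds a downstream impact model that turns forecast conditions into operational decisions. This is a toy illustration of the pattern, not Deep Thunder's actual code; all function names and thresholds below are hypothetical.

```python
# Toy sketch of coupled modeling: a weather model's output drives an
# impact model. Numbers and names are illustrative, not Deep Thunder's.
def forecast_wind_mph(hour: int) -> float:
    """Stand-in for a high-resolution weather model's hourly wind forecast."""
    return [15.0, 40.0, 70.0, 30.0][hour % 4]

def outage_risk(wind_mph: float) -> str:
    """Toy impact model: map a forecast wind speed to a crew-staging decision."""
    if wind_mph >= 60:
        return "stage repair crews"
    if wind_mph >= 35:
        return "place crews on alert"
    return "normal operations"

# Couple the two models: each forecast hour yields a staging decision,
# giving a utility advance notice before conditions deteriorate.
decisions = [outage_risk(forecast_wind_mph(h)) for h in range(4)]
```

The value of the coupling is that the decision (staging crews) is made from the forecast, hours before the 70 mph winds arrive, rather than from observations after the damage is done.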
The weather affects much of our daily lives — everything from sports to produce prices — and although we don’t have the technology to change it, at the very least, we can better plan for it.
We live in an increasingly interconnected world where information, goods and people flow between geographical regions with unprecedented porosity. Viruses, essentially packets of biochemical information, are no different; with today’s ubiquity of cross-border transportation, their transmission can take place faster and over a greater area than ever before. Unlocking the mechanisms of these viruses is of growing importance for both human wellbeing and our global connectivity.
Even an illness like the common cold has widespread health and social impacts. Indeed, the Human Rhinovirus (HRV), the most frequent cause of colds, is believed to exacerbate asthma in about 70 percent of cases; and in Australia alone, the common cold costs employers around 1.5 million workdays, or $600m in lost productivity per year.
Yet despite selling more than $250m worth of remedies in Australia every year, we still know relatively little about the viruses responsible. By applying high performance computing (HPC) to antiviral research, we hope to not only devise more effective treatments but also set a new benchmark for understanding diseases.
Poland is one of the fastest-growing economies in Europe right now, and business and government leaders are determined to stimulate growth through innovation. ICM, a research institute affiliated with the University of Warsaw, does its own research in everything from weather prediction to quantum computing but also provides computational power for other researchers throughout Poland. Here’s how ICM works:
By Richard Silberman, Writer/Researcher, IBM Communications
As a medical student in a large public hospital in New York City, Basit Chaudhry, M.D., first experienced one of the most vexing problems facing doctors today: How do you discover and deal with all the information that’s required to provide optimal care?
“So much of what doctors do today is about trying to figure out how to collect and aggregate all the necessary medical data,” Dr. Chaudhry said. “As I went further along in my training and practice it became more and more apparent to me that if we don’t solve this problem, it’s going to be difficult to build a better, more humane healthcare system.”
Just over 50 years ago, on February 20, US astronaut John Glenn blasted into space in his tiny Friendship 7 capsule. His three quick trips around the Earth made him the first American to orbit the planet.
A team of more than 70 IBMers headed by Arthur Cohen as manager of the IBM Space Computing Center in Washington, D.C., had developed the computing systems to manage the launch, orbit and reentry for NASA’s Mercury program. IBM systems manager Saul Gass watched the launch from a grandstand at Cape Canaveral. “Think about the time, 1962. This had never been done before,” says Gass, who is professor emeritus at the University of Maryland. “There was a man in the loop whose life depended on our calculations. It was a demonstration of real-time computing.”
Beginning in the mid-1940s and continuing after the Glenn flight, IBM’s scientists and engineers have contributed substantially to astronomy and manned space exploration, but, today, they’re entering an exciting new phase of discovery. IBM scientists in Zurich, Switzerland, and the Netherlands are working with the Netherlands Institute for Radio Astronomy (ASTRON) to develop a massively powerful computing system for harvesting a huge quantity of data gathered by the international Square Kilometre Array (SKA) radio telescope.
The project demonstrates once again the belief that major advances in human achievement and knowledge come through a combination of big bets and bold scientific inquiry.
By Richard Silberman, Writer/Researcher, IBM Communications
When Jason Hlady sees a computer that is turned on but not being used, just sitting there, idling away, he can’t help but think of the possibilities…
That dormant machine could, at that very moment, be running computations to help cure cancer or fight AIDS. It could be solving algorithms that might lead to clean water solutions, or reduce world hunger, or accelerate any number of other worthy research projects.
Hlady, a high performance computing coordinator at the University of Saskatchewan, wants to cut waste and tap the potential of idle computers across the university. To that end, he is leading the drive to get faculty and staff to connect to the World Community Grid — a global network that pools unused computing power and repurposes it for humanitarian research.
As leader of the university’s World Community Grid team, Hlady encourages colleagues to install software that connects their computers to the grid and runs research computations on the machines when they are on, but idle.
“When a computer sits idle, all that energy is just going up a smokestack,” Hlady said. “By joining the World Community Grid, we’re able to put otherwise wasted computing power to good use, helping solve some of the major problems facing our world today.”
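The workflow Hlady describes follows a simple volunteer-computing pattern: when a machine is idle, fetch a small unit of research work, crunch it locally, and report the result back to the grid. The sketch below is a toy illustration of that pattern only; it is not the actual World Community Grid client, and every function name in it is hypothetical.

```python
# Toy illustration of the volunteer-computing loop. All names are
# hypothetical; the real World Community Grid client is far more involved.
def machine_is_idle() -> bool:
    # Placeholder: a real client monitors user input and CPU activity.
    return True

def fetch_work_unit() -> dict:
    # Placeholder for downloading a small research task from the grid.
    return {"id": 1, "numbers": list(range(1_000))}

def crunch(unit: dict) -> int:
    # Stand-in for a real research computation (e.g. a protein simulation).
    return sum(unit["numbers"])

def report(unit_id: int, result: int) -> tuple:
    # Placeholder for uploading the finished result to the grid.
    return (unit_id, result)

if machine_is_idle():
    unit = fetch_work_unit()
    report(unit["id"], crunch(unit))
```

Because each work unit is small and independent, hundreds of thousands of ordinary machines can contribute without coordinating with one another, which is what makes the approach scale.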
By Richard Silberman, Writer/Researcher, IBM Communications
When Igor Jurisica started doing cancer research 11 years ago, he worked with about a dozen colleagues using a handful of scientific workstations in a small lab in Toronto, Canada.
How times have changed.
Today, Jurisica, a senior scientist at Princess Margaret Hospital, Ontario Cancer Institute, conducts his research with the help of nearly 300,000 people spread across 100 countries running his calculations on over 900,000 devices.
When we think of the systems that make up a smarter planet, what typically comes to mind are industries like manufacturing, transportation, energy, or banking. But there is another ‘industry’ that needs to become smarter. We might call it the humanitarian industry. That is, the system that creates a safety net to support society and is made up of philanthropies, social services, education organizations, NGOs and government agencies.
In many ways, this is the most human of all systems. So it is ironic to consider how Watson, a computing system, could help us solve civic, social and cultural challenges and make smarter humanitarian decisions. But Watson’s DeepQA technology presents new possibilities to do just that. Through private sector collaboration with nonprofits, Watson can become the next innovation to be used as a force for societal good.
I couldn’t help punning in the headline, but this new supercomputer at the U.S. Department of Energy’s Argonne National Laboratory has only a little in common with psychologist Wilhelm Reich’s supposedly energy-gathering orgone box. The new computer, called Mira, which Argonne announced earlier this week, will be used to help the Department of Energy identify new materials and chemistry that could improve national competitiveness.
Mira, when she’s installed next year, will be a 10-petaflop computer, meaning she’ll be capable of performing 10 quadrillion calculations a second. Argonne already has a supercomputer, Intrepid, based on an earlier version of IBM’s Blue Gene technology; it’s a half-petaflop machine. Here’s a comparison that gives you a good idea of what a lot more processing power can do for scientists: using the current generation of supercomputers operating worldwide, it takes about two years to run a simulation of how a human heart reacts to a new medicine. A 10-petaflop system would cut the wait time down to two days.
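The two-years-to-two-days comparison can be checked with back-of-envelope arithmetic, under the idealized assumption that simulation time scales inversely with sustained throughput (real codes rarely scale perfectly). The variable names below are illustrative.

```python
# Arithmetic implied by the article's comparison, assuming runtime
# scales inversely with sustained petaflops (an idealization).
baseline_days = 2 * 365   # ~two years on today's systems
mira_days = 2             # ~two days on a 10-petaflop system

speedup = baseline_days / mira_days        # 365x faster
# Under linear scaling, a 365x speedup from a 10-petaflop machine
# implies the baseline sustained throughput was roughly:
implied_baseline_petaflops = 10 / speedup  # ~0.027 PF
```

In other words, the comparison treats today's typical simulation platform as delivering well under a tenth of a petaflop of sustained throughput on this workload, which is why the jump to 10 petaflops is so dramatic.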
Argonne researcher Larry Curtiss hopes all this added computing horsepower will help him develop new materials that could stretch the miles traveled per charge on an electric car battery up to 500 miles, making EVs practical for many uses. “The new computer will help us create the next generation of batteries and make the United States more competitive,” says Curtiss.
Editor’s note: The following is a guest post from Dr. David Ferrucci, Principal Investigator, DeepQA/Watson, IBM
Just as IBM set its sights on defeating a chess Grandmaster with Deep Blue in 1997, the company’s scientists have developed a natural language processing, question-answering system named Watson (after company founder Thomas J. Watson, Sr.) to challenge two of the world’s trivia grand masters in a match to be aired on U.S. television from February 14-16, 2011.
Win or lose on national television, Watson will answer the immediate questions, “does it answer questions accurately?” and “does it answer questions quickly?” with a resounding “yes.”
Beyond excitement for the match itself, the team of IBM scientists is motivated by the possibilities that Watson’s breakthrough computing capabilities hold for building a smarter planet. Watson’s ability to understand the meaning and context of human language, and to rapidly process information to find precise answers to complex questions, holds enormous potential to transform how computers help people accomplish tasks in business and in their personal lives.
Watson will enable people to rapidly find specific answers to complex questions. The technology could be applied in areas such as healthcare, to diagnose patients more accurately; to improve online self-service help desks; to provide tourists and citizens with specific information about cities; to deliver prompt customer support by phone; and much more.
Like Deep Blue, Watson represents a major leap in the capacity of information technology systems to identify patterns, gain critical insight and enhance decision-making despite daunting complexity. But while Deep Blue was an amazing achievement in the application of compute power to a computationally well-defined and well-bounded game, Watson faces a challenge that is open-ended and defies the well-bounded mathematical formulation of a game like Chess. Watson has to operate in the near limitless, ambiguous and highly contextual domain of human language and knowledge.
Watson’s technology furthers IBM’s leadership in analytics solutions, which help organizations use the vast amount of information they collect to improve their business operations and service to their customers. Additionally, Watson harnesses IBM’s commercial POWER7 system, showcasing how IBM workload-optimized systems provide unmatched capabilities for processing thousands of simultaneous tasks at rapid speeds, once the realm of only scientific supercomputers.
Read more about the technology behind Watson at ibmwatson.com.