By Dr. Tom Corr
High performance computing was once the domain of big corporations, governments and universities. Not anymore. Global economic pressures to innovate and compete are intense, and small and medium-sized businesses (SMBs or SMEs), recognized as economic powerhouses around the world, are being ushered into the world of big data.
Thanks to an innovative and unprecedented partnership between Ontario Centres of Excellence, IBM, seven Ontario universities, the Province of Ontario and FedDev, high performance computing (HPC) resources and technical expertise are now available to small and medium-sized enterprises (SMEs) in Southern Ontario – businesses that are looking to expand their research capabilities.
Today we are pleased to announce that an additional 31 research projects have been added to this portfolio, enabling more than 20 Ontario SMEs to participate in this truly advantageous partnership.
By Hon. Gary Goodyear
Southern Ontario is a diverse region, containing some of Canada’s largest urban centres surrounded by vibrant rural communities and bordered by one of the world’s largest economies. From the Technology Triangle to the Golden Horseshoe, from manufacturing heartland to transport corridor, Canada’s most populous region is home to a dynamic business sector and world-class post-secondary and research institutions.
So, when our government was approached by the University of Toronto and Western University, as well as five other post-secondary institutions, to establish a research and commercialization partnership designed to address some of the world’s most complex issues, we listened to what they had to say.
By David Turek
A major challenge in cardiology is to predict who will die suddenly from ventricular arrhythmias – the most common cause of sudden cardiac death, which itself is the largest cause of natural death in the U.S.
Despite years of intense medical research, likely victims are hard to identify, and even when they are identified, no effective, low-cost therapies are available.
Mathematical models have the potential to provide insight into the mechanics of arrhythmias and sudden cardiac death, but we’ve never had the computational power necessary to make a model run even close to the speed of a real beating heart. Instead, researchers have been forced to work at low resolution, settle for short runs of at most a few beats, or wait hours for a single heartbeat.
By Richard Silberman, Writer/Researcher, IBM Communications
The next time an avian flu scare strikes — as it did in 2004 and likely will again — the world may be better prepared thanks to the work of Ruhong Zhou, research staff scientist and manager of the Soft Matter Theory and Simulation Group at IBM’s Thomas J. Watson Research Center.
Zhou and his team have been using an IBM Blue Gene supercomputer to anticipate genetic changes in the H5N1 influenza virus (commonly known as avian or bird flu) that might pose a serious threat to human health. Although H5N1 rarely infects the human population, when it does it has an extremely high mortality rate.
In a recent breakthrough, Zhou was able to computationally identify the single mutation in H5N1 that, should it occur, would prevent antibodies in our immune system from fighting off this deadly flu. Armed with this information, pharmaceutical companies could design a vaccine that would compensate for this mutation and allow people to develop the necessary antibodies to combat H5N1 if they contract it.
“By isolating and anticipating this mutation, we can be proactive in creating a vaccine before the next avian flu outbreak strikes — potentially saving lives and even helping prevent a global pandemic,” Zhou said.
Taking the guesswork out of vaccine design
Influenza can undergo various mutations over a short time period, so trying to predict exactly how a flu strain will mutate next is the first step in vaccine development. It is too costly and time-intensive, however, to do this type of upfront research by trial and error in a traditional lab setting, so Zhou uses computer simulations to do his work.
Blue Gene provides the computational power to rapidly and efficiently simulate mutations at the atomic level so scientists can now predict a mutation with great accuracy and take much of the guesswork out of vaccine design.
Zhou simulated over 100 single and double mutations of H5N1’s hemagglutinin (HA) protein on Blue Gene in order to pinpoint the single, antibody-suppressing mutation he sought. Using all of Blue Gene’s 8,000 processors, it took two days to model each mutation. By comparison, it would take 8,000 days — or 22 years — to run each model on a laptop or desktop computer with a dual CPU.
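The arithmetic behind that comparison is simple scaling, sketched below as a hypothetical back-of-the-envelope calculation. It uses only the figures quoted above and assumes, naively, that runtime scales linearly with processor count (real parallel codes carry communication overhead):

```python
# Back-of-the-envelope scaling for the mutation survey described above.
# Figures from the text: ~100 mutations, 2 days each on Blue Gene's
# 8,000 processors, versus a dual-CPU desktop. We assume (naively)
# that runtime scales linearly with processor count.

BLUE_GENE_PROCESSORS = 8_000
DESKTOP_PROCESSORS = 2
DAYS_PER_MUTATION_BLUE_GENE = 2
MUTATIONS = 100

# One mutation on a desktop: 2 days * (8,000 / 2) = 8,000 days.
days_per_mutation_desktop = (
    DAYS_PER_MUTATION_BLUE_GENE * BLUE_GENE_PROCESSORS / DESKTOP_PROCESSORS
)
years_per_mutation_desktop = days_per_mutation_desktop / 365

# The whole survey on Blue Gene: 100 mutations * 2 days = 200 days.
total_blue_gene_days = MUTATIONS * DAYS_PER_MUTATION_BLUE_GENE

print(f"Desktop, one mutation: {days_per_mutation_desktop:,.0f} days "
      f"(about {years_per_mutation_desktop:.0f} years)")
print(f"Blue Gene, all {MUTATIONS} mutations: {total_blue_gene_days} days")
```

Even under this idealized assumption, the gap shows why a survey of 100-plus mutations is only practical on a massively parallel machine.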
“We could have never done our research without Blue Gene,” said Zhou, who has a Ph.D. in chemistry from Columbia University, where he currently teaches graduate level courses. “High performance computing of this sort is enabling a new era of breakthroughs in life science and holds great promise for advances in personalized medicine as well.”
A proactive approach to preventing pandemics
For Zhou, who recently published his findings in Biophysical Journal, this breakthrough is particularly meaningful because of the real promise it holds for public health.
“As scientists, we often do some basic research just for our own curiosity — and achieving the results is gratification enough,” Zhou said. “But this is not just for our own interest; this is something very, very important to human society.”
Along with his avian flu research, Zhou has been using Blue Gene for the past six years to model genetic variations and predict mutations in other influenza strains, including swine flu (H1N1) and Hong Kong flu (H3N2). Zhou hopes the ability to anticipate mutations will prompt the medical community to start preparing preemptive vaccines well ahead of flu outbreaks, rather than responding after the fact (and after lives have been lost), which is the usual practice.
“We need to move from a reactive model of vaccine development to a proactive one,” Zhou said. “Our ability to accurately predict what mutations will happen next should give pharmaceutical companies the confidence to invest in vaccine production early enough to mount a strong defense against a virus and prevent a pandemic.”
Partnerships with government agencies like the Centers for Disease Control (CDC) and with pharmaceutical companies that want to use Zhou’s research to guide vaccine design are essential to realizing the full potential of Zhou’s work.
“With the right funding model and partnerships we can continue to explore influenza strains as well as other infectious diseases, such as HIV,” Zhou said. “I firmly believe that together we can develop better vaccines that will have a profound impact on society’s health and well-being.”
The dog days of summer are upon us. For some of us that means more carefree days of fun in the sun, but many of us associate the summer months with severe and sometimes unusual weather events. These events impose growing economic and societal costs, and they can leave us wondering what other weather-related surprises could be around the corner.
According to a recent report released by NOAA, the 12-month period from July 2011 to June 2012 was the warmest on record for the contiguous United States (since record keeping began in 1895). To make matters worse, these blistering heat waves are expected to increase due to a changing climate. In a recent publication, a group of Stanford University scientists predicted that certain regions of the world could start to see “permanently” hotter summers in just a couple of decades.
It is important to note that a warmer atmosphere contains more energy and can hold more moisture, which can lead to greater volatility and intensity of extreme weather events. Hence, it may be no coincidence that each day seems to bring a new headline about another weather-related disturbance. These events are to blame for countless fatalities and billions of dollars annually in property damage and loss.
Deep Thunder, IBM’s high resolution weather modeling technology, provides the granularity needed to improve preparedness. Using both historical and near real-time data, sophisticated analytics software and ever more powerful supercomputers, we can get extremely accurate weather forecasts and the impacts of severe events for specific locations (less than a mile) up to three days in advance.
Just recently, a “derecho” – a type of severe windstorm that is rare in the eastern half of the US – blew through the Midwest and Mid-Atlantic. Not only did it catch local authorities by surprise, it left millions to deal with sweltering temperatures without power or air conditioning.
Of course, such high temperatures over long periods have been linked to drought, which this year alone will likely cost the US economy tens of billions of dollars. Heat waves also take a toll on individual health and increase the likelihood of wildfires. So far in 2012, fires have burned about 2.4 million acres, according to the National Interagency Fire Center, and the outlook for the remainder of the summer is grim.
And while we haven’t been hit with any major hurricanes so far this year, we are just six weeks into a six-month hurricane season. Only twice before since such records have been kept (1887 and 1908) have two tropical storms formed before the season’s official June 1 start, and 2012 is the first year in which four named storms had formed by June 23. Measured by storm strength and duration, this is the highest level of early-season tropical activity since 1968, so we could see some significant events later this season.
In an ideal world, we would be better prepared for the potential impact of extreme weather events. Traditional weather forecasts provide a look into broad weather patterns, but given the technology that is available to us today, we can do even better.
Deep Thunder can provide longer advance notice of adverse weather conditions, allowing more time for disaster prevention. Rather than merely monitoring a storm, we can stage resources at the right place and time prior to an event to minimize the impact and save lives. For example, we could have provided a detailed, 18-hour warning for the derecho that impacted Maryland, Virginia and the District of Columbia late last month.
The practical applications for such specific and timely forecasts are nearly limitless. Detailed weather forecasts could help utilities better prepare for demands on the grid, such as pockets of high load during a heat wave, and anticipate conditions that could cause power outages so repair crews can be deployed proactively. Firefighters can anticipate the direction of a wildfire to prevent further spread. Cities can plan more effective responses to heat waves, protecting their citizens from extreme conditions such as loss of power, and can mitigate the impact of flooding. Highway patrols can even anticipate traffic buildup and redirect vehicles during evacuations. In all of these cases, Deep Thunder uses a coupled-modeling approach: advanced weather modeling focused on a specific area of importance, connected to sophisticated analytical techniques, to predict and visualize the impacts of weather on businesses and citizens.
The weather affects much of our daily lives — everything from sports to produce prices — and although we don’t have the technology to change it, at the very least, we can better plan for it.
We live in an increasingly interconnected world where information, goods and people flow between geographical regions with unprecedented porosity. As what is essentially a packet of biochemical information, viruses are no different; with today’s ubiquity of cross-border transportation, their transmission can take place faster over a greater area than ever before. Unlocking the mechanisms of these viruses is of growing importance for both human wellbeing and our global connectivity.
Even an illness like the common cold has widespread health and social impacts. Indeed, the Human Rhinovirus (HRV), the most frequent cause of colds, is believed to exacerbate asthma in about 70 percent of cases; and in Australia alone, the common cold costs employers around 1.5 million lost workdays, or $600m in lost productivity, per year.
Yet despite the more than $250m worth of cold remedies sold in Australia every year, we still know relatively little about the viruses responsible. By applying high performance computing (HPC) to antiviral research, we hope not only to devise more effective treatments but also to set a new benchmark for understanding diseases.
Poland is one of the fastest-growing economies in Europe right now, and business and government leaders are determined to stimulate growth through innovation. ICM, a research institute affiliated with the University of Warsaw, does its own research in everything from weather prediction to quantum computing but also provides computational power for other researchers throughout Poland. Here’s how ICM works:
By Richard Silberman, Writer/Researcher, IBM Communications
As a medical student in a large public hospital in New York City, Basit Chaudhry, M.D., first experienced one of the most vexing problems facing doctors today: How do you discover and deal with all the information that’s required to provide optimal care?
“So much of what doctors do today is about trying to figure out how to collect and aggregate all the necessary medical data,” Dr. Chaudhry said. “As I went further along in my training and practice it became more and more apparent to me that if we don’t solve this problem, it’s going to be difficult to build a better, more humane healthcare system.”
Just over 50 years ago, on February 20, US astronaut John Glenn blasted into space in his tiny Friendship 7 capsule. His three quick trips around the Earth made him the first American to orbit the planet.
A team of more than 70 IBMers, headed by Arthur Cohen as manager of the IBM Space Computing Center in Washington, D.C., had developed the computing systems to manage the launch, orbit and reentry for NASA’s Mercury program. IBM systems manager Saul Gass watched the launch from a grandstand at Cape Canaveral. “Think about the time, 1962. This had never been done before,” says Gass, who is professor emeritus at the University of Maryland. “There was a man in the loop whose life depended on our calculations. It was a demonstration of real-time computing.”
IBM’s scientists and engineers began contributing substantially to astronomy and manned space exploration in the mid-1940s, and their work continued long after the Glenn flight. Today, they’re entering an exciting new phase of discovery. IBM scientists in Zurich, Switzerland, and the Netherlands are working with the Netherlands Institute for Radio Astronomy (ASTRON) to develop a massively powerful computing system for harvesting the huge quantity of data gathered by the international Square Kilometre Array (SKA) radio telescope.
The project demonstrates once again the belief that major advances in human achievement and knowledge come through a combination of big bets and bold scientific inquiry.
By Richard Silberman, Writer/Researcher, IBM Communications
When Jason Hlady sees a computer that is turned on but not being used, just sitting there, idling away, he can’t help but think of the possibilities…
That dormant machine could, at that very moment, be running computations to help cure cancer or fight AIDS. It could be running algorithms that might lead to clean water solutions, reduce world hunger, or accelerate any number of other worthy research projects.
Hlady, a high performance computing coordinator at the University of Saskatchewan, wants to cut waste and tap the potential of idle computers across the university. To that end, he is leading the drive to get faculty and staff to connect to the World Community Grid — a global network that pools unused computing power and repurposes it for humanitarian research.
As leader of the university’s World Community Grid team, Hlady encourages colleagues to install software that connects their computers to the grid and runs research computations on the machines when they are on, but idle.
“When a computer sits idle, all that energy is just going up a smokestack,” Hlady said. “By joining the World Community Grid, we’re able to put otherwise wasted computing power to good use, helping solve some of the major problems facing our world today.”
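The behavior Hlady describes, computing only while the host machine is otherwise idle, can be sketched in miniature. This toy is purely illustrative: the idle threshold, the work-unit function and every name below are invented for the example, and the real World Community Grid client additionally handles checkpointing, result validation and scheduling.

```python
# Toy sketch of a volunteer-computing client: queue up work units,
# compute only while the host looks idle, then collect the results.
# (Illustrative only; not the actual World Community Grid software.)

from collections import deque

def host_is_idle(cpu_load: float, threshold: float = 0.1) -> bool:
    """Pretend idle-detection: low CPU load means the owner isn't using it."""
    return cpu_load < threshold

def run_work_unit(n: int) -> int:
    """Stand-in for a real research computation (here: a trivial sum)."""
    return sum(range(n))

def volunteer_loop(work_units, cpu_load_samples):
    """Process queued work units, but only during idle load samples."""
    queue = deque(work_units)
    results = []
    for load in cpu_load_samples:
        if queue and host_is_idle(load):
            results.append(run_work_unit(queue.popleft()))
    return results, list(queue)

# The machine is busy for the first two samples, then idle for three.
done, remaining = volunteer_loop([10, 20, 30], [0.9, 0.7, 0.05, 0.02, 0.01])
```

In this run, no work happens during the two busy samples and all three units complete during the idle ones; the real client applies the same principle continuously in the background.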