By Manoj Saxena
Social and technological shifts are driving rapid change, altering ways in which individuals interact with one another, learn, and attend to their personal and business needs. These shifts offer the potential to strengthen the relationships between companies and their customers—enabling more individual and directed communication and allowing organizations to cater to individual needs. Yet, for many, today’s online customer experiences lack personalization, timeliness and trust.
But what if companies could offer their customers the kind of personalized and knowledgeable assistance when they’re online or on the phone that people have come to expect from top-flight customer service delivered in person? Continue Reading »
By Miles Nosler
Over the last few years, whenever I saw an IBM Smarter Planet commercial on television, I wondered what was behind things like Smarter Transportation, Smarter Cities, and Smarter Commerce.
Since then I’ve come to understand what the Smarter Planet concept is about – tackling big issues with smarter, interconnected technologies to improve the way we live and work. But it didn’t truly sink in until I started crunching some Big Data with an IBM mainframe. Let me explain.
If someone told me I would take the top spot among 4,600 very smart students competing in IBM’s Master the Mainframe contest, I wouldn’t have believed it. But that’s exactly what I did, and now I have in-demand technical skills on my resume that are landing me job interviews. Continue Reading »
By Matthias Kaiserswerth
Steve Jobs famously lured John Sculley from a soda pop company to Apple in 1983 by saying, “Do you want to sell sugared water for the rest of your life? Or do you want to come with me and change the world?” In today’s business environment, the comparable challenge to a young engineer or computer scientist would be: “Do you want to create the next mobile app that makes your friends look like zombies or do you want to help transform the world of computing?”
That, in fact, is the challenge that we’re issuing today. IBM and ASTRON, the Netherlands Institute for Radio Astronomy, have assembled what some call a dream team of scientists to create a next-generation computing system capable of handling the ultimate big data challenge. Our project, called DOME, is a system for handling the deluge of data that will be created by the Square Kilometre Array, a radio telescope made up of more than half a million individual antennas that are to be scattered across southern Africa and Australia. When the SKA is completed in 2024, it is expected to process 14 exabytes of raw data per day. The data collected by the SKA in a single day would take nearly two million years to play back on an iPod.
We’re in the process of recruiting more than a half-dozen PhD-level students to help staff the project, and we’re staging a virtual job fair to engage prospective employees. If you’re interested and qualified, visit the job fair Web site on March 26 at 5 p.m. Central European Time (Noon US Eastern Time). Only top students with huge ambitions should apply.
By John Kelly
A few weeks ago, I shared a dinner table in Johannesburg with Adrian Tiplady, one of the managers of SKA South Africa, the organization overseeing the country’s involvement in the Square Kilometre Array astronomy project. The SKA is one of the most ambitious science efforts ever launched. The goal of the 10 countries involved is to decipher radio waves from deep space in order to solve the riddles of the universe and the nature of matter. Yet something Adrian told me totally blew my mind: he said the computing challenges posed by the SKA are just as great as those related to astronomy.
It’s gratifying when scientists from other domains come together to push computing and computer science forward. And it’s even more gratifying when organizations like Tiplady’s form partnerships with IBM to bring cutting-edge technologies to bear on the most demanding tasks ever dreamed up by humans. Today, SKA South Africa announced that it has joined IBM and ASTRON, the Netherlands Institute for Radio Astronomy, in a multi-year public-private partnership funded primarily by the Dutch government aimed at developing an information technology system for harvesting insights from the SKA’s data.
By Dr. Lora Ramunno
The study of the interaction between light and matter on the nanoscale (a nanometre is about one billionth of a metre) is revolutionizing many areas of science and technology. Powerful applications can be designed, for example, to capture real-time images of live cells, tissues and biological processes, or to help manufacture extremely small devices that can be used in diverse areas including telecommunications, computation and biotechnology.
These applications hold the potential to significantly improve early detection of disease or provide a better understanding of biological processes at a cellular level, as well as to identify hidden insights that can help companies move into newer and smarter manufacturing in the high technology market. Continue Reading »
By Dr. James Hendler
Every single student in the Department of Computer Science here at Rensselaer Polytechnic Institute has the potential to revolutionize computing. But with the arrival of Watson at Rensselaer, they’re even better positioned to do so.
Watson has caused the researchers in my field of artificial intelligence (AI) to rethink some of our basic assumptions. Watson’s cognitive computing is a breakthrough technology, and it’s exciting to be here at Rensselaer, which will be the first university to get its hands on this remarkable system.
With 90 percent of the world’s data generated in the past two years, making sense of it all has become increasingly difficult for people and even for traditional computing systems. The addition of Watson to our campus is very timely given the growth of what some have termed “Big Data.”
In 1976, Joseph Weizenbaum, a leading computer scientist, wrote a book called Computer Power and Human Reason: From Judgement to Calculation, in which he criticized the field of AI for trying to replace human creativity and thought with the power of computers. He suggested that humans and computers were inherently different, and that trying to get computers to think like humans was an insurmountable task, if it was possible at all. Continue Reading »
By Dr. Tom Corr
High performance computing was once the domain of big corporations, governments and universities. But not anymore. Global economic pressures to innovate and compete are intense, and small and medium-sized enterprises (SMEs), recognized as economic powerhouses around the world, are being ushered into the world of big data.
Thanks to an innovative and unprecedented partnership between Ontario Centres of Excellence, IBM, seven Ontario universities, the Province of Ontario and FedDev, high performance computing (HPC) resources and technical expertise are now available to SMEs in Southern Ontario – businesses that are looking to expand their research capabilities.
Today we are pleased to announce that an additional 31 research projects have been added to this portfolio, enabling more than 20 Ontario SMEs to participate in this truly advantageous partnership. Continue Reading »
By Hon. Gary Goodyear
Southern Ontario is a diverse region, containing some of Canada’s largest urban centres surrounded by vibrant rural communities and bordered by one of the world’s largest economies. From our technology triangle to the Golden Horseshoe, from manufacturing heartland to transport corridor, Canada’s most populous region is home to a dynamic business sector and world-class post-secondary and research institutions.
So, when our government was approached by the University of Toronto and Western University, as well as five other post-secondary institutions, to establish a research and commercialization partnership designed to answer some of the world’s most complex issues, we listened to what they had to say. Continue Reading »
By David Turek
A major challenge in cardiology is to predict who will die suddenly from ventricular arrhythmias – the most common cause of sudden cardiac death, which itself is the largest cause of natural death in the U.S.
Despite years of intense medical research, likely victims are hard to predict, and even when they are identified, there are no effective, low-cost therapies available.
Mathematical models have the potential to provide insight into the mechanics of arrhythmias and sudden cardiac death, but we’ve never had the computational power necessary to make a model run even close to the speed of a real beating heart. Instead, researchers have been forced to work at low resolution, settle for short run times of – at most – a few beats, or take hours to simulate a single heartbeat.
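To give a sense of why heart simulation is so demanding, here is a minimal sketch using the FitzHugh-Nagumo equations, a classic two-variable caricature of cardiac cell excitation. This is an illustrative toy, not the researchers’ actual model: even this simplified single cell needs thousands of small time steps per “beat,” and a realistic whole-heart simulation couples millions of such cells in 3D.

```python
def simulate_fhn(steps=20000, dt=0.01, I_ext=0.5, a=0.7, b=0.8, tau=12.5):
    """Integrate the FitzHugh-Nagumo model with forward Euler.

    v is a caricature of the cell's membrane potential; w is a slow
    recovery variable. With sustained input current I_ext the model
    fires repeatedly, like a pacing cardiac cell.
    """
    v, w = -1.0, 1.0
    trace = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I_ext   # fast excitation dynamics
        dw = (v + a - b * w) / tau      # slow recovery dynamics
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace

trace = simulate_fhn()
# The trace oscillates: v repeatedly spikes above +1 (excited)
# and drops below -1 (recovering), beat after beat.
```

Scaling this up hints at the problem: a whole-heart model replaces these two equations with dozens of ionic-current equations per cell, coupled by diffusion across millions of grid points, which is why simulating a single heartbeat at full resolution can take hours on conventional hardware.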
By Richard Silberman, Writer/Researcher, IBM Communications
The next time an avian flu scare strikes — as it did in 2004 and likely will again — the world may be better prepared thanks to the work of Ruhong Zhou, research staff scientist and manager of the Soft Matter Theory and Simulation Group at IBM’s Thomas J. Watson Research Center.
Zhou and his team have been using an IBM Blue Gene supercomputer to anticipate genetic changes in the H5N1 influenza virus (commonly known as avian or bird flu) that might pose a serious threat to human health. Although H5N1 rarely infects the human population, when it does it has an extremely high mortality rate.
In a recent breakthrough, Zhou was able to computationally identify the single mutation in H5N1 that, should it occur, would prevent antibodies in our immune system from fighting off this deadly flu. Armed with this information, pharmaceutical companies could design a vaccine that would compensate for this mutation and allow people to develop the necessary antibodies to combat H5N1 if they contract it.
“By isolating and anticipating this mutation, we can be proactive in creating a vaccine before the next avian flu outbreak strikes — potentially saving lives and even helping prevent a global pandemic,” Zhou said.
Taking the guesswork out of vaccine design
Influenza can undergo various mutations over a short time period, so trying to predict exactly how a flu strain will mutate next is the first step in vaccine development. It is too costly and time-intensive, however, to do this type of upfront research by trial and error in a traditional lab setting, so Zhou uses computer simulations to do his work.
Blue Gene provides the computational power to rapidly and efficiently simulate mutations at the atomic level so scientists can now predict a mutation with great accuracy and take much of the guesswork out of vaccine design.
Zhou simulated over 100 single and double mutations of H5N1’s hemagglutinin (HA) protein on Blue Gene in order to pinpoint the single, antibody-suppressing mutation he sought. Using all of Blue Gene’s 8,000 processors, it took two days to model each mutation. By comparison, it would take 8,000 days — or 22 years — to run each model on a laptop or desktop computer with a dual CPU.
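The comparison above can be checked with back-of-envelope arithmetic. The sketch below assumes near-linear scaling across processors (real codes rarely scale perfectly, so treat this as illustrative, not a benchmark):

```python
# Figures from the article: each mutation took two days on all
# 8,000 Blue Gene processors; a laptop has a dual CPU.
BLUE_GENE_PROCESSORS = 8000
DAYS_PER_MUTATION_ON_BLUE_GENE = 2
LAPTOP_CPUS = 2

# Total work per mutation, measured in processor-days.
processor_days = BLUE_GENE_PROCESSORS * DAYS_PER_MUTATION_ON_BLUE_GENE

# The same work spread across a dual-CPU laptop.
laptop_days = processor_days / LAPTOP_CPUS
laptop_years = laptop_days / 365

print(laptop_days, round(laptop_years))  # prints: 8000.0 22
```

That recovers the article’s numbers: roughly 8,000 days, or about 22 years, per mutation on a laptop.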
“We could have never done our research without Blue Gene,” said Zhou, who has a Ph.D. in chemistry from Columbia University, where he currently teaches graduate level courses. “High performance computing of this sort is enabling a new era of breakthroughs in life science and holds great promise for advances in personalized medicine as well.”
A proactive approach to preventing pandemics
For Zhou, who recently published his findings in Biophysical Journal, this breakthrough is particularly meaningful because of the real promise it holds for public health.
“As scientists, we often do some basic research just for our own curiosity — and achieving the results is gratification enough,” Zhou said. “But this is not just for our own interest; this is something very, very important to human society.”
Along with his avian flu research, Zhou has been using Blue Gene for the past six years to model genetic variations and predict mutations in other influenza strains, including swine flu (H1N1) and Hong Kong flu (H3N2). Zhou hopes the ability to anticipate mutations will prompt the medical community to start preparing preemptive vaccines well ahead of flu outbreaks, rather than responding after the fact (and after lives have been lost), which is the usual practice.
“We need to move from a reactive model of vaccine development to a proactive one,” Zhou said. “Our ability to accurately predict what mutations will happen next should give pharmaceutical companies the confidence to invest in vaccine production early enough to mount a strong defense against a virus and prevent a pandemic.”
Partnerships with government agencies like the Centers for Disease Control and Prevention (CDC) and with pharmaceutical companies that want to use Zhou’s research to guide vaccine design are essential to realizing the full potential of this work.
“With the right funding model and partnerships we can continue to explore influenza strains as well as other infectious diseases, such as HIV,” Zhou said. “I firmly believe that together we can develop better vaccines that will have a profound impact on society’s health and well-being.”