By Steve Hamm
Beware the pistol shrimp. It stuns small sea creatures with a gun-like claw that fires powerful clouds of bubbles at its prey. The scientific principle that gives the pistol shrimp its mini-superhero powers could also prove valuable to humans–in uses ranging from improving the designs of propellers to helping doctors destroy kidney stones and cancerous tumors. A global collaboration involving IBM scientists, researchers at two European universities and the US Lawrence Livermore National Laboratory could help accelerate the journey of this science into the marketplace.
The multidisciplinary team used one of the world’s fastest supercomputers to simulate the behavior of clouds of bursting bubbles–handling the highly complex fluid dynamics problem with remarkable efficiency. In the process, they set a new record in supercomputing in fluid dynamics, and as a result the team on Nov. 21 won the coveted Gordon Bell Prize from the Association for Computing Machinery.
Alessandro Curioni, head of mathematical and computational sciences at IBM Research – Zurich, described the adrenaline rush of working on the project. The team ran into one problem after another, and it required a diverse set of skills to solve them. The excitement peaked last April when the team–working around the clock for one week–demonstrated their breakthrough on Lawrence Livermore’s Sequoia computer. Scattered over half the globe, they kept in touch constantly with email, telephones and Skype. “A single group could not have accomplished this. We needed a wide variety of skills. It’s a great example of open collaboration,” he says. Continue Reading »
By Manoj Saxena
Social and technological shifts are driving rapid change, altering ways in which individuals interact with one another, learn, and attend to their personal and business needs. These shifts offer the potential to strengthen the relationships between companies and their customers—enabling more individual and directed communication and allowing organizations to cater to individual needs. Yet, for many, today’s online customer experiences lack personalization, timeliness and trust.
But what if companies could offer their customers the kind of personalized and knowledgeable assistance when they’re online or on the phone that people have come to expect from top-flight customer service delivered in person? Continue Reading »
By Miles Nosler
Over the last few years, whenever I saw an IBM Smarter Planet commercial on television, I wondered what was behind things like Smarter Transportation, Smarter Cities and Smarter Commerce.
Since then I’ve come to understand what the Smarter Planet concept is about – tackling big issues with smarter, interconnected technologies to improve the way we live and work. But it didn’t truly sink in until I started crunching some Big Data with an IBM mainframe. Let me explain.
If someone told me I would take the top spot among 4,600 very smart students competing in IBM’s Master the Mainframe contest, I wouldn’t have believed it. But that’s exactly what I did, and now I have in-demand technical skills on my resume that are landing me job interviews. Continue Reading »
By Matthias Kaiserswerth
Steve Jobs famously lured John Sculley from a soda pop company to Apple in 1983 by saying, “Do you want to sell sugared water for the rest of your life? Or do you want to come with me and change the world?” In today’s business environment, the comparable challenge to a young engineer or computer scientist would be: “Do you want to create the next mobile app that makes your friends look like zombies or do you want to help transform the world of computing?”
That, in fact, is the challenge that we’re issuing today. IBM and ASTRON, the Netherlands Institute for Radio Astronomy, have assembled what some call a dream team of scientists to create a next-generation computing system capable of handling the ultimate big data challenge. Our project, called DOME, is a system for handling the deluge of data that will be created by the Square Kilometre Array, a radio telescope made up of more than half a million individual antennas that are to be scattered across southern Africa and Australia. When the SKA is completed in 2024, it is expected to process 14 exabytes of raw data per day. The data collected by the SKA in a single day would take nearly two million years to play back on an iPod.
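The iPod comparison can be sanity-checked with some back-of-the-envelope arithmetic. Note that the playback bitrate below is my own assumption for illustration, not a figure from the DOME project; at roughly 1.8 megabits per second of continuous playback, 14 exabytes does indeed work out to about two million years:

```python
# Back-of-the-envelope check of the "two million years on an iPod" claim.
# The effective playback bitrate is an assumption, chosen for illustration.
EXABYTE = 10**18                    # bytes
daily_data_bytes = 14 * EXABYTE     # raw data per day, per the SKA estimate
playback_bps = 1.8e6                # assumed playback rate in bits/second

seconds = daily_data_bytes * 8 / playback_bps
years = seconds / (365.25 * 24 * 3600)
print(f"Playback time: {years / 1e6:.1f} million years")  # → about 2.0
```

A lower audio-only bitrate would stretch the figure further; the point stands either way that a single day of SKA data dwarfs any consumer device.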
We’re in the process of recruiting more than a half-dozen PhD-level students to help staff the project–and we’re staging a virtual job fair to engage prospective employees. If you’re interested and qualified, visit the job fair Web site on March 26 at 5 p.m. Central European Time (noon US Eastern Time). Only top students with huge ambitions should apply.
By John Kelly
A few weeks ago, I shared a dinner table in Johannesburg with Adrian Tiplady, one of the managers of SKA South Africa, the organization overseeing the country’s involvement in the Square Kilometre Array astronomy project. The SKA is one of the most ambitious science efforts ever launched. The goal of the 10 countries involved is to decipher radio waves from deep space in order to solve the riddles of the universe and the nature of matter. Yet something Adrian told me totally blew my mind: he said the computing challenges posed by the SKA are just as great as those of the astronomy itself.
It’s gratifying when scientists from other domains come together to push computing and computer science forward. And it’s even more gratifying when organizations like Tiplady’s form partnerships with IBM to bring cutting-edge technologies to bear on the most demanding tasks ever dreamed up by humans. Today, SKA South Africa announced that it has joined IBM and ASTRON, the Netherlands Institute for Radio Astronomy, in a multi-year public-private partnership, funded primarily by the Dutch government, aimed at developing an information technology system for harvesting insights from the SKA’s data.
By Dr. Lora Ramunno
The study of the interaction between light and matter on the nanoscale (a nanometre is one billionth of a metre) is revolutionizing many areas of science and technology. Powerful applications can be designed, for example, to capture real-time images of live cells, tissues and biological processes or to help manufacture extremely small devices that can be used in diverse areas including telecommunications, computation and biotechnology.
These applications hold the potential to significantly improve early detection of disease or provide a better understanding of biological processes at a cellular level, as well as to identify hidden insights that can help companies move into newer and smarter manufacturing in the high technology market. Continue Reading »
By Dr. James Hendler
Every single student in the Department of Computer Science here at Rensselaer Polytechnic Institute has the potential to revolutionize computing. But with the arrival of Watson at Rensselaer, they’re even better positioned to do so.
Watson has caused the researchers in my field of artificial intelligence (AI) to rethink some of our basic assumptions. Watson’s cognitive computing is a breakthrough technology, and it’s thrilling to be here at Rensselaer, where we will be the first university to get our hands on this amazing system.
With 90 percent of the world’s data generated in the past two years, making sense of it all has become a daunting task for people and even for traditional computing systems. The addition of Watson to our campus is very timely considering the growth of what some have termed “Big Data.”
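The 90-percent figure implies startlingly fast growth: if nine-tenths of all data appeared in the last two years, the total grew roughly tenfold over that window. Treating the growth as smoothly exponential (an idealization, not a claim from any study), the implied doubling time can be sketched as:

```python
import math

# If 90% of all data was created in the last 2 years, the total is now
# ~10x what it was then. Assuming smooth exponential growth (an
# idealization), the implied doubling time follows directly:
growth_factor = 10     # total today / total two years ago
window_years = 2
doubling_time_years = window_years * math.log(2) / math.log(growth_factor)
print(f"Implied doubling time: {doubling_time_years * 12:.1f} months")  # → 7.2
```

In other words, under this reading the world’s data stock doubles roughly every seven months–well beyond what manual analysis can keep pace with.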
In 1976, Joseph Weizenbaum, a leading computer scientist, wrote a book called Computer Power and Human Reason: From Judgment to Calculation, in which he criticized the field of AI for trying to replace human creativity and thought with the power of computers. He suggested that humans and computers were inherently different, and that trying to get computers to think like humans was an insurmountable task, if it was possible at all. Continue Reading »
By Dr. Tom Corr
High performance computing was once the domain of big corporations, governments and universities. But not anymore. Global economic pressures to innovate and compete are intense, and small and medium-sized businesses (SMBs or SMEs), recognized as economic powerhouses around the world, are being ushered into the world of big data.
Thanks to an innovative and unprecedented partnership between Ontario Centres of Excellence, IBM, seven Ontario universities, the Province of Ontario and FedDev, high performance computing (HPC) resources and technical expertise are now available to small and medium-sized enterprises (SMEs) in Southern Ontario – businesses that are looking to expand their research capabilities.
Today we are pleased to announce that an additional 31 research projects have been added to this portfolio, enabling more than 20 Ontario SMEs to participate in this truly advantageous partnership. Continue Reading »
By Hon. Gary Goodyear
Southern Ontario is a diverse region, containing some of Canada’s largest urban centres surrounded by vibrant rural communities and bordered by one of the world’s largest economies. From our Technology Triangle to the Golden Horseshoe, from manufacturing heartland to transport corridor, Canada’s most populous region is home to a dynamic business sector and world-class post-secondary and research institutions.
So, when our government was approached by the University of Toronto and Western University, as well as five other post-secondary institutions, to establish a research and commercialization partnership designed to answer some of the world’s most complex issues, we listened to what they had to say. Continue Reading »
By David Turek
A major challenge in cardiology is to predict who will die suddenly from ventricular arrhythmias – the most common cause of sudden cardiac death, which itself is the largest cause of natural death in the U.S.
Despite years of intense medical research, likely victims are hard to identify, and even when they are identified, effective, low-cost therapies are not available.
Mathematical models have the potential to provide insight into the mechanics of arrhythmias and sudden cardiac death, but we’ve never had the computational power necessary to make a model run even close to the speed of a real beating heart. Instead, researchers have been forced to work at low resolution, settle for short run times of at most a few beats, or accept hours of computing time for a single heartbeat.
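To make the scale of the problem concrete, here is a minimal sketch of a single excitable cell using the classic FitzHugh-Nagumo equations, stepped with forward Euler. This is an illustration of why the computation is expensive, not the researchers’ actual model: a whole-heart simulation repeats work like this for hundreds of millions of mesh points, with far more detailed ionic equations and diffusion coupling between points, at every tiny time step.

```python
# Minimal FitzHugh-Nagumo excitable-cell sketch (illustrative only; real
# cardiac models use detailed ionic currents on a 3-D anatomical mesh).
def fitzhugh_nagumo(steps=100_000, dt=0.01, a=0.7, b=0.8, eps=0.08, I=0.5):
    v, w = -1.0, -0.5            # membrane potential and recovery variable
    trace = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I        # fast voltage dynamics
        dw = eps * (v + a - b * w)       # slow recovery dynamics
        v += dt * dv                     # forward Euler update
        w += dt * dw
        trace.append(v)
    return trace

trace = fitzhugh_nagumo()
# With this stimulus current the cell fires repeatedly, so v swings
# between strongly negative and positive values over the run.
print(f"v range: [{min(trace):.2f}, {max(trace):.2f}]")
```

Even this toy cell needs 100,000 Euler steps to cover its simulated interval; multiply by the state variables, spatial points and coupling terms of a realistic heart model and the gap between wall-clock time and a real heartbeat becomes clear.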