By Mark Ritter
In 1981, Nobel Prize winner Richard Feynman challenged computer scientists to develop a new breed of computer based on quantum physics. Ever since, scientists have been grappling with how to meet that grand challenge.
Employing quantum physics for computation is difficult in part because quantum information is extremely fragile: the quantum elements must be cooled to near absolute zero and shielded from electromagnetic radiation to minimize errors. This is so different from our current approach to computation that the entire infrastructure of computing must be re-imagined and re-engineered.
Still, the challenges haven’t stopped physicists and computer scientists from trying, and an enormous amount of progress is being made. In fact, I believe we’re entering what will come to be seen as the golden age of quantum computing research.
By Anne Altman
While the National Oceanic and Atmospheric Administration (NOAA) was officially created in 1970, its roots go back more than 200 years. The agencies that came together to form NOAA are among the oldest in the federal government. So much history, so much research, so much science, so much data…so little time.
Every day, NOAA gathers more than 20 terabytes of data from Doppler radar systems, weather satellites, buoy networks and stations, tide gauges, real-time weather stations, ships and aircraft. That is more than twice the data contained in the United States Library of Congress – every day. Yes, data is our greatest natural resource, but like any natural resource, it is only useful if it can be refined. Continue Reading »
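That comparison can be sanity-checked with rough numbers. A minimal sketch, assuming the often-quoted (and much-debated) estimate of roughly 10 terabytes for the Library of Congress's digitized print collection:

```python
# Rough sanity check of the Library of Congress comparison.
# Assumption: ~10 TB is a commonly cited estimate for the Library of
# Congress's print collection; estimates vary widely.
NOAA_DAILY_TB = 20      # terabytes NOAA gathers per day (from the post)
LOC_ESTIMATE_TB = 10    # assumed size of the Library of Congress collection

ratio = NOAA_DAILY_TB / LOC_ESTIMATE_TB
print(f"NOAA's daily intake is about {ratio:.0f}x the Library of Congress")
```

With those assumptions, the "more than twice" figure checks out; a different Library of Congress estimate would of course change the ratio.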
By Jack Wells
Here at the Oak Ridge Leadership Computing Facility (OLCF) in East Tennessee, deploying the next top supercomputer for open science is akin to an ambitious hike in the Smoky Mountains: once one towering crest is reached, the next one appears within sight.
Just 18 months after the OLCF brought Titan—then the fastest supercomputer in the world—to full operation for users in May 2013, we announced a contract with IBM to create the next big machine: Summit.
Summit will expand on Titan’s groundbreaking hybrid architecture to deliver several times the computational power of the 27-petaflop Titan. Continue Reading »
By Maria B. Winans
When we set out to create a fun and educational program that would spotlight millennial start-ups founded with a social conscience to improve society, I had no idea we would encounter such intense levels of passion, commitment and clarity of thought.
But that’s exactly what we got.
This week we kick off the second phase of the New Way to Startup competition and webisode series: a five-day accelerator in which five young start-ups will compete to see which one can produce the biggest breakthrough for their company, using the latest social and analytics tools and leveraging expert advice from onsite business pros. Continue Reading »
By John E. Kelly III
It’s amazing for me to recall that in 1980, when I came to IBM Research out of graduate school, engineers were striving to design chips containing 100,000 transistors – those tiny electronic switches that process and store data. Today, it’s common to put five or six billion transistors on a sliver of silicon.
This remarkable achievement is the fulfillment of a prediction made in 1965 by industry pioneer Gordon Moore: that the number of components on a chip would double every year for the foreseeable future. He later amended the time period to 24 months. His predictions, codified as Moore’s Law, have come to symbolize the seemingly inevitable march of technological progress – the ability to make all sorts of electronic devices faster, smaller and more energy efficient.
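The power of that doubling is easy to underestimate. A back-of-the-envelope sketch, using the post's 1980 figure of 100,000 transistors as the starting point (the function and its parameters are illustrative, not from the original):

```python
# Moore's Law sketch: transistor counts doubling every 24 months.
def projected_transistors(start_count, start_year, target_year, doubling_years=2):
    """Project a transistor count forward, doubling every `doubling_years` years."""
    doublings = (target_year - start_year) / doubling_years
    return start_count * 2 ** doublings

# From 1980 to 2015 there are 17.5 doublings:
count_2015 = projected_transistors(100_000, 1980, 2015)
print(f"Projected 2015 count: {count_2015:.2e}")  # about 1.85e+10
```

The projection lands within a few-fold of the five-to-six-billion chips described above, which is remarkable agreement over 35 years of compounding.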
While Gordon’s prediction proved to be more prescient than he could have imagined, today, 50 years later, the chip industry is no longer able to clear the high bar he set, due largely to limits imposed by the laws of physics. To put things bluntly: Moore’s Law is hitting a wall, and that collision holds significant consequences for business and society. Unless scientists and engineers come up with bold new approaches to chip architectures and materials, technological progress will slow.
To accelerate progress, we need to invent the next switch.
By Anjul Bhambhri
It’s estimated that 2.5 quintillion bytes of data are created every day from sources such as email and collaboration tools, posts to social media sites, digital pictures and videos, and purchase transactions, to name just a few.
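"Quintillion" is an unfamiliar unit for most readers; a quick conversion puts the figure in more common terms (decimal SI prefixes assumed):

```python
# Putting "2.5 quintillion bytes" in more familiar units.
DAILY_BYTES = 2.5e18    # 2.5 quintillion bytes (from the post)
EXABYTE = 10 ** 18      # 1 EB = 10^18 bytes
TERABYTE = 10 ** 12     # 1 TB = 10^12 bytes

print(f"{DAILY_BYTES / EXABYTE} exabytes per day")        # 2.5 exabytes
print(f"{DAILY_BYTES / TERABYTE:.1e} terabytes per day")  # 2.5e+06
```

In other words, 2.5 exabytes, or 2.5 million terabytes, every day.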
As the tools for making sense of Big Data become more widely – and more expertly – applied, and the types of data that are available for analysis diversify, the opportunity to use Big Data for social good intensifies. These massive datasets can be leveraged to better serve both the billions of people who generate the data, and ultimately the societies in which they live. Continue Reading »
By Chris Nay
On average, IBM bestows its top technical rank of Fellow upon only five employees per year. That adds up to 257 people who have earned the title over the program’s 52-year history. And those who hold it are recognized not only across IBM, but throughout the industry and around the world, for leading innovation that will change the future.
The Fellows program’s founder, Gardiner Tucker, recently recalled an example of the impact and scope of the program when discussing the work of Nathaniel Rochester, Fellow class of 1967. Continue Reading »
By Judy Murphy
One of the most stressful parts of a nurse’s job is the so-called “handover,” which occurs at the beginning of each shift – typically at 7 a.m. or 7 p.m.
In a matter of minutes, nurses have to find out which patients have been assigned to them, get reports from the nurses who handled those patients during the previous shift, and plan everything for their shift, from administering medications and scheduling procedures to giving baths and doing assessments – all the while staying aware of activities already on the books for each patient. Talk about multitasking! Continue Reading »
By Kyu Rhee, MD, MPP
There was an interesting decision to make within IBM about what to call a new business organization that we’re announcing today. Should it be named Watson Health or Watson Healthcare?
“Health” is an aspiration, for individuals and society. “Healthcare” describes an industry primarily focused on treating diseases.
While healthcare is essential, it represents just one of many factors that determine whether people live long and healthy lives. Some other critical factors are genetics, geography, behaviors, social/environmental influences, education, and economics. Unless society takes all of these factors into account and puts the individual at the center of the healthcare system, we won’t be able to make large-scale progress in helping people feel better and live longer. So, Watson Health it is. Continue Reading »