By Dr. John E. Kelly III
This week, President Obama issued an executive order establishing the National Strategic Computing Initiative with the goal of ensuring that the United States leads in the field of high-performance computing. The initiative aims to produce computers capable of exascale performance: one billion billion (10^18) operations per second, orders of magnitude faster than today's most powerful computers.
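To put that figure in perspective, here is a back-of-envelope sketch (purely illustrative; the year-based comparison is our own, not from the initiative) of what 10^18 operations per second means:

```python
# One exaFLOPS = 10**18 floating-point operations per second.
EXA = 10**18

# For comparison, a petascale machine (10**15 ops/sec), roughly the
# class of the fastest supercomputers at the time of writing.
PETA = 10**15
print(f"Exascale / petascale speedup: {EXA // PETA}x")  # 1000x

# Counting one operation per second, working through a single
# exascale-second's worth of operations would take this many years:
seconds_per_year = 60 * 60 * 24 * 365
print(f"{EXA / seconds_per_year:.1e} years")  # ~3.2e10 years
```

In other words, an exascale system does in one second what a thousand of today's petascale machines would need a second to do together.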
IBM has been a pacesetter in large-scale computing ever since modern computers emerged in the 1940s. We have collaborated with the US government in producing and deploying computers in the national laboratories and government agencies that help the country retain its leadership in science and commerce, as well as safeguard national security.
We believe it won’t be possible to achieve exascale performance affordably and sustainably by following the path that computer scientists have been on for decades. Instead, it’s necessary to develop a bold new approach, which we call data-centric computing. It addresses both the modeling and simulation applications that are the traditional focus of the high-performance computing community and today’s new applications in big data analytics and cognitive computing.
By Dr. John Kelly III
World leaders from business, government and the non-profit sector are gathering this week in Nairobi, Kenya, for Global Entrepreneur Summit 2015, the first such summit to be held in sub-Saharan Africa. So it’s a good time to explore the potential for Africa and Africans to take advantage of the power of entrepreneurship and innovation to propel the continent forward.
IBM is committed to helping Africa fulfill its promise by providing information technologies to help address the continent’s challenges, through research collaborations with companies and universities, and by helping to foster innovation ecosystems in a number of cities.
By Guruduth Banavar
With thousands of scientists, engineers, and business leaders focused on cognitive computing across IBM Research and the IBM Watson Group, IBM is pursuing the most comprehensive effort in the tech industry to advance into the new era of computing. No other company has more people working on it, a broader array of research and development projects, or deeper expertise in so many of the most significant fields of inquiry.
Yet we understand that to accelerate progress in cognitive computing, we can’t do this alone. That’s why IBM has been pursuing a strategy of forming deep collaborative partnerships with academic scientists who are among the leaders in their fields, as well as by opening Watson as a technology platform for others to build on.
Humans have long dreamed of creating machines that think. More than 100 years before the first programmable computer was built, inventors wondered whether devices made of rods and gears might become intelligent. And when Alan Turing, one of the pioneers of computing in the 1940s, set a goal for computer science, he described a test, later dubbed the Turing Test, which measured a computer’s performance against the behavior of humans.
In the early days of my academic field, artificial intelligence, scientists tackled problems that were difficult for humans but relatively easy for computers, such as large-scale mathematical calculations. In more recent years, we have taken on tasks that are easy for people to perform but hard to describe to a machine, tasks humans solve “without thinking,” such as recognizing spoken words or faces in a crowd.
By Steve Hamm
Chief Storyteller, IBM
Wendy Hite is a bit of a food snob. She grew up in southwest Louisiana, where food and family are all mixed up in the great gumbo of life, and, for the longest time, she couldn’t imagine how she could improve on traditional Cajun-style cooking.
Until she met Chef Watson, that is.
She used the cognitive cooking discovery program to develop a crawfish deviled egg dish that was mighty tasty: familiar, in some ways, but also new to her. “This has been fun,” she says. “It gets you to try new things and to be more creative than you normally would be.”
By Arvind Krishna
Chemists at Unilever, the Anglo-Dutch consumer products giant, used to spend up to three months in their laboratories creating new formulations for liquid cleaning products. Now, they can perform the same work in 45 minutes or less, thanks to a collaboration among Unilever, one of the United Kingdom’s national laboratories and IBM.
Unilever product developers use iPads to set up tests and experiments, run simulations on an IBM Blue Gene/Q supercomputer at the UK’s Hartree Centre lab, and see their results in 3D visualizations that help them explore the data and make discoveries that otherwise might elude them.
This is an example of what’s possible when government, businesses and tech companies combine forces to bring the power of supercomputing and sophisticated data analytics to bear on business problems. It’s also an example of the kind of collaboration I expect to see flourish as a result of an agreement IBM is announcing today with Britain’s Science and Technology Facilities Council.
Social sharing, mobile computing and the Internet of Things have made data compression a part of our everyday lives. The process of compressing data is put to work every time a photo or video is shared across social media or a weather sensor reports a temperature change.
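The core idea behind lossless compression is simply to represent repetitive data in fewer symbols. As a purely illustrative sketch (a toy scheme, not the codecs actually used for photos or video, which are far more sophisticated), here is run-length encoding, one of the simplest lossless techniques:

```python
# Toy run-length encoder: collapse runs of repeated characters
# into (character, count) pairs, then expand them back.
from itertools import groupby

def rle_encode(data: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated characters into (char, count) pairs."""
    return [(ch, len(list(run))) for ch, run in groupby(data)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (char, count) pairs back into the original string."""
    return "".join(ch * n for ch, n in pairs)

sample = "aaaabbbcccccd"
encoded = rle_encode(sample)
print(encoded)  # [('a', 4), ('b', 3), ('c', 5), ('d', 1)]

# Decoding recovers the original exactly -- the "lossless" part.
assert rle_decode(encoded) == sample
```

A sensor stream of mostly unchanged temperature readings compresses very well under a scheme like this; image and video codecs exploit repetition in much cleverer ways, but to the same end.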
By Jeffrey Coveyduc and Emily McManus
Imagine being able to ask a panel of TED speakers: Will having more money make me happy? Will new innovations give me a longer life? A new technology from IBM Watson is set to help people explore the ideas inside TED Talks videos by asking the questions that matter to them, in natural language.
Users will be able to search the entire TED Talks library by asking questions. Then they’ll be offered segments from a variety of videos where their concepts are discussed. Below each clip is a timeline that shows more concepts that Watson found within the talk, so that users can “tunnel sideways” to view material that’s contextually related, allowing a kind of serendipitous exploration.
Today, IBM and TED are showing a demo of the technology at World of Watson, an IBM symposium in Brooklyn, New York, aimed at expanding the role of cognitive computing in society.
By Dr. Lukas Wartman
I have the dubious distinction of being a famous cancer patient. I’m an oncologist who specializes in leukemia; I got leukemia; and I’m cured, at least for now, thanks to advances in genomic medicine and the efforts of some brilliant physicians and researchers.
My health was broken. It took some of the best minds and science in the world to put me back together again.
Unfortunately, in spite of advances in gene sequencing and oncology, too few cancer victims have outcomes like mine. The genomic treatment I received, an example of precision medicine, simply isn’t scalable to millions of people right now.
This is where IBM Watson could help. With Watson’s cognitive computing capabilities, I hope oncologists like me will be able to quickly mine insights from the immense amount of genomic data becoming available about individual patients, and to use Watson to identify potential drugs that target each patient’s specific genetic profile.
By John E. Kelly III
It’s amazing for me to recall that in 1980, when I came to IBM Research out of graduate school, engineers were striving to design chips containing 100,000 transistors, those tiny electronic switches that process and store data. Today, it’s common to put five or six billion transistors on a sliver of silicon.
This remarkable achievement is the fulfillment of a prediction made in 1965 by industry pioneer Gordon Moore: that the number of components on a chip would double every year for the foreseeable future. He later amended the time period to 24 months. His predictions, codified as Moore’s Law, have come to symbolize the seemingly inevitable march of technological progress–the ability to make all sorts of electronic devices faster, smaller and more energy efficient.
While Gordon’s prediction proved to be more prescient than he could have imagined, today, 50 years later, the chip industry is no longer able to clear the high bar he set, due largely to limits imposed by the laws of physics. To put things bluntly: Moore’s Law is hitting a wall, and that collision holds significant consequences for business and society. Unless scientists and engineers come up with bold new approaches to chip architectures and materials, technological progress will slow.
To accelerate progress, we need to invent the next switch.