By Dario Gil
Silicon deserves a lot of credit for enabling the digital revolution. Silicon-based chips power everything from cell phones to supercomputers.
Light is another critical factor in our digital lives. Behind the scenes, fiber optic cables carry a flood of voice and data communications for the Internet, telephone lines and cable TV.
But I believe that the real magic happens when light and silicon meet: in the realm of silicon photonics.
IBM Research scientists and engineers have achieved a major milestone that could accelerate progress in this area. They have invented a silicon photonics device that combines electrical and optical components on a single chip, and which can be mass-produced using conventional chip manufacturing techniques. Read about the technical details here.
This breakthrough paves the way for game-changing advances in everything from high-performance computing to Internet-scale data centers. By easing data traffic jams in all sorts of computing and communications systems, our technology enables cloud computing and big data analytics to achieve their full potential.
By Steve Hamm
The idea of making machines modeled on the human brain has thrilled and confounded scientists since the earliest days of computing in the 1940s. The brain is a remarkable organ. Thanks to this spongy mass the size of a grapefruit, which uses just 20 watts of power, we humans understand complex concepts, navigate the physical world, and create marvelous things—from spacecraft to sonnets.
Not surprisingly, imitating the brain has proven to be incredibly difficult. Conventional computers don’t even try. They use linear logic and hard-wired circuitry to calculate, send messages, analyze data and organize knowledge, consuming enormous amounts of power while failing to match the brain’s protean capabilities.
But, today, we’re at a turning point in the history of computing. The SyNAPSE team at IBM Research, funded by the U.S. Defense Advanced Research Projects Agency and aided by scientists from several universities, has demonstrated a powerful yet energy-efficient neuromorphic chip that has the potential to help fulfill the dreams of the computer industry’s pioneers. “I hope this will inspire completely different thinking about what computing can do,” says Dharmendra S. Modha, IBM Fellow and principal investigator of the SyNAPSE Project.
An article about the breakthrough was published today by Science magazine.
By Chris Sciacca
To create more energy-efficient clouds, crunch Big Data faster and design smaller, instrumented devices for a smarter planet, we need a new generation of technologies. This new generation will require even further improvements at the nano-scale to create more efficient transistors.
But before these microscopic technologies go into mass production, new techniques are needed for creating microscopic prototypes smaller than 30 nanometers — the scale at which prototyping becomes increasingly difficult. In 2010, IBM scientists published a paper in the peer-reviewed journal Science demonstrating such a technique by creating a nano-sized map of the world, and now, in 2014, the research is coming to market.
By Steve Hamm
I have always loved science, though I was never that good at it in school. So it’s a major pleasure–as well as a bit ironic–for me to reveal that I’m one of fewer than one thousand people in the world who have moved an atom.
I got to do this a few weeks back in the nanoscience lab at IBM Research-Almaden in San Jose, Calif. The lab has been a hotbed of atom moving for decades, and a small team of scientists there is now pushing the boundaries of science in hopes of producing knowledge that will help people design and build quantum computers some day.
IBM Research scientist Robert Dennard, who at age 80 still comes to work at the lab nearly every day, has been awarded the Kyoto Prize—one of the world’s most prestigious recognitions for personal achievement. He will receive the Advanced Technology Prize in Electronics at a November ceremony in Japan.
Dennard, an IBM Fellow, is best known for inventing the memory chip in 1967. The simplicity, low cost and low power consumption of his invention, dynamic random access memory (DRAM), opened the door to the personal computer. Today, memory chips are used in every PC, laptop computer, game console and mobile communications device.
When IBM set about planning its first research lab in Africa, announced on Monday, one of the goals was to choose areas of science and technology that would truly resonate on the continent. IBMers consulted with government, university, business and civic leaders to identify their priorities. In the end, they decided that the initial focuses of the lab would be e-government, using technology to improve traffic and water systems and designing programs that would help build up science and technology skills. The strategy is to help create innovation ecosystems around things that matter in Africa.
The lab, IBM Research – Africa, will be located in Kenya’s capital, Nairobi, and is expected to eventually have additional branches in other countries. Even before establishing the lab, IBM Research has been active on the continent. For instance, through the mFarming project, it’s developing mobile data communications technologies that will provide Kenyan farmers with advisory services that can help them increase their yields.
IBM has been opening research laboratories in emerging markets since the mid-1990s, and, always, one of the goals is syncing with the concerns of the people and contributing to economic and social development. This approach reflects the company’s recognition that emerging markets don’t necessarily emerge on their own; sometimes countries and regions need help in capacity building. Simply put: You have to help bake the economic pie before you can cut yourself a slice.
by Professor Jacques Beauvais, Vice-rector for Research, Université de Sherbrooke and Dave Danovitch, Senior Technical Staff Member, IBM Bromont, Microelectronics Division
What do a smart phone, computer, navigation system and digital scanner have in common? Microchip technology.
Technology is now ingrained in every facet of our lives, with new products and applications commercialized at a rapid pace. Consider this: more than 2 billion people are already on the Internet, and by 2015 there will be more than 1 billion smart phones in circulation – in a world inhabited by 7 billion people.
This is opening the door to endless opportunities. Governments, universities and enterprises worldwide are now tapping into the development of the next generation of technologies, capable of transporting data with greater speed, quality and energy efficiency.
With massive investment in leading-edge research and development in the microelectronics sector, we will accelerate the go-to-market process. Already, activity in the industry is surfacing in the northeastern part of North America, stretching from East Fishkill in New York State to Bromont, in the Eastern Townships of Canada.
By Bernie Meyerson
IBM Vice-President for Innovation
Future generations face tremendous challenges, such as securing adequate supplies of energy, water and food. Similarly, gridlock will become all too common as the number of cars on the road, and the complexity of the roadways they occupy, increase almost exponentially.
Society’s sheer complexity has now grown beyond what individual humans can manage on their own. Modern society is a deeply entwined system of systems. We are building complex systems on top of others, and the only way to address the resulting complexity is through information technology. Without autonomic solutions to the vast challenges we face, somebody could sit staring at video images of traffic jams all day, or at data on an energy shortfall, and still not have a clue how to prevent them.
To address all these fundamental issues – the availability of food, the creation of jobs, and the very basic things we look for in society – we need to come up with information technology and the underlying software to enable us to understand and resolve them. This is why IBM continues to invest in fundamental research on smarter information technology systems. We’re extending that global investment today by establishing a new research and development centre in Canada, together with seven universities and investment from the federal and provincial governments.
This is the latest in an occasional series of posts about A New Era of Computing. A monumental shift is coming. Computing will be ubiquitous and machines will learn from their interactions with data and humans–essentially programming themselves. This leap will be enabled by advances in artificial intelligence, data analytics, computing systems and nanotechnology. It will result in a smarter, better planet.
Quantum computing has been a Holy Grail for researchers ever since Nobel Prize-winning physicist Richard Feynman challenged the scientific community in 1981 to build computers based on quantum mechanics. For decades, the pursuit remained firmly in the theoretical realm. But now scientists and entrepreneurs believe they’re on the cusp of building systems that will take computing to a whole new level. “The work we’re doing shows it’s no longer just a brute force physics experiment. It’s time to start creating systems based on this science,” says IBM scientist Matthias Steffen, part of a team at IBM Research that’s focused on developing quantum computing to a point where it can be applied to real-world problems.
Here’s Steffen explaining the latest breakthroughs:
An achievement made last year by IBM scientists can’t really compare with the largest collection of Charlie’s Angels memorabilia (5,569 items), the most body piercings in one session (3,900) or the longest cucumber (47 inches), all Guinness World Records, but IBM’s nanotech experts have attained a Guinness record of their own. Their feat: creating the smallest 3D map of Earth.
The map, produced on a tiny sliver of polymer, measures just 22 by 11 micrometers. To put that into perspective, 1000 copies of the map could fit within a single grain of salt.
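That grain-of-salt comparison holds up to simple arithmetic. Here is a minimal back-of-the-envelope check in Python, assuming a typical table-salt grain with a roughly 0.5 mm (500 micrometer) square face — the grain size is our assumption, not a figure from the record entry:

```python
# Back-of-the-envelope check: how many 22 x 11 micrometer maps
# tile one face of a salt grain?
map_area_um2 = 22 * 11        # map footprint: 242 square micrometers
grain_face_um2 = 500 * 500    # assumed 0.5 mm square grain face

copies = grain_face_um2 // map_area_um2
print(copies)  # on the order of 1000 copies per grain face
```

With those assumptions, just over a thousand copies fit on a single face of the grain, consistent with the article's claim.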
The Guinness World Records organization recognized the handiwork of IBM scientists in Zurich, Switzerland, and Almaden, Calif., in its new book, Guinness World Records 2012. (Officially, the book is no longer called The Guinness Book of World Records.)