When IBM set about planning its first research lab in Africa, announced on Monday, one of the goals was to choose areas of science and technology that would truly resonate on the continent. IBMers consulted with government, university, business and civic leaders to identify their priorities. In the end, they decided that the initial focuses of the lab would be e-government, using technology to improve traffic and water systems, and designing programs that would help build up science and technology skills. The strategy is to help create innovation ecosystems around things that matter in Africa.
The lab, IBM Research – Africa, will be located in Kenya’s capital, Nairobi, and is expected to eventually have additional branches in other countries. Even before establishing the lab, IBM Research has been active on the continent. For instance, through the mFarming project, it’s developing mobile data communications technologies that will provide Kenyan farmers with advisory services that can help them increase their yields.
IBM has been opening research laboratories in emerging markets since the mid-1990s, and, always, one of the goals is syncing with the concerns of the people and contributing to economic and social development. This approach reflects the company’s recognition that emerging markets don’t necessarily emerge on their own; sometimes countries and regions need help in capacity building. Simply put: You have to help bake the economic pie before you can cut yourself a slice.
by Professor Jacques Beauvais, Vice-rector for Research, Université de Sherbrooke and Dave Danovitch, Senior Technical Staff Member, IBM Bromont, Microelectronics Division
What do a smart phone, computer, navigation system and digital scanner have in common? Microchip technology.
Technology is now ingrained in every facet of our lives, with new products and applications commercialized at a rapid pace. Consider this: more than 2 billion people are already on the Internet, and by 2015 there will be more than 1 billion smart phones in circulation, in a world inhabited by 7 billion people.
This is opening the door to endless opportunities. Governments, universities and enterprises worldwide are now investing in the development of the next generation of technologies, able to move data with greater speed, quality and energy efficiency.
Massive investment in leading-edge research and development in the microelectronics sector will accelerate the go-to-market process. Activity is already surfacing across the northeastern part of North America, stretching from East Fishkill in New York State to the Eastern Townships in Bromont, Canada.
By Bernie Meyerson
IBM Vice-President for Innovation
Our future generations face tremendous challenges such as the availability of adequate supplies of energy, water, and food. Similarly, gridlock will become all too common as the number of cars on the road, and the complexity of the roadways they occupy, increase almost exponentially.
Society’s sheer complexity has now gone beyond what individual humans can deal with themselves. Modern society is a deeply entwined system of systems. We are building complex systems on top of others, such that the only way to address the resulting complexity is through information technology. Without autonomic solutions to the vast challenges we face, somebody could sit staring at video images of traffic jams all day, or data on an energy shortfall, and yet not have a clue what to do to prevent them.
To address all these fundamental issues – the availability of food, the creation of jobs, and the very basic things we look for in society – we need to come up with information technology and the underlying software to enable us to understand and resolve them. This is why IBM continues to invest in fundamental research on smarter information technology systems. We’re extending that global investment today by establishing a new research and development centre in Canada, together with seven universities and investment from the federal and provincial governments.
This is the latest in an occasional series of posts about A New Era of Computing. A monumental shift is coming. Computing will be ubiquitous and machines will learn from their interactions with data and humans–essentially programming themselves. This leap will be enabled by advances in artificial intelligence, data analytics, computing systems and nanotechnology. It will result in a smarter, better planet.
Quantum computing has been a Holy Grail for researchers ever since Nobel Prize-winning physicist Richard Feynman challenged the scientific community in 1981 to build computers based on quantum mechanics. For decades, the pursuit remained firmly in the theoretical realm. But now scientists and entrepreneurs believe they’re on the cusp of building systems that will take computing to a whole new level. “The work we’re doing shows it’s no longer just a brute force physics experiment. It’s time to start creating systems based on this science,” says IBM scientist Matthias Steffen, part of a team at IBM Research that’s focused on developing quantum computing to a point where it can be applied to real-world problems.
Here’s Steffen explaining the latest breakthroughs:
An achievement made last year by IBM scientists can’t really compare with the largest collection of Charlie’s Angels memorabilia (5,569 items), the most body piercings in one session (3,900) or the longest cucumber (47 inches), all Guinness World Records, but IBM’s nanotech experts have attained a Guinness record of their own. Their feat: creating the smallest 3D map of Earth.
The map, produced on a tiny sliver of polymer, measures just 22 by 11 micrometers. To put that into perspective, 1,000 copies of the map could fit within a single grain of salt.
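As a rough sanity check on that claim, the arithmetic works out, assuming a typical grain of salt measures about 0.5 mm on a side (an illustrative approximation, not a figure from the post):

```python
# Rough sanity check: how many 22 x 11 micrometer maps tile one face
# of a grain of salt? The ~0.5 mm (500 micrometer) grain edge is an
# assumed typical value; real grains vary in size.
map_area_um2 = 22 * 11        # area of one map, in square micrometers
salt_face_um2 = 500 * 500     # one face of a 0.5 mm salt grain

copies = salt_face_um2 / map_area_um2
print(round(copies))          # on the order of 1,000 copies
```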
The Guinness World Records organization recognized the handiwork of IBM scientists in Zurich, Switzerland, and Almaden, Calif., in its new book, Guinness World Records 2012. (Officially, it is no longer called the Guinness Book of World Records.)
By Dario Gil
When IBM’s Watson defeated two past champions on TV’s Jeopardy! game show last February, it awoke many people to the awesome power of computing. Watson demonstrates that computers are at last becoming learning systems–capable of consuming vast amounts of information about the world, learning from it and drawing conclusions that can help humans make better decisions.
At IBM Research, we believe that learning systems will shape the future of information science and the IT industry, and that Watson represents a very significant step on that journey.
But every innovator needs a target to aim for, so, after the Jeopardy! challenge, we’re searching for the next “grand challenge” that will drive the next advances in information technology. To help shape our thinking, we’re engaging in a conversation about the future of computing with scientists and business leaders at an IBM Research Colloquium on Friday at the lab in Yorktown Heights, N.Y. The questions we’re asking are straightforward: What should the next grand challenge be? How should we design it? How should we pursue it?
We want to cast a wider net, as well. The Jeopardy! contest inspired a team of IBM and university researchers to create a system that could beat the best Jeopardy! champions. What “grand challenge” would you choose? Hopefully, the colloquium and follow-up conversations will help us set an audacious goal.
(To follow live blogging from the colloquium from 10 a.m. to 5:15 p.m. on Friday, bookmark this page and come back when the event is live.)
by Bernie Meyerson, IBM Fellow and VP of Innovation
One of the stubborn facts about the laws of physics is that they apply pretty much universally around the globe. For someone who has spent a career at the forefront of the semiconductor industry, shrinking generation upon generation of chip technology, facing up to the laws of physics, or more aptly having them come home to roost, can be difficult, particularly for those still on the treadmill of the past.
This is perhaps why a tendency persists among some in the chip industry to continue to reassure users of information technology that Moore’s Law or some vestige thereof survives. This famous axiom, coined by Intel co-founder Gordon Moore, predicts that the number of transistors that can be placed on a chip will double about every two years. Even if this remains true for some short time into the future, it will not continue for several reasons: 1) atoms don’t scale; 2) silicon devices go “quantum mechanical” at dimensions of about 7 nanometers; and 3) light is getting too slow and electrical signaling even slower.
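To illustrate why the roughly 7-nanometer limit arrives so quickly under Moore’s Law, here is a minimal sketch. Doubling transistor density means shrinking linear feature size by a factor of √2 each generation; the 22 nm starting node, the 2012 start year, and the strict two-year cadence are illustrative assumptions, not figures from the post:

```python
import math

# Illustrative sketch: under Moore's Law, transistor density doubles
# about every two years, so linear feature size shrinks by sqrt(2) per
# generation. The 22 nm / 2012 starting point and strict biennial
# cadence are assumptions for illustration only.
node_nm = 22.0
year = 2012
while node_nm > 7.0:              # ~7 nm is where silicon devices go "quantum mechanical"
    node_nm /= math.sqrt(2)       # density doubles -> dimensions shrink by sqrt(2)
    year += 2
    print(f"{year}: ~{node_nm:.1f} nm")
```

Under these assumptions the projected feature size drops below 7 nm within about four generations, which is why the post argues the axiom cannot hold for more than a short time.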