Instrumented. Interconnected. Intelligent.
Smarter Computing

IBM CEO Ginni Rometty

In these early days of the 21st century, Big Data, analytics, cloud, mobile and social technologies are transforming our world. This new era of computing provides the instrumentation, interconnection and intelligence that make it possible to build a smarter planet. But to do so, countries, cities, corporations and individuals need to rethink how they go about achieving their goals. Watch this video of IBM CEO Ginni Rometty laying out her vision of the path forward at the Council on Foreign Relations, along with her Q&A session with the audience. Join the conversation here and on Twitter at #IBM and #CFRlive. Here’s the speech.

[Embedded video]

Join us tomorrow to discuss how enterprises can take advantage of today’s cutting-edge technologies to become more competitive. IBM social business leader Jen Okimoto will lead a Smarter Friday chat on IBM’s People for a Smarter Planet Facebook page from 8:00 AM to 2:00 PM ET. IBM social business evangelist Sandy Carter will host a #P4SPchat from 12:00 PM to 1:00 PM ET on Twitter.


By Richard Silberman, Writer/Researcher, IBM Communications

During the past year, we’ve profiled nine exceptional “People for a Smarter Planet” who exemplify the spirit of change, innovation, creativity and curiosity that lies at the core of building a smarter planet. They are inventors and researchers, academics and executives, thought leaders, dreamers, risk-takers and pioneers.

These individuals come from a wide range of fields and possess an array of interests and expertise. What they all have in common is a passion for their work and a commitment to make the world a better place.

They include Ruhong Zhou, whose avian flu research may help prevent a global pandemic; Dave Bartlett, IBM’s smarter buildings guru; Bill Reichert, a Silicon Valley venture capitalist with novel advice for entrepreneurs; and sustainability expert Sarah Slaughter.

If you haven’t met them yet, here are nine People for a Smarter Planet you should know.


Dr. Lubomyr Romankiw, IBM Fellow in Electrochemical Technology, Micromagnetics and Microfabrication

By Richard Silberman, Writer/Researcher, IBM Communications

Without Lubomyr Romankiw, building a smarter planet would be much more difficult, if not impossible. Personal computers, smart phones, digital cameras and DVRs might have taken much longer to become a reality. ATMs, the Internet, Blue Gene and cloud computing might still be far-off fantasies.

The world as we know and enjoy it today – with its ubiquitous computers and data-storing devices – is almost unimaginable without the magnetic thin-film disk storage technology and the read-and-write magnetic head that Dr. Romankiw and Dr. David A. Thompson invented at IBM 40 years ago.

The thin-film magnetic recording head is the tiny component that reads and writes data in virtually every disk-based storage device made since 1979. Before Dr. Romankiw’s inventions of thin-film heads and the processing technology to fabricate them, data storage for even the most cutting-edge computers was cumbersome, slow and expensive.



Bernard Meyerson, Chief Innovation Officer, IBM

By Bernard Meyerson

It’s amazing when you look back over the 60+ years of the computing revolution and see how far we have come in such a relatively short time. The first electronic programmable computers, built in the 1940s, were essentially really fast electronic calculators. Then came the mainframe, the PC, the Internet and social networking. Today, we’re entering the era of cognitive computing–machines that help us think.

IBM’s Watson marks a turning point.  The former Jeopardy! TV quiz show champ is now reading millions of pages of medical text in preparation for going to work in healthcare. But while Watson can understand all manner of things and learns from its interactions with data and humans, it is just a first step into a new era of computing that’s going to produce machines that are as distinct from today’s computers as those computers are from the mechanical tabulating devices that preceded them. A host of technologies are coming that will help us overcome our limitations and will transform the way we interact with machines and with each other.

One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain. New technologies make it possible for machines to mimic and augment the senses. Today, we see the beginnings of sensing machines in self-parking cars and biometric security–and the future is wide open. This year, we focused the IBM Next 5 in 5, our 2012 forecast of inventions that will change your world in the next five years, on how computers will mimic the senses:

Touch: You will be able to reach out and touch through your phone
Sight: A pixel will be worth a thousand words
Hearing: Computers will hear what matters
Taste: Digital taste buds will help you to eat healthier
Smell: Computers will have a sense of smell

Join the Twitter conversation at #ibm5in5. Click here to vote on the coolest predictions, and check back on the blog Dec. 21 for the results.



John Potter, Vice President, Hosting, Applications and Cloud Solutions, AT&T Business Solutions

By John Potter

When I speak to enterprise CIOs about the cloud, one issue comes up in conversation more than any other: security.

As the momentum grows around cloud services, enterprises are starting to move toward this model of computing, recognizing the benefits they can gain in terms of flexibility and scalability. However, the anticipated revolution is more of a slow evolution with a significant number of large businesses still sitting on the sidelines. The main reason for their reluctance: concerns over reliability, performance, and most of all, security.

The cloud may be a relatively new concept, but these concerns aren’t. For businesses, customer data and intellectual property are often the currency with the highest value. They demand a cloud that lets them protect this data using the same enterprise-grade security they’ve experienced in their existing corporate networks. They want to know that their most important currency is protected as it travels to and from the cloud.

September 12th, 2012

By Harry Kolar, IBM Distinguished Engineer, Sensor-Based Solutions

The rough seas off the coast of Ireland, where the North Atlantic can churn up waves more than 15 meters high, are home to some of the largest concentrations of wave energy in the world. This turbulent seascape has for centuries served as both a sanctuary for marine life and a source of commerce and sustenance for the people of Ireland and Europe.

 

Now the waters of Galway Bay are providing something new: information.

After more than 18 months of design, development and research, the Sustainable Energy Authority of Ireland (SEAI), in association with IBM and the Marine Institute of Ireland, last month turned on a massive data collection system that will capture and analyze, in real time, the underwater noise levels of the bay.

Initially, the system will capture and analyze the ambient noise of the ocean to establish an acoustic baseline that includes both natural and anthropogenic (man-made) sound sources, such as vessel traffic. But the ultimate goal is to capture and analyze the sounds and vibrations of the hulking wave energy conversion machines that have begun bobbing off the coast, and to help determine what impact, if any, the sound from those devices could have on marine life, especially highly sensitive dolphin, porpoise and whale populations.
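
To make “establishing a baseline of acoustics” a little more concrete, here is a minimal Python sketch of the general idea: turn raw hydrophone samples into sound levels, average them into a baseline, and compare later recordings against it. The function names and numbers are illustrative assumptions only, not part of the SmartBay system, which is far more sophisticated.

# Toy sketch: build an ambient-noise baseline from hydrophone samples and
# compare later recordings against it. All values here are made up.

import math

def rms(samples):
    """Root-mean-square amplitude of one recording window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_db(samples, reference=1.0):
    """Sound level in decibels relative to an arbitrary reference amplitude."""
    return 20 * math.log10(rms(samples) / reference)

# Step 1: average the level over many windows of ambient recordings (waves,
# rain, distant vessel traffic) captured before the wave-energy devices run.
ambient_windows = [
    [0.02, -0.01, 0.03, -0.02, 0.01],
    [0.01, -0.02, 0.02, -0.01, 0.02],
    [0.03, -0.03, 0.02, -0.02, 0.01],
]
baseline_db = sum(level_db(w) for w in ambient_windows) / len(ambient_windows)

# Step 2: compare a new recording window against that baseline.
new_window = [0.20, -0.18, 0.22, -0.19, 0.21]   # a much louder window
excess_db = level_db(new_window) - baseline_db
print(f"Baseline {baseline_db:.1f} dB; new window is {excess_db:+.1f} dB above it")

The real system works with continuous streams rather than toy lists, but the flow is the same: characterize the ambient soundscape first, then measure how much the wave-energy devices add to it.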

This year-long project is an offshoot of an effort launched in 2009 in Galway Bay by IBM and SEAI, called SmartBay. While much has been written about both of these projects, little has been said about the technology behind them. The “smarts,” if you will, of the SmartBay.


August 28th, 2012

By Jean Noel Le Foll, General Manager, CFAO Technologies

Brazil, Russia, India, China, Turkey, South Africa and Mexico are the fastest-growing markets for computer equipment, making up 14% of the global IT market. The regions increasing their IT purchases the most are the Middle East, Eastern Europe and Africa, according to Forrester Research. A growing number of organizations in these emerging economies are relying on the IBM System z mainframe to build their IT infrastructures.

The Ministry of Senegal brought all of the country’s import and export processes online with System z, and now collects customs revenue equivalent to 30% of Gross National Product, about two billion Senegalese francs every day. In the process, the Ministry increased the performance of its systems by 70%, reduced power consumption by 20% and cut operating costs by 30%.

Customs officers in Senegal and their partners now have real-time access to information across all of the country’s border checkpoints. They can check whether the correct duty has been paid on shipments of goods coming through the main border checkpoints. This is a vast improvement over the Ministry’s previous system, which was limited to two checkpoints. The Ministry is using technology to put critical information to work in boosting the country’s economic growth.

My company, CFAO, also worked with the government of Cameroon to help build its infrastructure on the mainframe. The Cameroon Ministry of Finance is using a System z mainframe to support smarter banking and to modernize payroll processes for government employees across the country. The new system is helping to increase the security of the Ministry’s payroll and improve the efficiency of processes such as generating pay slips.



By Christopher Conradi, Business Analytics, IBM

If you own a car in North America, you’re told to change the oil every 3,700 miles or six months. This applies whether you live in Florida and drive peacefully to work, or live in Minnesota with frequent subzero temperatures in the winter. In Scandinavia, where I live, we change the oil in our vehicles less frequently out of concern for the environment.

But no matter what schedule you use, the point is that old-fashioned service manuals are not smart. Cars are used in different ways, and should therefore be serviced in different ways. And the same goes for any type of machinery.

[Embedded video]

Just because machines start out the same way doesn’t mean we should service them the same way. To determine how often a machine should be serviced, we need to consider the environment in which it operates and how it is being used. The trick, of course, is to figure out just that: where and how is it being used?

It all starts with collecting data. Sensors are becoming increasingly sophisticated. Using heat cameras, we can detect wear inside a ball bearing. Microphones can help us detect the slightest change in a motor’s frequency. And with accelerometers, small sensors that measure acceleration, we can record the motion of robotic arms and spot inconsistencies.

These sensors work much like the nervous system in our bodies. Each sensor on its own is somewhat useful, but when you start combining the sensory data from multiple sources with statistics and previous recordings, you really start to unlock their potential. Feeling the ground tremble, hearing a train horn, and seeing that you are standing on train tracks are of little value on their own, but combining the information might prove life-saving. With such input, you know that taking one step to the side is smarter than running along the tracks.

This is what we call predictive maintenance: measuring in real time how machines are doing and combining that data with statistics and knowledge to fix things before they break, not after. This gives customers the chance to plan for downtime and to make repairs before a faulty part affects others. In many cases, they can limit repairs to a few dollars instead of thousands.
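
As a rough illustration of the principle, and only that, here is a minimal Python sketch of how readings from several sensors might be combined with simple statistics from previous recordings to flag a part for service before it fails. The sensor names, values and threshold are hypothetical, not drawn from any IBM product.

# Minimal predictive-maintenance sketch: compare each sensor stream against
# its historical baseline and flag the machine when several streams drift
# together. All sensor names, values and thresholds are hypothetical.

from statistics import mean, stdev

# Previous recordings from a healthy machine, per sensor.
baseline = {
    "bearing_temp_c":  [41.0, 42.5, 40.8, 41.9, 42.1, 41.4],   # heat camera
    "motor_freq_hz":   [120.1, 119.8, 120.0, 120.2, 119.9],    # microphone
    "arm_vibration_g": [0.10, 0.12, 0.09, 0.11, 0.10],         # accelerometer
}

def z_score(value, history):
    """How many standard deviations a new reading sits from its baseline."""
    mu, sigma = mean(history), stdev(history)
    return abs(value - mu) / sigma if sigma else 0.0

def needs_service(latest, baseline, threshold=3.0):
    """Flag the machine when the combined evidence across sensors is strong.

    One mildly unusual reading is ignored; several sensors drifting at once
    (the trembling ground, the horn and the tracks together) push the
    average z-score over the threshold.
    """
    scores = {name: z_score(latest[name], hist) for name, hist in baseline.items()}
    return mean(scores.values()) > threshold, scores

# A new real-time reading: the bearing runs hot and the arm vibrates more.
latest = {"bearing_temp_c": 55.0, "motor_freq_hz": 121.5, "arm_vibration_g": 0.35}
flag, scores = needs_service(latest, baseline)
print(scores)
print("Schedule maintenance now" if flag else "No action needed")

In a real deployment the statistics would be far richer, with trend models and usage context such as climate and load, but the flow is the same: collect, compare with history, and act before the failure.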

This can also be applied to products already sold. A car manufacturer could put sensors in its cars that report on how each car is performing. This would produce a large dataset for finding faults and errors, which would help improve future products and make servicing the cars smarter. In other words, the customer could be told that a part is about to break before it actually does.

On a smarter planet, we will stop treating cars, or machines in general, as a homogeneous group. Since each one is used differently, it should be serviced based on the health of its parts, not on when the booklet says it’s due for servicing.

 

Please continue the conversation with Chris on Friday, September 12, on IBM’s People for a Smarter Planet Facebook page to learn more about predictive maintenance.


By Harry van Dorenmalen, Chairman, IBM Europe

Recently, IBM detailed two new computers that will help change the way the world, literally, works.

The first, Sequoia, is the world’s most powerful supercomputer, capable of calculating in one hour what would otherwise take 6.7 billion people using hand calculators 320 years to complete if they worked non-stop. It is installed at the National Nuclear Security Administration (NNSA)’s Lawrence Livermore National Laboratory in California.

The second, built for the Leibniz Supercomputing Centre in Germany, is the first commercial machine cooled with hot water. It will be used by scientists across Europe to drive a wide range of research, from simulating the blood flow behind an artificial heart valve to devising quieter aeroplanes.

What’s impressive about these machines is not just their massive processing power; they are remarkably energy efficient, too.



The Top500 ranking of supercomputers today recognized the Lawrence Livermore National Laboratory’s Sequoia as the fastest computer in the world. The computer, an IBM Blue Gene/Q, was designed to be extremely energy efficient. Like previous Blue Gene machines, it’s powered by low-frequency, low-power embedded PowerPC cores, in this case an astonishing 1.6 million of them. Sequoia produces 16 petaflops of computational muscle. That’s 16 quadrillion operations per second. It’s an important stepping stone on the way to exascale computing: machines that will be 50 times as fast as today’s fastest.
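
For readers who like to check the arithmetic, a quick back-of-the-envelope Python calculation, assuming each person performs one hand-calculator operation per second without stopping, connects the 16-petaflop figure to the “6.7 billion people for 320 years” comparison quoted in the previous post; the result lands in the same range.

# Back-of-the-envelope check, assuming one calculator operation per person
# per second, non-stop. The quoted figure likely assumes a slower rate.

sequoia_ops_per_sec = 16e15                     # 16 petaflops
ops_in_one_hour = sequoia_ops_per_sec * 3600    # ~5.8e19 operations

people = 6.7e9
seconds_per_year = 365.25 * 24 * 3600
human_ops_per_year = people * seconds_per_year  # ~2.1e17 operations

print(f"{ops_in_one_hour / human_ops_per_year:.0f} years")   # roughly 270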

[Embedded video]

Read a related post on the IBM Research blog.

 

