By John Potter
When I speak to enterprise CIOs about the cloud, one issue comes up in conversation more than any other: security.
As the momentum grows around cloud services, enterprises are starting to move toward this model of computing, recognizing the benefits they can gain in terms of flexibility and scalability. However, the anticipated revolution is more of a slow evolution with a significant number of large businesses still sitting on the sidelines. The main reason for their reluctance: concerns over reliability, performance, and most of all, security.
The cloud may be a relatively new concept, but these concerns aren’t. For businesses, customer data and intellectual property are often the currency with the highest value. They demand a cloud that lets them protect this data using the same enterprise-grade security they’ve experienced in their existing corporate networks. They want to know that their most important currency is protected as it travels to and from the cloud.
By Harry Kolar, IBM Distinguished Engineer, Sensor-Based Solutions
The rough seas off the coast of Ireland, where the North Atlantic can churn waves more than 15 meters high, are home to some of the largest concentrations of wave energy in the world. This turbulent seascape has for centuries served as both a sanctuary for marine life and a source of commerce and sustenance for the people of Ireland and Europe.
Now the waters of Galway Bay are providing something new: information.
After more than 18 months of design, development and research, the Sustainable Energy Authority of Ireland (SEAI), in association with IBM and the Marine Institute of Ireland, last month turned on a massive data collection system that will capture and analyze, in real time, the underwater noise levels of the bay.
Initially, the system will capture and analyze the ambient noise of the ocean to establish an acoustic baseline of natural and anthropogenic (man-made) sound sources, including vessel traffic. But the ultimate goal is to capture and analyze the sounds and vibrations of the hulking wave-energy conversion machines that have begun bobbing off the coast, and to help determine what impact, if any, the sound waves from those devices could have on marine life, especially the highly sensitive dolphin, porpoise and whale populations.
This year-long project is an offshoot of an effort launched in 2009 in Galway Bay by IBM and SEAI, called SmartBay. While much has been written about both of these projects, little has been said about the technology behind them. The “smarts,” if you will, of the SmartBay.
By Jean Noel Le Foll, General Manager, CFAO Technologies
Brazil, Russia, India, China, Turkey, South Africa and Mexico are the fastest growing markets for computer equipment, making up 14% of the global IT market. The regions increasing their IT purchases the most are the Middle East, Eastern Europe and Africa, according to Forrester Research. A growing list of companies in these emerging economies is relying on the IBM System z mainframe to build their infrastructures.
The Ministry of Senegal brought all of its import and export processes from across the country on-line with System z, and is now recovering 30% of Gross National Product, which amounts to two billion Senegalese francs in customs revenue every day. In the process, the Ministry increased the performance of its systems by 70%, reduced power consumption by 20% and cut operating costs by 30%.
Customs officers in Senegal and their partners now have real-time access to information across all of the country’s border checkpoints. They can check whether the correct duty has been paid on shipments of goods coming through the country’s main border checkpoints. This is a vast improvement over the Ministry’s previous system, which was limited to two border checkpoints. The Ministry is using this critical information to help boost the country’s economic growth.
My company, CFAO, also worked with the government of Cameroon to help build its infrastructure on the mainframe. The country’s Ministry of Finance is using a System z mainframe to support smarter banking and to modernize the payroll processes for government employees. The new system is helping to increase the security of the Ministry’s payroll system and improve the efficiency of processes such as generating pay slips.
If you own a car in North America, you’re told to change the oil every 3,700 miles or six months. This applies whether you live in Florida, driving peacefully to work, or in Minnesota, with frequent subzero temperatures in the winter. In Scandinavia, where I live, we change the oil in our vehicles less frequently because of concern about the environment.
But no matter what schedule you use, the point is that old-fashioned service manuals are not smart. Cars are used in different ways, and should therefore be serviced in different ways. And the same goes for any type of machinery.
Just because machines start out the same doesn’t mean we should service them the same way. To determine how often a vehicle needs servicing, we need to consider the environment in which it operates and how it is being used. The trick, of course, is to figure out just that: where and how are they being used?
It all starts with collecting data. Sensors are becoming increasingly sophisticated. Using heat cameras, we can detect wear inside a ball bearing. Microphones can help us detect the slightest change in the frequency of a motor, and with accelerometers (small sensors that measure acceleration) we can record the motion of robotic arms and spot inconsistencies.
These sensors work much like the nervous system in our bodies. Each sensor on its own is somewhat useful, but when you combine the sensory data from multiple sources with statistics and previous recordings, you really start to unlock their potential. Feeling the ground tremble, hearing a train horn, and seeing that you are standing on train tracks are of little value on their own, but combining that information might prove lifesaving. With such input, you know that taking one step to the side is smarter than running along the tracks.
This is what we call predictive maintenance: measuring, in real time, how machines are doing and combining that data with statistics and knowledge to fix things before they break, not after. This gives customers the chance to plan for downtime and to make repairs before faulty parts affect others. In many cases, they can limit repairs to a few dollars instead of thousands.
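The “combine the senses” idea can be sketched in a few lines of code. This is a minimal illustration, not any vendor’s implementation: the sensor names, baseline values and thresholds are all hypothetical, and the anomaly test is a simple z-score against recordings from a healthy machine.

```python
# Minimal predictive-maintenance sketch (hypothetical sensors and values):
# compare live readings against baseline statistics and flag a part for
# service only when several signals drift from normal at the same time.
from statistics import mean, stdev

# Baseline recordings for a healthy machine: bearing temperature in deg C,
# motor frequency in Hz, arm vibration in g. Illustrative values only.
baseline = {
    "bearing_temp_c": [61.0, 62.5, 60.8, 63.1, 61.9, 62.2],
    "motor_freq_hz":  [49.9, 50.1, 50.0, 49.8, 50.2, 50.0],
    "vibration_g":    [0.11, 0.09, 0.12, 0.10, 0.11, 0.10],
}

def z_score(sample, history):
    """How many standard deviations a reading sits from its baseline."""
    return abs(sample - mean(history)) / stdev(history)

def needs_service(readings, threshold=3.0, min_signals=2):
    """Flag maintenance when multiple sensors drift from normal together,
    mirroring the idea that combined signals beat any single one."""
    drifting = [name for name, value in readings.items()
                if z_score(value, baseline[name]) > threshold]
    return len(drifting) >= min_signals, drifting

# A hot bearing plus rising vibration together suggest wear, even though
# either signal alone might just be noise.
flag, signals = needs_service(
    {"bearing_temp_c": 78.0, "motor_freq_hz": 50.1, "vibration_g": 0.31})
print(flag, signals)
```

A real system would use far richer models (trend analysis, failure-mode statistics, fleet-wide comparisons), but the shape is the same: per-sensor baselines, a drift measure, and a rule that fires only when multiple signals agree.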
The same approach can be applied to products already sold. A car manufacturer could put sensors in its cars to report on how each car is performing. That would yield a large dataset for finding faults and errors, which would help evolve future products and make servicing the cars smarter: letting the customer know that a part is about to break before it actually does.
On a smarter planet, we will stop treating cars, or machines in general, as a homogeneous group. Since each one is used differently, each should be serviced based on the health of its parts, not when the booklet says it’s due for servicing.
By Harry van Dorenmalen
Chairman, IBM Europe
Two new IBM supercomputers stand out. The first, Sequoia, is the world’s most powerful supercomputer, capable of calculating in one hour what would otherwise take 6.7 billion people using hand calculators 320 years to complete if they worked non-stop. It is installed at the National Nuclear Security Administration (NNSA)’s Lawrence Livermore National Laboratory in California.
The second is the first commercial machine, cooled by hot water, built for the Leibniz Supercomputing Centre in Germany. It will be used by scientists across Europe to drive a wide range of research − from simulating the blood flow behind an artificial heart valve, to devising quieter aeroplanes.
What’s impressive about these machines is not just their massive processing power; they are remarkably energy efficient, too.
The Top500 ranking of supercomputers today recognized the Lawrence Livermore National Laboratory’s Sequoia as the fastest computer in the world. The computer, an IBM Blue Gene/Q, was designed to be extremely energy efficient. Like previous Blue Gene machines, it’s powered by low-frequency, low-power embedded PowerPC cores; in this case, an astonishing 1.6 million of them. Sequoia produces 16 petaflops of computational muscle. That’s 16 quadrillion operations per second. It’s an important stepping stone on the way to exascale computing: machines that will be 50 times as fast as today’s fastest.
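As a sanity check, the hand-calculator comparison quoted earlier lines up with the 16-petaflop figure, if we assume each person performs one calculation per second:

```python
# Back-of-envelope check: does 16 petaflops square with the claim that
# 6.7 billion people on hand calculators would need 320 years to match
# one hour of Sequoia? Assume one calculation per second per person.
PETAFLOP = 10**15
sequoia_ops_per_hour = 16 * PETAFLOP * 3600          # ~5.8e19 operations

people = 6.7e9
seconds_per_year = 365.25 * 24 * 3600
human_ops = people * 1 * 320 * seconds_per_year      # ~6.8e19 operations

# The two figures agree to within roughly 20 percent, so the comparison
# holds at about one calculation per second per person.
print(f"{sequoia_ops_per_hour:.2e} vs {human_ops:.2e}")
```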
Read a related post on the IBM Research blog.
When IBM unveiled its Smarter Planet agenda in late 2008, government and business leaders in Poland were intrigued, but the global financial crisis made it difficult for them to act on their positive impulses. Today, in spite of lingering concerns about the situation in Western Europe, the Smarter Planet concepts are starting to gain traction–especially with government leaders.
The Polish central government is launching an e-health initiative, a new citizen ID program and a new electronic tax filing system. “Smart is all about how to make the citizen’s life easier, safer and more ecologically sustainable,” says Anna Sienko, IBM’s general manager for Poland and the Baltic countries.
Poland is one of the fastest-growing economies in Europe right now, and business and government leaders are determined to stimulate growth through innovation. ICM, a research institute affiliated with the University of Warsaw, does its own research in everything from weather prediction to quantum computing but also provides computational power for other researchers throughout Poland.
By Andras Szakal
IBM US Federal CTO
A smarter government is more agile, more able to effectively respond to changing government needs and citizen dynamics. One of the best ways to improve the way our government works – both its operational efficiency as well as the services it provides to citizens – is through cloud computing.
Yesterday I participated in the Congressional High-Tech Caucus Cloud Task Force’s “Cloud Computing: A Primer” in Washington, DC as part of an industry panel which tackled issues critical to cloud utilization. The event was designed to help our legislators understand how to optimize IT and lower costs, reducing government waste. I was excited to be able to take this message to Congress, and appreciated the opportunity to join Rep. Michael McCaul (R-TX) and Rep. Doris Matsui (D-CA), co-chairs of the High Tech Caucus.
As citizens, we have a lot of reason to be excited about the promise of cloud computing to help our government operate more efficiently. We like to feel that our tax dollars are hard at work and that maximum value is being squeezed out of every penny. Rapidly evolving advances in cloud technologies, in areas such as resource pooling, virtualization and operational automation, should be harnessed to transform and consolidate government data centers, ensuring more effective use of resources and lower operational costs.
By Dr. John E. Kelly III
IBM Senior Vice President and Director of IBM Research
When I was a child, my father worked at General Electric’s research lab in Niskayuna, N.Y. I would visit and watch him tinker with vacuum tubes—light bulb-like devices that were used to direct electrical current in all sorts of gizmos, from radios and TVs to radar and computers. At the time, I didn’t fully understand what he was doing, but those visits inspired me to study science and, ultimately, to get degrees in physics and materials engineering.
I later came to understand that I had witnessed one of the great transitions in the history of technology. While my dad was showing me vacuum tubes, other engineers at GE’s lab were experimenting with the vacuum tube’s successor, the transistor, which ultimately ushered in modern electronics and personal computing. Those core technologies enabled computers that could be programmed to perform a wide variety of tasks.
Today, we are at the dawn of another epochal shift in the evolution of technology. At IBM Research, we call it the era of cognitive systems.
This is a big deal. The changes that are coming over the next 10 to 20 years, building on IBM’s Watson technology, will transform the way we live, work and learn, just as programmable computing has transformed the human landscape over the past 60-plus years. You could even call this the post-computing era.