Just a few years ago, much of the software that IBM sold was operating systems and middleware. Vital stuff, to be sure, but not very sexy. The move to analytics has changed things. For example, we provide some really nifty software for New York’s U.S. Open Tennis Championships, which kick off today and will build to a crescendo in two weeks with the finals.
IBM’s sponsorship of the championships gives us the opportunity to showcase amazing new technologies for some of the most sophisticated tennis fans in the world. During this year’s championships, fans and broadcasters alike will be able to enjoy matches with a depth of understanding far beyond anything they have experienced at the tournament before. That’s thanks to U.S. Open PointStream, a new match analysis feature on the U.S. Open Web site.
PointStream represents a great leap forward for tennis fans. Last year, fans received a wealth of statistical information about players and matches on the site. But now, thanks to PointStream, they can access deep analysis spelling out what each player needs to do to increase their chances of winning a match, how the match is going in real time and when the momentum is shifting.
PointStream also signals a new level of technical sophistication emerging worldwide that is deepening our understanding of nearly every human endeavor. Thanks to new analytics capabilities, people are able to gather huge quantities of pertinent information about nearly any topic, extract insights, and get up-to-the second updates about what’s happening and why. At IBM, we call this the smarter planet.
When we started talking about the smarter planet nearly three years ago, it was a vision of what could be. Now, after more than 2,000 engagements with clients, it’s a firm reality.
Another person for a smarter planet
Amidst moving to a new country, starting a new school, making new friends, and digging into an intensive scientific research project, Alexander Amini still had time for tennis. Generally two hours a day, several days a week.
“I started playing tennis before I can remember,” Amini said. “I’ve always had a tennis racquet in my hand.”
Amini’s love of tennis served him well when he decided to enter Ireland’s national high school science competition–the BT Young Scientist and Technology Exhibition (BTYSTE)–upon moving to Dublin from New York state with his family last fall.
He enrolled in Castleknock College, a private boys high school, and devoted himself to writing software that can identify a tennis player’s strokes based on data transmitted from wireless sensors worn on the body.
“In my project I was able to automatically detect 13 different tennis strokes with an average accuracy of 95 percent,” Amini said. “For four of those strokes, the accuracy was above 99 percent.”
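The post does not describe how Amini's software works, but the general idea of classifying strokes from body-worn sensor readings can be sketched with a toy nearest-centroid classifier. Everything here is an illustrative assumption — the stroke names, the two features, and the centroid values are invented for the example and are not his method:

```python
import math

# Hypothetical "training" centroids: mean (wrist acceleration in g,
# racquet-head speed in m/s) for a few stroke types. Invented values.
CENTROIDS = {
    "serve":    (9.0, 40.0),
    "forehand": (6.0, 30.0),
    "backhand": (5.5, 27.0),
    "volley":   (3.0, 12.0),
}

def classify_stroke(wrist_accel_g, racquet_speed_mps):
    """Label a single stroke sample with the nearest centroid."""
    best, best_dist = None, float("inf")
    for stroke, (a, s) in CENTROIDS.items():
        d = math.hypot(wrist_accel_g - a, racquet_speed_mps - s)
        if d < best_dist:
            best, best_dist = stroke, d
    return best
```

A real system would extract many more features from the raw sensor stream and train on labeled strokes, but the shape of the problem — map a feature vector to one of a handful of stroke classes — is the same.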
Last January, after four months of hard work, Amini won top prize out of a field of 513 entries and was named BT Young Scientist and Technologist 2011. In September, he will represent Ireland at the 23rd European Union Contest for Young Scientists in Helsinki, Finland.
Even though energy mavens have been advocating smart meter deployments for nearly a decade, an IBM survey of 10,000 people in 15 countries shows that consumers are confused about what a smart grid is and what it means to them. It’s startling new evidence that if you want a smarter planet, you have to communicate better about it.
Sixty percent of those surveyed did not know the meaning of the terms “smart grid” or “smart meters.” Half of them didn’t understand the term “time of use pricing,” which is essential to understanding the benefits these technologies offer, such as improved reliability, lower costs and increased efficiency. Thirty percent were unaware of the basic mechanism used for charging for electricity–the amount paid per kilowatt-hour.
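Time-of-use pricing is simple arithmetic once the terms are clear: the utility charges a higher rate per kilowatt-hour during peak demand hours and a lower rate the rest of the day. The rates and usage figures below are invented for illustration; real tariffs vary by utility:

```python
# Assumed two-tier tariff (illustrative numbers, not any real utility's rates).
PEAK_RATE = 0.20      # $ per kWh during peak hours, e.g. 2 p.m. to 8 p.m.
OFF_PEAK_RATE = 0.08  # $ per kWh during all other hours

def monthly_bill(peak_kwh, off_peak_kwh):
    """Total charge under a simple two-tier time-of-use tariff."""
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

# A household using 300 kWh at peak and 500 kWh off-peak pays
# 300 * 0.20 + 500 * 0.08 = $100, versus $120 at a flat $0.15/kWh --
# which is the incentive a smart meter makes visible.
```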
This confusion helps explain why the consumer uptake has been slower than hoped for, according to Michael Valocchi, IBM’s vice president for Global Energy & Utilities. His prescription: Utilities, regulators, government officials and technology companies need to go back to basics when communicating with consumers. “Today, the industry is focused on engineering and regulatory matters. All the companies in the ecosystem have to connect better with the consumer.”
The following is a guest post authored by Ben Hodges, Associate Professor, University of Texas at Austin Center for Research in Water Resources.
Although many of us are sweltering in record-breaking heat, a recent Wall Street Journal story about the race to shore up aging, damaged levee systems along the Mississippi River reminds us that flood season is just around the corner. And according to the U.S. Army Corps of Engineers, the multi-billion dollar restoration won’t be done by spring.
Deciding where to begin is a complex task. But with the right mix of technology and expertise, engineers could have a snapshot of how a river and its tributaries will behave in flood situations and other extreme weather conditions, allowing them to prioritize levee restoration efforts according to which areas are at highest risk of flooding, and when that’s likely to happen.
This new flood prediction technology can simulate tens of thousands of river branches at a time and could scale further to predict the behavior of millions of branches simultaneously. By coupling analytics software with advanced weather simulation models, such as IBM’s Deep Thunder, municipalities and disaster response teams could make emergency plans and pinpoint potential flood areas on a river.
Floods are the most common natural disaster in the United States, but traditional flood prediction methods are focused only on the main stems of the largest rivers – overlooking extensive tributary networks where flooding actually starts, and where flash floods threaten lives and property.
As a testing ground, the team is currently applying the model to the entire 230-mile-long Guadalupe River and more than 9,000 miles of its tributaries in Texas. In a single hour, the system can generate up to 100 hours of river behavior.
By combining IBM’s complex system modeling with UT Austin’s research into river physics, we’ve developed new ways to look at an old problem. Unlike previous methods, the IBM approach scales up for massive networks and has the potential to simulate millions of river miles at once. With the use of river sensors integrated into web-based information systems, we can take this model even further.
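The branch-level structure of the problem can be illustrated with a toy model: represent the river as a tree of reaches, where each reach's discharge is its own local inflow plus whatever its upstream tributaries deliver. The network, the inflow numbers, and the simple steady-state summation below are invented for illustration; the actual IBM/UT Austin model solves real river physics over tens of thousands of branches:

```python
# Each reach lists the reaches that drain into it (a tree of tributaries).
UPSTREAM = {
    "main_stem": ["trib_a", "trib_b"],
    "trib_a":    ["creek_1", "creek_2"],
    "trib_b":    [],
    "creek_1":   [],
    "creek_2":   [],
}

# Local (lateral) inflow into each reach, in cubic meters per second.
LOCAL_INFLOW = {
    "main_stem": 5.0,
    "trib_a":    2.0,
    "trib_b":    8.0,
    "creek_1":   1.5,
    "creek_2":   0.5,
}

def discharge(reach):
    """Steady-state discharge at a reach's outlet: local inflow plus
    everything delivered by the reaches upstream of it."""
    return LOCAL_INFLOW[reach] + sum(discharge(u) for u in UPSTREAM[reach])
```

Even in this toy form, the tree structure shows why flooding that starts in small tributaries matters: every upstream creek's flow ends up in the main stem.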
In addition to flood prediction, a similar system could be used for irrigation management, helping to create equitable irrigation plans and ensure compliance with habitat conservation efforts. The models could allow managers to evaluate multiple “what if” scenarios to create better plans for handling both droughts and water surplus.
Since IBM CEO Sam Palmisano visited a team of scientists at IBM Research – Almaden a few weeks ago, he has been joking that they have designed a new computer chip that’s “as large as a worm’s brain.” It’s a good quip. Why would highly talented researchers spend their time creating a chip like that? It’s also quite an understatement. True, the chip isn’t very smart, in itself. But it signals what could be the beginning of a major new computing architecture that complements today’s computers. When Palmisano tells the story, he makes sure his audience knows how proud he is of IBM’s researchers.
The chip, a product of IBM’s three-year-old SyNAPSE project, could become a building block for a new generation of computers designed to emulate the animal brain’s abilities for sensing and cognition–all the while consuming many orders of magnitude less power and space than today’s computers. “We believe we have reproduced the core circuit of the brain in silicon,” says Dharmendra Modha, program lead for IBM Research’s cognitive computing department at the Almaden lab. “All mammal brains are built on the same blueprint. We believe that we have found the core design that encapsulates the key architectural principles of the brain.”
Twenty years ago, Finnish graduate student Linus Torvalds launched a revolution–but he didn’t know he was doing so at the time. He posted a notice on a computing message board saying he was creating a kernel, or central utility, for a “free” computer operating system. He planned on using components from the GNU open-source software portfolio and using a popular open-source license called GPL, so people could freely use and contribute to the software. He named his operating system Linux and invited anybody who wished to contribute code. This was “just a hobby,” he wrote.
Today Linux is one of the most important pieces of software on the planet. It runs the computers for major Web sites including Facebook, Amazon.com and Google; powers 75% of the stock exchanges worldwide; and is a core technology in 95% of the world’s supercomputers. Linux runs in many mobile phones and is a core ingredient of cloud computing.
And Torvalds? He’s a fellow at the Linux Foundation, which is the organization that coordinates Linux development and promotion. He works at his home office in Portland, Oregon, presiding over the continuing development of the kernel. Torvalds agreed to answer a few questions by email about Linux and what he’s up to. He declined to address big open-ended questions, such as what Linux has accomplished. He leaves such judgments to others.
Question: On the 20th anniversary, how do you feel about Linux and your role in its development?
Torvalds: I’m very happy with how things are going. Twenty years into it, it’s still interesting and it’s still challenging. And it’s different, and it’s never gotten to be some boring daily grind. And while my role in it has changed from being a core developer to being more of a manager (but without the logistical side to it; I don’t need to “take care” of people, only worry about the technical side), I still feel that I add value, so I’m happy.
Here’s a video commemorating the anniversary from the Linux Foundation:
It’s amazing to me to think that August 12 marks the 30th anniversary of the IBM Personal Computer. The announcement helped launch a phenomenon that changed the way we work, play and communicate. Little did we expect to create an industry that ultimately peaked at more than 300 million unit sales per year. I’m proud that I was one of a dozen IBM engineers who designed the first machine, and I was fortunate to lead subsequent IBM PC designs through the 1980s. It may be odd for me to say this, but I’m also proud IBM decided to leave the personal computer business in 2005, selling our PC division to Lenovo. While many in the tech industry questioned IBM’s decision to exit the business at the time, it’s now clear that our company was in the vanguard of the post-PC era.
I, personally, have moved beyond the PC as well. My primary computer now is a tablet. When I helped design the PC, I didn’t think I’d live long enough to witness its decline. But, while PCs will continue to be much-used devices, they’re no longer at the leading edge of computing. They’re going the way of the vacuum tube, the typewriter, the vinyl record, the CRT and the incandescent light bulb.
PCs are being replaced at the center of computing not by another type of device—though there’s plenty of excitement about smart phones and tablets—but by new ideas about the role that computing can play in progress. These days, it’s becoming clear that innovation flourishes best not on devices but in the social spaces between them, where people and ideas meet and interact. It is there that computing can have the most powerful impact on economy, society and people’s lives.
Internships have long been a routine way for students to get a taste of the workaday world, and for companies to try to win the affections of top talent on the verge of making career decisions. But there are internships and then there are INTERNSHIPS.
IBM’s Extreme Blue falls into the second category. Groups of four students spend 12 weeks working on a real-world technology problem, writing a business plan, and producing a prototype product or service that might end up being a piece of IBM’s portfolio. Since the program started in 1999, students have generated more than 500 patent submissions, helped create more than 50 new product capabilities, and contributed more than 15 pieces of software to the open-source community. This year, 48 students in the US and Canada participated in the program, which wrapped up today with a series of presentations and product demos.
I asked some of the students to describe what was most valuable about their Extreme Blue experience. Here’s what they had to say:
Aya Catalina Ibarra