By Drew Johnson
As the Internet of Things (IoT) and Machine-to-Machine (M2M) communications connect millions of diverse machines over networks, the goal is to make the combination of those machines greater than the sum of its parts – and to provide people with greater information and insight as the ecosystem expands.
To achieve this level of interconnectivity, businesses that depend on those machines need them to work reliably, securely, and cost-effectively – without human intervention. That’s where an unexpected technology function comes in to help: crowdsourcing.
Crowdsourcing typically conjures up images of people-driven programs, like traffic information gathered from thousands of commuters or weather reports created by people supplying pictures and information from their mobile devices.
By Jeffrey Welser
One of the watershed moments in the history of computing took place on Dec. 9, 1968. Douglas Engelbart and his team at Stanford Research Institute presented a technology demonstration that included the first public showings of the computer mouse, hypertext, dynamic file linking and shared-screen collaboration over a network. Those advances turned out to be essential building blocks for personal computing and the Internet, and the event came to be called “The Mother of All Demos.”
While only history will say for sure, I think we saw the glimmer of a similar new beginning last week at IBM Research – Almaden, in Silicon Valley. The IBM Cognitive Systems Colloquium signaled a shift from a singular focus on the von Neumann computing architecture, which has dominated computer science and the computer industry since the mid-1940s, to new architectures modeled on the human brain.
By Jonathan Schaeffer
At the just-concluded G20 summit in Brisbane, Australia, the leaders of the 20 major economies in the world agreed to “take strong and effective action” on climate change.
Still, at this critical juncture in the history of our planet, it is essential that the scientific world continue to document the dramatic climate changes occurring all across the globe.
One technological area gaining wider use is remote sensing. Today sensors are powerful and inexpensive, network access to remote data is increasing, scientific models are improving, and “big data” algorithms for crunching the numbers are more accessible.
By Michael Dixon
While there is always interest in the exciting innovations in cities – such as intelligent transportation systems – the backbone of any city operation consists of efficient water pipes and reliable electrical wires.
The availability, delivery and consumption of natural resources like energy and water are far more important to cities than a new fleet of buses. Optimizing resources is particularly relevant for cities because of their impact on both livability and resilience.
By Laurent Auguste
With more than half of the world’s population living in urban areas, cities have proven to be the winning model.
But the massive influx into cities leads to higher population densities, greater complexities and increased pressures on local resources, such as water.
In the future, successful cities will be those that have created local and global access to Big Data, a source of game-changing dynamics. New city models will turn the passive pipes of city infrastructure into active ones, transcending their current use and freeing up untapped value.
By Jeff Schick
For more than 30 years, email has been stuck in a rut. It’s still basically a list of messages that we plow through all day, every day—in our private and professional lives. The important stuff is hidden among the trivial and the routine. Sure, you can fiddle with rankings and do rudimentary searches, but, for all the time we spend dealing with our email, it’s one of the least-evolved computer activities around. Think of it as a tax on your brain.
I probably speak for many people when I say that the first word that comes to mind when I think of email is “frustration.” Actually, the word that comes to mind is less polite than that. That high level of collective frustration is what drove a talented team of software engineers and user experience designers at IBM to reimagine the domain—putting people and relationships at the center of things.
By Dr. John E. Kelly III
The microprocessor was one of the most important inventions of the 20th century. Those chips of silicon and copper have come to play such a vital role that they’re frequently referred to as the “brains” of the computer. Today’s computer designs put the processor at the center.
But the needs of businesses and society are changing rapidly, so the computer industry must respond with a new approach to computer design—which we at IBM call data-centric computing. In the future, much of the processing will move to where the data resides, whether that’s within a single computer, in a network or out on the cloud. Microprocessors will still be vitally important, but their work will be divided up.
This shift is necessary because of the explosion of big data. Every day, society generates an estimated 2.5 billion gigabytes of data—everything from corporate ledgers to individual health records to personal Tweets.
By Michael Nova M.D.
To describe me as a health nut would be a gross understatement. I run five days a week, bench press 275 pounds, do 120 pushups at a time, and surf the really big waves in Indonesia. I don’t eat red meat, I typically have berries for breakfast and salad for dinner, and I consume an immense amount of kale—even though I don’t like the way it tastes. My daily vitamin/supplement regimen includes Alpha-lipoic acid, Coenzyme Q and Resveratrol. And, yes, I wear one of those fitness gizmos around my neck to count how many steps I take in a day.
I have been following this regimen for years, and it’s an essential part of my life.
For anybody concerned about health, diet and fitness, these are truly amazing times. There’s a superabundance of health and fitness information published online. We’re able to tap into our electronic health records, we can measure just about everything we do physically, and, thanks to the plummeting price of gene sequencing, we can map our complete genomes for as little as $3,000 and get readings on smaller chunks of genomic data for less than $100.
Think of it as your own personal health big-data tsunami.
One of the most intriguing elements of the new era of cognitive computing is the development of brain-inspired technologies. Those are technologies that mimic the functioning of the neurons, axons and synapses in the mammalian brain with the goal of interpreting the physical world and processing sensory data: sight, sound, touch and smell. Today’s IBM Research Cognitive Systems Colloquium at IBM Research – Almaden is focusing on this realm of the cognitive computing world. Please come back for frequent reports and updates, and join the conversation at #cognitivecomputing.
One of the most intriguing research projects at the Almaden lab over the past decade has been the development of a neurosynaptic microchip modeled on the workings of the brain. Funded since 2008 by the U.S. Defense Advanced Research Projects Agency’s SyNAPSE initiative, a team at Almaden led by Dharmendra S. Modha created not only a radically new chip architecture but a new approach to creating software applications.
Tomorrow, their work begins the transition from a science research project to a technology that’s on its way into the commercial marketplace.