By Dr. Erica Ollmann Saphire
The current outbreak of the Ebola virus is the largest in history, and has been described by the World Health Organization as “the most severe acute public health emergency seen in modern times.”
While previous outbreaks have ended when the disease was contained and disappeared from the human population, the scope of the 2014 outbreak raises the possibility that the virus, rather than disappearing again, could become endemic – permanently persisting in human populations in one or more areas.
We all knew it was coming eventually, and IBM predicted it would happen this year – as indeed it did: more Thanksgiving shoppers turned to their mobile devices than their desktops to browse through all of the Thanksgiving deals.
Specifically, the IBM Digital Analytics Benchmark showed 52.1 percent of Thanksgiving traffic coming from mobile devices, a 22 percent jump from last year. Even more strikingly, in 2010 only 6.5 percent of traffic came from mobile. That is an eight-fold increase in traffic in just four years.
On the sales side, mobile devices accounted for 32.3 percent of online sales, a 25.4 percent year-over-year increase – an even larger percentage jump than the growth in mobile traffic.
By Paul Papas
Picture an architect laboring over a blueprint, or an auto designer working out the basics of next year’s model. Once upon a time, this mental image probably included a drafting table and a clay model, but not much else.
With some variation, those were creative tools that designers, architects and artists relied on to render their inspirations, refine them into concepts, and finalize them into market-ready products.
Fast forward to the era of high-performance computing, which has radically transformed the creative process in pharmaceuticals, automotive, and government R&D – fields where the trials, mistakes and amazing breakthroughs are rendered, tested and proven in silico before they are realized in factories.
By Wayne Balta
After years of progress, deforestation of the Amazon basin in Brazil has increased for the past two years running. It rose by 29% in the last recorded year, according to a recent report from the Brazilian government.
The Nature Conservancy, which is the largest environmental advocacy group in the world, has adopted a promising approach to addressing deforestation, which it calls “conservation with development.”
By Jonathan Schaeffer
At the just-concluded G20 summit in Brisbane, Australia, the leaders of the 20 major economies in the world agreed to “take strong and effective action” on climate change.
Still, at this critical juncture in the history of our planet, it is essential that the scientific world continue to document the dramatic climate changes occurring all across the globe.
One technological area gaining wider use is remote sensing. Today sensors are powerful and inexpensive, network access to remote data is increasing, scientific models are improving, and “big data” algorithms for crunching the numbers are more accessible.
By Jeff Schick
For more than 30 years, email has been stuck in a rut. It’s still basically a list of messages that we plow through all day, every day—in our private and professional lives. The important stuff is hidden among the trivial and the routine. Sure, you can fiddle with rankings and do rudimentary searches, but, for all the time we spend dealing with our email, it’s one of the least-evolved computer activities around. Think of it as a tax on your brain.
I probably speak for many people when I say that the first word that comes to mind when I think of email is “frustration.” Actually, the word that comes to mind is less polite than that. That high level of collective frustration is what drove a talented team of software engineers and user experience designers at IBM to reimagine the domain—putting people and relationships at the center of things.
By Dr. John E. Kelly III
The microprocessor was one of the most important inventions of the 20th century. Those chips of silicon and copper have come to play such a vital role that they’re frequently referred to as the “brains” of the computer. Today’s computer designs put the processor at the center.
But the needs of businesses and society are changing rapidly, so the computer industry must respond with a new approach to computer design—which we at IBM call data-centric computing. In the future, much of the processing will move to where the data resides, whether that’s within a single computer, in a network or out on the cloud. Microprocessors will still be vitally important, but their work will be divided up.
This shift is necessary because of the explosion of big data. Every day, society generates an estimated 2.5 billion gigabytes of data—everything from corporate ledgers to individual health records to personal Tweets.
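To put that figure in perspective, here is a quick back-of-the-envelope calculation (using the 2.5-billion-gigabyte daily estimate cited above) converting it into more familiar units:

```python
# Rough arithmetic on the scale cited above: 2.5 billion gigabytes per day.
daily_gb = 2.5e9                    # gigabytes generated per day (estimate from the text)
daily_exabytes = daily_gb / 1e9     # 1 exabyte = 1 billion gigabytes
per_second_gb = daily_gb / 86_400   # 86,400 seconds in a day

print(f"{daily_exabytes} exabytes per day")
print(f"{per_second_gb:,.0f} GB every second")   # roughly 29,000 GB/s
```

In other words, society generates about 2.5 exabytes a day – tens of thousands of gigabytes every second – which is exactly why moving all of it to a central processor is becoming untenable.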
By Michael Nova M.D.
To describe me as a health nut would be a gross understatement. I run five days a week, bench press 275 pounds, do 120 pushups at a time, and surf the really big waves in Indonesia. I don’t eat red meat, I typically have berries for breakfast and salad for dinner, and I consume an immense amount of kale—even though I don’t like the way it tastes. My daily vitamin/supplement regimen includes Alpha-lipoic acid, Coenzyme Q and Resveratrol. And, yes, I wear one of those fitness gizmos around my neck to count how many steps I take in a day.
I have been following this regimen for years, and it’s an essential part of my life.
For anybody concerned about health, diet and fitness, these are truly amazing times. There’s a superabundance of health and fitness information published online. We’re able to tap into our electronic health records, we can measure just about everything we do physically, and, thanks to the plummeting price of gene sequencing, we can map our complete genomes for as little as $3000 and get readings on smaller chunks of genomic data for less than $100.
Think of it as your own personal health big-data tsunami.
One of the most intriguing elements of the new era of cognitive computing is the development of brain-inspired technologies. These are technologies that mimic the functioning of the neurons, axons and synapses in the mammalian brain, with the goal of interpreting the physical world and processing sensory data: sight, sound, touch and smell. Today’s IBM Research Cognitive Systems Colloquium at IBM Research – Almaden is focusing on this realm of the cognitive computing world. Please come back for frequent reports and updates, and join the conversation at #cognitivecomputing.
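As a loose illustration of the neuron-and-synapse abstraction such technologies emulate – a sketch of a generic leaky integrate-and-fire model, not IBM's actual chip design – a single spiking neuron can be simulated in a few lines:

```python
# Minimal leaky integrate-and-fire neuron: an illustrative sketch of the
# neuron/synapse abstraction, not the architecture of any IBM chip.
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Integrate weighted input currents over time; emit a spike (1) when
    the membrane potential crosses the threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate, with leakage
        if potential >= threshold:
            spikes.append(1)   # fire
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.4, 0.4, 0.4, 0.0, 0.8, 0.5]))  # → [0, 0, 1, 0, 0, 1]
```

The neuron stays silent until accumulated input pushes it over threshold, then fires and resets – the event-driven behavior that makes such hardware far more power-efficient than conventional clocked processors.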
One of the most intriguing research projects at the Almaden lab over the past decade has been the development of a neurosynaptic microchip modeled on the workings of the brain. Funded since 2008 by the U.S. Defense Advanced Research Projects Agency’s SyNAPSE initiative, a team at Almaden led by Dharmendra S. Modha created not only a radically new chip architecture but also a new approach to creating software applications.
Tomorrow, their work begins the transition from a science research project to a technology that’s on its way into the commercial marketplace.