By Ben Goldhirsh
With major technological advancements coming at an accelerated pace, success in our increasingly global economy depends more and more on intellectual property assets.
Patents, copyrights, and trademarks play a vital role in the economies of developed countries – in fact, intellectual property (IP) has been a key factor in those economies' initial development. Increasingly, emerging markets are seeing the value of fostering and keeping their own IP to help spur innovation, and provide both large and small firms with technologies that will drive success. This creation of competitive products and services that results from intellectual property ownership benefits not only consumers but society and the economy as a whole.
By Paul Michel
America’s Founding Fathers considered patents important enough to put the patent right to exclude into the U.S. Constitution. In Article 1, Section 8, they listed patent protection above even the establishment of an Army and Navy. Their sequencing of priorities for Congress to address was not accidental, but reflected their plan for transforming the new nation from a poor, agrarian, former colony into the wealthy, independent, industrial and commercial power it became.
So in April 1790, the first Congress enacted the first Patent Act. Over the next two centuries, the Act was amended and strengthened regularly, because successive Congresses observed industrialization and economic growth all around them, as under the Founders’ system, the United States went from importing nearly all manufactured goods to itself manufacturing all the products it needed and prospering as a major net exporter.
Within just a little over one hundred years, America surpassed all other nations in wealth and technology, partly because of its strong patent system, aided by wide oceans, abundant natural resources, and universal public education. Throughout the 19th century American inventors outpaced their counterparts elsewhere. During the 20th century, the American patent system helped stimulate the computer revolution as well as astonishing advances in medicine, including creation of whole new fields, such as biotechnology. After a slump in the 1970s, when Japan replaced America as the leading maker of consumer electronics, in the last two decades of the century our nation regained its rapid growth and technological leadership.
By Adam Mossoff
The America Invents Act (AIA) was signed into law in September 2011, and it is rightly recognized as “the most significant reform of the U.S. patent system since 1836.” The AIA’s provisions are not even fully implemented yet — its ink isn’t even dry, as we used to say in the analog world — but people are calling for more changes and reforms to the patent system.
This push for change is largely due to the widespread belief today that the “patent system is broken,” a trope repeated in many hyperbolic newspaper accounts and blog postings. The din about the “broken patent system” has become so incessant that USPTO Director David Kappos recently stated in a speech at the Center for American Progress, “Give it a rest already. Give the AIA a chance to work. Give it a chance to even get started.”
By Bernard Meyerson
As IBM’s chief innovation officer, I’m especially proud to reveal today that the company has accomplished a remarkable achievement: It has been awarded the largest number of United States patents for the 20th year in a row. IBM’s scientists and engineers racked up 6,478 patents last year, and nearly 67,000 patents over the past two decades.
The sheer number and diversity of these patents matters. It shows that a lot of truly novel thinking is going on at IBM’s global research and development labs in a wide variety of fields—from nanotechnology and computer systems design to business analytics and artificial intelligence, and beyond.
Yet volume alone doesn’t tell the whole story. What good is a pile of patents if it doesn’t change the world? That’s why we set our research priorities and make our investments with the goal of producing maximum global impact.
Today, we’re focused on a new era in Information Technology that is now in its early stages, but one that will continue to roll out over the next two decades. We call it the era of cognitive systems. We believe that the benefits of this new era will arrive sooner and stronger if companies, governments and universities adopt a culture of innovation that includes making big bets, fostering disruptive innovations, taking a long-term view and collaborating across institutional boundaries. That last part is crucial. What’s needed is radical collaboration—large-scale efforts to find common cause and share resources, expertise and ideas across the borders between companies and institutions.
Innovation isn’t about “me” anymore—one person, one company, or even one country. It’s about “we.”
Computers have tremendous capacities for storing information and performing numerical calculations—far superior to those of any human. Yet, when it comes to other capabilities, including creativity, computers are woefully inferior to people. But a young IBM Research scientist, Lav Varshney, believes that before too long computers will indeed be creative.
His work concerning the sense of taste and food recipes, which we featured in our IBM Next 5 in 5 predictions last week, was highlighted on National Public Radio’s All Things Considered broadcast on Christmas Day.
By Steve Hamm
Patience is one of the most important virtues a researcher can possess, but some scientific pursuits require an almost preternatural calm in the face of monumental challenges. Case in point: quantum computing.
Scientists have been trying to grasp this holy grail of computing ever since Nobel Prize-winning physicist Richard Feynman in 1981 challenged the scientific community to build computers based on quantum mechanics. Matthias Steffen, the manager of an experimental quantum computing project at IBM Research, believes the key to persevering in a project like this is keeping an open mind. “You can’t stubbornly keep pursuing a path because you’re invested in it personally,” he says. “Take a breather, and be open to making changes in your approach–potentially drastically.”
Our scientists dreamed up the IBM Next 5 in 5, our 2012 forecast of inventions that will change your world in the next five years. The focus this year is on the coming era of cognitive systems, and how computers will mimic the senses. You voted. And the winner is “Sight: A pixel will be worth a thousand words.” The prediction garnered 37% of the votes, edging out “Hearing: Computers will hear what matters.”
In case you missed seeing the entire list, here are the other predictions:
Touch: You will be able to reach out and touch through your phone
Taste: Digital taste buds will help you to eat healthier
Smell: Computers will have a sense of smell
Join the Twitter conversation at #ibm5in5.
By Richard Silberman, Writer/Researcher, IBM Communications
Without Lubomyr Romankiw, building a smarter planet would be much more difficult, if not impossible. Personal computers, smartphones, digital cameras and DVRs may have taken much longer to become a reality. ATMs, the Internet, Blue Gene and cloud computing might still be far-off fantasies.
The world as we know and enjoy it today – with its ubiquitous computers and data-storing devices – is almost unimaginable without the magnetic thin-film disk storage technology and the read-and-write magnetic head that Dr. Romankiw and Dr. David A. Thompson invented at IBM 40 years ago.
The thin-film magnetic recording head is the tiny component that reads and writes data in virtually every disk-based storage device made since 1979. Before Dr. Romankiw’s inventions of thin-film heads and the processing technology to fabricate them, data storage for even the most cutting-edge computers was cumbersome, slow and expensive.
By Bernard Meyerson
It’s amazing when you look back over the 60+ years of the computing revolution and see how far we have come in such a relatively short time. The first electronic programmable computers, built in the 1940s, were essentially really fast electronic calculators. Then came the mainframe, the PC, the Internet and social networking. Today, we’re entering the era of cognitive computing–machines that help us think.
IBM’s Watson marks a turning point. The former Jeopardy! TV quiz show champ is now reading millions of pages of medical text in preparation for going to work in healthcare. But while Watson can understand all manner of things and learns from its interactions with data and humans, it is just a first step into a new era of computing that’s going to produce machines that are as distinct from today’s computers as those computers are from the mechanical tabulating devices that preceded them. A host of technologies are coming that will help us overcome our limitations and will transform the way we interact with machines and with each other.
One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain. New technologies make it possible for machines to mimic and augment the senses. Today, we see the beginnings of sensing machines in self-parking cars and biometric security–and the future is wide open. This year, we focused the IBM Next 5 in 5, our 2012 forecast of inventions that will change your world in the next five years, on how computers will mimic the senses:
Touch: You will be able to reach out and touch through your phone
Sight: A pixel will be worth a thousand words
Hearing: Computers will hear what matters
Taste: Digital taste buds will help you to eat healthier
Smell: Computers will have a sense of smell
Join the Twitter conversation at #ibm5in5. Click here to vote on the coolest predictions, and check back on the blog Dec. 21 for the results.
By Scott Burnett
Director, Global Consumer Electronics
Ever since I graduated from college in 1981 and began my professional career by selling analog magnetic recording tape to movie studios and music companies, the promise of digital technology has been in the wind. That was the year of the IBM Personal Computer. Each technical advance since then–the CD, the DVD, laptops, mp3 players, interactive TV, smartphones, tablets—has helped enable a convergence of computing, communications and entertainment for the consumer.
Today, finally, the long-anticipated digital convergence is fundamentally in place. Thanks to innovations in cloud computing and mobility, we have just about all the computing, communications and entertainment we want, where and when we want it.
But it still isn’t easy to meld all of these capabilities together. Some possibilities are more difficult to fulfill than they should be. For instance, how many people can use their smartphone to do something as seemingly simple as setting up a recording on their TV DVR when they’re not home?