By Cody P. Schafer
When the Linux kernel was first released by Linus Torvalds in 1991, it set in motion a fundamental shift in software development. Today, more than two decades later, Linux is the primary open source operating system that serves as the foundation for much of the technological innovation that consumers and businesses leverage daily.
As a member of the newest generation of Linux developers, I’m in a unique position at a very interesting time for the tech industry.
As the technological underpinnings of our world have increasingly come to rely on Linux, more and more companies have come to understand the value of the open source model and have built dedicated efforts around the projects.
In fact, the commitment that IBM has made to Linux is a good reflection of the industry as a whole.
(This week IBM furthers that commitment, with the announcement of a new $1 billion investment in Linux development, including the OpenPOWER project, as well as the opening of a new Power Systems Linux Center in Montpellier, France, and the new Linux on Power Development Cloud.)
Two attributes that have propelled this acceptance and adoption are the flexibility and openness of the development process itself. In the Linux world, for example, any person can get a new feature into the operating system, so long as they can convince the community that it is useful and properly written.
By contrast, in the Windows world, only Microsoft develops the kernel, and only Microsoft can add globally available features to it. The same goes for Apple, and even for the various proprietary Unix variants. With the Linux kernel, thousands of developers from different companies all work on the pieces they care about.
Such support gives systems manufacturers the ability to develop new interfaces and drivers even before their hardware is released, getting those new systems working better for their customers faster.
And in this age of Big Data and cloud, where issues of performance, bandwidth, and efficiency are becoming increasingly critical, these capabilities are enabling organizations to respond with far greater agility. To that end, the development and refinement of virtualization technologies continues to flourish.
To be sure, virtualization has become a requirement for any hardware that is going to be used in large server deployments like data centers. For example, IBM’s POWER processor has long had robust virtualization support through PowerVM. We’re now taking that even further and contributing to the Kernel-based Virtual Machine (KVM), the virtualization technology for the Linux kernel.
With KVM, we can take advantage of all the work done to make Linux faster and more reliable, and in turn make our virtualization of systems (which often run Linux themselves) even faster and more reliable.
As system hardware advances, our computers and servers have gained components tuned for doing specific work quickly. We have processor instructions designed to do encryption, for example, and devices that can act as reprogrammable hardware, all built directly into the system itself. For applications to take advantage of either, we need some way to reach that hardware, and that’s what the Linux kernel provides.
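As one small illustration of that idea (a sketch of mine, not something from this piece): on Linux, the kernel advertises CPU feature flags through /proc/cpuinfo, so a program can check for hardware AES instructions before choosing a faster code path. The helper name and the sample text below are hypothetical; the flag format matches what /proc/cpuinfo actually emits on x86.

```python
# Minimal sketch: reading the CPU feature flags that the Linux kernel
# exposes via /proc/cpuinfo. The helper name and sample text are
# illustrative, not from any particular library.

def cpu_flags(cpuinfo_text):
    """Return the set of feature flags from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # Lines look like: "flags : fpu sse sse2 aes avx ..."
            return set(line.split(":", 1)[1].split())
    return set()

# Abbreviated sample in the format /proc/cpuinfo uses, so the sketch
# runs anywhere; a real program would read the file instead.
sample = """processor : 0
model name : Example CPU
flags : fpu sse sse2 aes avx"""

flags = cpu_flags(sample)
print("aes" in flags)  # does the CPU advertise hardware AES?
```

On a real system you would pass `open("/proc/cpuinfo").read()` instead of the sample string; the kernel is what makes that uniform text interface available to every application, regardless of vendor.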
Beyond the new ways that Linux is being utilized to create more powerful hardware and software, the operating system and kernel have entered a time of stability. Because of this maturity, it’s safe to say that development on Linux will likely continue to be an important piece of the developer’s pedigree and a foundation of the next era of data center innovation.
Personally, I like it because it’s easy, available and accessible. Plus, it’s a thrill to play around with stuff that runs small things like lights and phones, all the way up to the world’s stock exchanges and supercomputer installations. So, the experience is not only valuable for career consideration, but it’s a satisfying hobby, as well, to geeks like me.