Why Choose Linux?
The Linux operating system was developed and released as free, open source software. When it was first introduced, many in the IT community dismissed the project as, at best, an academic pursuit with little real-world application; at worst, a gimmick undertaken by nerds with too much time on their hands.
However, almost two decades later, Linux-based systems and applications are seeing worldwide and ever-increasing use by government agencies, universities, insurance companies, auto manufacturers, e-commerce sites, and many others. Almost incredibly, Linux has even stolen a small but growing share of the desktop market from mighty Microsoft!
The widespread deployment of Linux systems has spawned a corresponding need for Linux training. Accordingly, many IT training programs have been expanded to provide at least some degree of specific Linux training. In fact, more and more companies are now looking for employees whose IT training includes a Linux certification. Currently, three major organizations offer curricula leading to such a certification: the Linux Professional Institute, CompTIA (Linux+), and Red Hat.
Although it is now clear that Linux has become a major, highly competitive alternative to the traditionally bundled commercial systems, some may still be puzzled as to how this has occurred. To understand why an entity might prefer open source Linux to a bundled commercial system, one must consider the key factors that would influence the choice.
1. Cost

Linux is free; commercially available products, on the other hand, come with an (often hefty) price tag. “You get what you pay for” is a well-known marketplace truism that might promote suspicion of freeware. However, this truism applies more to products that are merely priced low than to products that are actually free, since a free product always carries the implicit invitation “it doesn’t cost anything to try it”. Particularly in a down economy, this is an extremely powerful inducement to investigate a product.
2. Quality

It had long been thought that freeware products, developed through the casually collaborative efforts of programmers donating their time, must perforce be of lesser quality than commercial products developed in full accordance with rigorous, time-tested, industry-approved procedures, standards, and test requirements, as one would surely expect from a company like Microsoft. However, after decades of widespread use, the stability and reliability of Linux systems have been shown to compare well with, and perhaps even surpass, those of similar commercially available systems.
There might be many reasons for this. For instance, one might suppose that a programmer who is voluntarily donating his time would have a strong personal interest in the work itself. Such individuals are often called “true believers” and are predisposed to producing work of the highest possible quality. Moreover, the expectation that one’s work will be scrutinized by other true believers must surely be a powerful additional motivation to do the best possible job; one’s reputation might be at stake!
On the other hand, a typically anonymous programmer working in a team environment at one of the large code factories can generally expect to be paid the same, at least in the short term, whether the end product is wonderful, adequate, or even junk. Moreover, in such an environment, even if a programmer writes the best code imaginable, he would probably still find it difficult to distinguish himself. He would therefore have little reason to be personally concerned with the quality of the end product, beyond what is necessary to satisfy his immediate superiors.
One must therefore surmise that product quality at a commercial entity is not so much a natural byproduct of the desire and labor of individual programmers as it is an enterprise goal dependent upon the efficacy of established development procedures and upon how well the development team has followed them, as well as on the applicability and thoroughness of corresponding test programs. In effect, a commercial entity attempts to guarantee a minimum level of quality by setting a bar defined by its procedures and tests.
However, such a quality system contains implied hazards: the stated procedures may not be the best; they may not be well suited to a particular project; and strict adherence to predefined development procedures may have the effect of stifling creativity and innovation. Moreover, once the bar is achieved there is generally little motivation for pursuing further quality gains, even if such were believed possible.
3. Performance

Although some may try to argue the point, it seems undeniable that the huge, all-encompassing operating systems developed by Microsoft and other commercial entities perform better, in most quantifiable respects, than do Linux systems. However, institutions searching for a system that will provide a better “bang for the buck” are often not put off by this.
Consider, for example, an expensive, high performance roadster versus a more moderately priced, subcompact vehicle. Unsurprisingly, the high performance roadster will perform better in most respects. However, for many purposes the subcompact performs, if not as well, then at least acceptably well, and at far less cost.
Admittedly, this analogy is loose. Microsoft, in particular, sells different tiers of bundled operating system software, with each tier priced accordingly, in order to better match an entity’s needs. Theoretically, an entity should purchase the lowest tier that meets its needs. This is all well and good. However, with each tier comprising as much functionality as possible, a Microsoft buyer may still find that a good deal of his system capability is not being used.
On the other hand, Linux users can download a small, simple core system and then add capability as needed. This almost guarantees that a good performance fit is eventually achieved, free of charge.
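The “download a core system, then add capability at need” approach can be sketched with a Debian-style package manager. The commands below are standard `apt-get` usage, but the package names are purely illustrative, and other distributions use different tools (dnf, pacman, zypper, and so on):

```shell
# Starting from a minimal base installation, add only what this
# machine will actually use. (Debian/Ubuntu shown; nginx and
# postgresql are example packages, not recommendations.)

# Refresh the package index.
sudo apt-get update

# Add a web server, skipping optional extras it would drag in.
sudo apt-get install --no-install-recommends nginx

# Add a database only if one is actually required.
sudo apt-get install postgresql

# Unused capability can later be removed just as easily.
sudo apt-get remove --purge nginx
sudo apt-get autoremove
```

The point of the sketch is the workflow, not the particular packages: the system grows (and shrinks) one piece at a time, so capability that is never needed is simply never installed.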
That being said, it is probably true that no amount of add-on capability will transform a subcompact into a roadster, and if a roadster is truly what is needed then a commercial product might perhaps be the better solution.
4. Ease of installation and use
The Linux system has been described as consisting of the Linux kernel supported by a patchwork of add-on functionality. Detractors claim that this “patchwork functionality” makes Linux more difficult to install and administer. The implication here is that a bundled commercial system is almost plug-and-play, whereas assembling a Linux-based system from a patchwork of available downloads and then integrating these accordingly will be much more work.
There are a couple of half-truths to be recognized here. In the first place, even when bundled, a commercial operating system is far from plug-and-play to install: considerable IT expertise is required to configure it properly.
Secondly, Linux can be downloaded in preconfigured sets that enable relatively simple installation and use. Moreover, Linux is open source, which means that the sets can be rearranged as desired or needed. Although IT experience is still required, Linux is designed to make this a practical and relatively simple process. And if a simple “mix and match” rearrangement does not suffice, it is also possible to modify the source code directly.
It is true that, as of this writing, Linux lacks a built-in GUI comparable to that of Windows; consequently, novice users may find Linux more difficult to use. Although many GUIs are now available that run in a Linux environment, there is as yet no consensus in the Linux community on a standard. It is hoped and expected that this situation will soon change.
5. Support

Another objection to freeware products in general concerns the issue of support. After downloading the Linux source code from the Internet, where can one go for additional assistance, should it be required?
Support for Linux is readily available from a large number of sources. As one might expect of an open source project, Linux is heavily documented. Accordingly, there are many “How-To” books that will help the new Linux user get started. More general works also exist, ranging from the “For Dummies” and “Idiot’s Guide To” variety to very advanced texts written for the Linux administrator, as a visit to the local bookstore will quickly confirm.
In addition, specific questions can be asked and/or more general issues discussed in the numerous, thriving Linux newsgroups and forums. Users may also wish to subscribe to one of the many magazines devoted to Linux.
6. Security

Because Linux is open source, there is a perhaps natural concern over the issue of system security. Doesn’t “open source” necessarily lead to an increased risk of system vulnerability? This question has been hotly debated in the IT community. It appears that, in terms of security, open source is both a weakness and a strength.
Obviously, exposing the inner workings of a system to a hacker grants him the opportunity to find a weakness. On the other hand, since the system is exposed to everyone, not just the hacker, if an exploitable weakness does exist it is very likely that someone else will see it as well and will call attention to it before any harm can be done.
Contrast this situation with that of a hacker who has gained access to a bundled, locked-down application whose source code has been seen by only relatively few individuals. It has been claimed that therein lies the greater risk, since, arguably, a system vulnerability would be more likely to escape the notice of a few individuals than it would the entire Linux community.
7. Conclusion

At heart, open source Linux is a public service project. Its purpose is the development of a simple, effective, and highly robust operating system that will be available free of charge to all. It will provide an open platform for innovation, and it will never be subject to vendor lock-in.
These are certainly worthy goals. There are even some who say that such projects may one day bring the world together. As to this, we can only hope.