Whatever happened to... Richard Stallman?

Second-year reading week assignment by Sam Morris

Have you heard of Richard Stallman? When he decided to overthrow corrupt American capitalism in the computer industry, he quit his job... and continued coding.

—J. Random Hacker, The Code

In 1983, Richard Stallman, dissatisfied with the increasing commercialism present in the computing industry of the time, announced his vision for the future of software development and distribution to the world. Stallman was one of the last few “Hackers”. Of course, that word is not meant to conjure up shady images of sci-fi-esque villains who break into corporate networks in order to take valuable data hostage; instead I use the term in its original meaning.

The Jargon File lists this definition under the “Hacker Ethic”:

The belief that information-sharing is a powerful positive good, and that it is an ethical duty of hackers to share their expertise by writing free software and facilitating access to information and to computing resources wherever possible.

Stallman was a strong believer in the Hacker Ethic as it is defined above, to the extent that many in the rest of the community considered him a radical. He believes it is immoral for someone to claim ownership of software, or to prevent end users from modifying or distributing it. The rest of the world, however, did not share his outspoken opinions.

Developers were being enticed away from university laboratories by offers of fame and fortune from corporations. With the developers went any chance of their work being released so it could be used for the benefit of all, because the corporations enslaved their employees with stringent nondisclosure agreements. Furthermore, government-funded research was required by law to be released into the public domain—the problem being that what corporations took from the commons, they did not return: their modifications and improvements remained theirs and theirs alone. Eventually the source code for the Unix operating system, once freely available to anyone curious enough to understand it, became a jealously guarded secret.

GNU's not Unix!

The aim of Stallman's GNU Project (the name itself being one of those frightfully amusing recursively-defined acronyms that computer scientists enjoy so much) was to provide a Free operating system and the programs to go with it to anyone who wanted them. While there were other programs at the time that were freely available, the difference between them and GNU was both as small as one capital letter, and as great as the difference between capitalism and socialism.

GNU programs are released under the GNU General Public License. The GPL is unlike traditional software EULAs, which try to restrict your use of a given software package by, say, removing your ability to reverse engineer the software to find out how it works. Instead, the GPL grants the user additional rights, such as that of being able to freely modify and distribute the program, source code included. The “catch” is that if you wish to distribute your modifications, you must make them available under the terms of the GPL too. This proviso ensures that modifications and improvements are guaranteed to be returned to the commons, instead of solely profiting their creator.

Of course, no one is forced to accept the GPL. To this day, software licenses stand untested in any court of law, and there is much debate about whether a license can be considered a binding contract. If a user does not agree to the terms of the GPL then their rights revert to those provided by copyright law, which is legally well-tested and even enshrined in the US Constitution. In this way, no one is denied the ability to use GNU software, but the freeloaders who would otherwise take the code and return nothing are prevented from doing so.

It is important to note here that GPL software can be sold: originally, the Free Software Foundation (created by Stallman in 1985) funded itself by charging for the distribution of its products. The FSF also organised seminars and even sold merchandise to make money. In its earliest days, however, it relied on donations, which it received from companies such as Hewlett-Packard, who realised the benefits of Free software for all.

The GPL worked; in the time it would take the creators of a proprietary, commercial text editor to fix a bug and release a new version of their product, ten EMACS users could fix a hundred bugs, submitting “patches” back to the project's maintainer for inclusion in the next version. Eventually most free software projects would be maintained by a “benevolent dictator”, who would decide which patches made it into the main source tree, and the direction in which the project would progress. Of course, users who disagreed with the maintainer were able to download their own copies of the project's files and do as they wished with them; this came to be known as “forking”, after the way processes in a Unix system are able to clone themselves with the fork() system call.

Time went on, and perhaps ironically as the Cold War drew to a close and Eastern Bloc countries abandoned communism as a dismal failure, Stallman's ideals spread throughout the computer science community. The GPL allowed many people to contribute to the various GNU projects, which by this stage counted amongst their number a compiler, GCC; an editor, EMACS; a debugger, GDB; a shell, bash; not to mention numerous other libraries and utilities, such as the GNU C Library and the fileutils commands ls and rm. In addition, many people with no direct relation to the FSF adopted the GPL for use in their own projects.

GNU software began to be used by more and more people. The low, low asking price of “free!” was attractive to universities and public-spirited corporations alike. The adoption of GCC for use in NeXT's NeXTStep operating system would prove to be of particular importance later on.

The GNU project still lacked one major component, however: an operating system kernel to call its own. The GNU system was a nomad, its programs running on top of proprietary (albeit often freely available) operating systems.

In 1990 the FSF started a new project: the GNU HURD kernel. The HURD (recursive acronyms now being so passé, this time the name sprang from a pair of mutually recursive acronyms) was to be a completely modern “microkernel” that shared very little with its counterparts. Unfortunately, progress was slow, for various reasons including a fixation on making the kernel absolutely perfect.

Enter the penguin

Things continued happily as they were until the end of 1991, when a Finnish second-year undergraduate named Linus Torvalds posted the following message to the newsgroup comp.minix:

Hello everybody out there using minix -

I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I'd like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).


Linux, as the operating system came to be known, was an implementation of a Unix-like kernel written from scratch, made available free of charge. As time went on, progress on the HURD slowed, and eventually all but stopped. In 1992, Linux was placed under the GPL and combined with the GNU system's programs and utilities, and GNU/Linux was born.

GNU/Linux, as Stallman insisted it be called, was the combination of the GNU project programs and utilities and the Linux kernel. It quickly evolved into an efficient, powerful system suitable for use as a low-end server. It was ported from the Intel IA32 architecture to Alpha, PowerPC and even palmtops, to name a few. Commercial vendors such as Red Hat stepped in and provided “distributions” of GNU/Linux to companies, charging for technical support, and used the money they made to employ programmers to make further improvements to the GNU/Linux system. The volunteer-run Debian project in particular focused on rock-solid stability, as well as packaging thousands of GNU/Linux programs up into sections based on the license under which they were available—believing that users should have the right to choose to run only GPL software if they wanted to.

GNU/Linux continued to diversify; soon it could be found performing functions on both the lowliest cable modem router, and the biggest, fastest parallel-processing machines around.

Meanwhile, computers were taking off in a big way outside of companies, computer science departments and the hobbyist's bedroom. Apple Computer released the Apple Macintosh in 1984 and continued to create user-friendly computers that took the desktop publishing, digital imaging and multimedia creation industries by storm. Their high price put them out of range of the average consumer, however. Instead, another company moved in with a product called Windows. While it wasn't pretty, secure, reliable or stable, Windows was easier to use than Linux, and was much cheaper than the Mac OS; its creator's predatory, and often illegal, business practices didn't harm its market share either. By 1995 it was firmly entrenched as the most popular computer platform of the decade.

“That” company

In many ways, Microsoft is the antithesis of the Free Software Foundation; where the FSF believes in the user's right to modify the programs they run for their own use, Microsoft practically marks Windows with a “No user-serviceable parts inside” notice; while the FSF gives its software away, Microsoft is not content with charging its customers to use its products: it wants them to pay annual subscription fees for the privilege. The FSF adopts the Unix attitude of using the right tool for the right job, while Microsoft wants people to use My Tool 98®, and is willing to sabotage competitors' products and lock users into its proprietary data formats to make them do so.

Microsoft saw the FSF as a very dangerous competitor. By the late 1990s, Linux had the capability to start displacing Windows on the desktop. Microsoft fought back hard with what could only be described as a smear campaign that (falsely) denounced the GPL as a “viral” license; it claimed that a company could not use Free software without being forced to open-source its own products. Nevertheless, Windows NT did not quite achieve the market penetration against Unix and Linux systems that Microsoft desired.

Today, the company is designing a new computing platform, codenamed Palladium, that gives them the ability to exclude competing operating systems including Linux and FreeBSD from even running on PC hardware. Other operating systems will be unable to implement Palladium themselves, because Microsoft will own—and refuse to license—patents on the technologies concerned.


Throughout most of the 1990s Apple acted with indifference to the world of Free Software. However, in 1997 they purchased NeXT, in order to use its NeXTStep OS to build version 10 of the Mac OS. Mac OS X, as it is called, is a ground-up rewrite of the Mac OS. The Mach kernel that OS X uses is available under the Apple Public Source License (APSL), and the barebones system distribution, Darwin, has been ported to the IA32 PC. The tools that come with Darwin are derived from FreeBSD 4.4, and some of the applications are from the FSF, such as the compiler, GCC 3.1.

Although in recent years Apple have remade themselves, Stallman still says that their approach is not ideal. He believes that their adoption of an open-source development model for Darwin and other OS components betrays the fact that they view Free software purely as a way to get quicker, cheaper bug fixes. Nevertheless, if anyone can put Unix on the average user's desktop computer, it will be Apple.


Today, the FSF stands stronger than it ever has before, with the support of thousands of companies, tens of thousands of developers and hundreds of thousands of end users. Development of the HURD has recently picked up again, and Debian plan to offer a GNU/HURD operating system, alongside their existing GNU/Linux and GNU/FreeBSD offerings. Projects such as KDE and Gnome are creating easy-to-use, powerful desktop environments, while the Linux kernel itself is about to end another phase of development: work on the stable v2.6 series is about to begin. Companies are installing GNU/Linux desktops as an alternative to the planned obsolescence and enforced upgrades that are a way of life for Microsoft's customers. Developments in parallel computing mean that GNU/Linux can now be found in the most powerful computers on the planet.

Despite all this, Free Software's most important asset is the vibrant and thriving community of developers and users that has sprung up around it in the last twenty years.

The next few years will see the industry shift in one of three directions: we may go down Microsoft's path of subscription-based software, where Slavery is Freedom, and where users may not alter their software, only pay for it; we may end up with Apple's perhaps questionably motivated but effective combination of proprietary hardware, partially open, partially proprietary software and open standards; or totally Free operating systems such as GNU may start making serious inroads into the desktop, assisted by companies like Red Hat and projects like Debian. No matter what happens, Free software, and the ideals behind it, are here to stay.