The hacker approach: the development of free licenses
"Information wants to be free"
To understand free culture and free licenses, it is often useful to go back in history and look at the pioneers of early computing, their views, and the development of the hacker ethic (the major factor behind the formation of free culture) and of 'hackerdom', the hacker culture. One of the best insights into this lost world is Hackers: Heroes of the Computer Revolution by Steven Levy.
A side note: while the term 'hacker' has regrettably come to be viewed as controversial by the general public in recent times, its original roots are quite clear. The Jargon File, a major source of the field's historical terminology (whose printed form is known as the "New Hacker's Dictionary"), perhaps has the most exhaustive definition. In short, a hacker is (mostly but not necessarily) a computer professional with an innovative mindset and a passion for exploration. The File also gives a good all-round definition of the hacker ethic:
"The belief that information-sharing is a powerful positive good, and that it is an ethical duty of hackers to share their expertise by writing open-source code and facilitating access to information and to computing resources wherever possible."
As described by Levy, it all began at the Massachusetts Institute of Technology back in the 1950s, when "computer science" sounded much like "rocket science". Few people had ever seen a computer, and even distinguished math professors were rather skeptical about 'computing machines' (Levy recalls an episode where an MIT student failed his math exam for solving the task with a computer - the professor was positively sure that a machine could not solve the problem correctly). Even at MIT, the cradle of modern computing, there was no computer science as a separate discipline, and the first professors were employed at the Department of Electrical Engineering.
The MIT Tech Model Railroad Club (TMRC) had been founded in 1946, and by the end of the fifties a strong subculture had formed around it. Out of its Signals & Power Subcommittee (the people who dealt with electricity and wires rather than modelling tasks) came the first hackers. Interestingly, there was a sort of hacker culture even before there was a computer to hack on - the first courses on computer science only appeared in 1959, together with the TX-0, considered to be the first hacker machine. In 1961 MIT obtained the PDP-1 (later moving on to the PDP-6 and PDP-10), which became the central device of the forming hacker culture, later formalised into Project MAC and the famous MIT Artificial Intelligence Laboratory.
This world produced the first generation of hackers - Richard Greenblatt, Bill Gosper, Peter Samson, Richard Stallman and other 'angry young men' who "went where no man has gone before" (after all, Star Trek is a long-time classic of hacker culture). And this world had no place for business and intellectual property, for several reasons:
- computing was for a select few (recall the famous remark that "the world needs perhaps four or five computers"); there was no critical mass for a market to emerge;
- most projects had state funding due to their more or less military undertones - the generals wrote the rules and the scientists had to obey them. Finances were handled by university bookkeepers, so hackers did not have to deal with them. Thanks to skilful management, the bureaucracy was kept separate and the creative minds were given ample space to work;
- resources were scarce - every mainstream PC of today is many times more powerful than the tools of the hackers of old;
- software was almost always unique to a given computer; any transfer to another machine needed major rewrites.
All this created an atmosphere of creative and original intelligence that Richard Stallman has called 'playful cleverness'. Similar units were also created at Stanford (SAIL) and a number of other universities in the US. All of them shared similar ethical views: information has value only when distributed; artificial obstacles to spreading information are evil and must be removed; and, due to the scarcity of resources, work results must be shared in order to prevent duplicate effort.
RMS vs Business
The end of the 70s and the beginning of the 80s brought a gradual shift towards business - the user base (or market) had grown wide enough and software had become much more easily portable. Unix, born in 1969, is widely considered the first portable system - and while it was free for several years due to US anti-monopoly legislation (AT&T was entitled to make money only from telephone and telegraph services, not software!), the 1984 breakup of AT&T removed most of these limits. And in 1981, the IBM PC was born.
An interesting detail: IBM did not 'protect' the details of the PC well enough, and many companies started to build clones. While this at first seemed a major loss (no license fees), the opposite proved true. Thanks to this "attack of the clones", the PC reached the level of universal domination which it has retained up to now. The Apple Macintosh and others (which came to market a bit later) were arguably better designs, but the train had already left.
Likewise, the 'hacker paradise' at MIT came to an end in the early 80s, when its staff was split between two competing commercial enterprises (LMI and Symbolics). The resulting conflict almost emptied the Lab; one of the last to leave was Richard Stallman (dubbed "the last of the true hackers" by Levy - a title that proved inaccurate). He was deeply unhappy with the outcome and, a few years later, decided to start the GNU project – a complete rewrite of the Unix operating system which would be distributed freely, remaining true to the MIT hacker tradition.
While the project has not reached its main goal (or at least has not yet), it produced a number of important utilities and system software as well as the legal backbone of today's free and open-source software: the GNU General Public License (GPL), which took the rights of the user as its starting point (making it totally different from the corporate End User License Agreements exemplified by Microsoft and others). Yet Stallman was viewed by mainstream IT as a curiosity or a hopelessly hippie-minded idealist.
In 1991, inspired by a small Unix variant called Minix, Linus Torvalds (then a student at the University of Helsinki in Finland) started a new operating system project which was soon labelled Linux. Some months after the start of the project, Torvalds changed its license to the GPL, making the system an attractive choice for all the hackers disgruntled with the proprietary, closed systems of the day (especially Microsoft's DOS and Windows, but also Apple's MacOS and various commercial variants of Unix). The system developed as a collaborative effort empowered by the rapidly spreading Internet. The hacker spirit came out of the academic enclaves into which the proprietary model had forced it.
Two major factors have greatly contributed to the rapid development of Linux. First, its birth fell into the period of the explosive growth of the Internet (especially the emergence of the WWW). Second, a number of crucial components had already been developed by the GNU project and came to be used in Linux distributions. The latter fact has created bad blood between the camps of Stallman and Torvalds, most notably in the GNU/Linux naming controversy.
Return of the hackers
Shortly after Linux, the three major free flavours of BSD Unix - FreeBSD, OpenBSD and NetBSD - were born. In 1994, Red Hat – the first large-scale commercial venture using the open-source model – was founded. 1995 added a set of server technologies, collectively known as LAMP (Linux, Apache, MySQL, PHP/Perl/Python), which made setting up an Internet server several times less expensive. In 1996-97, the new system got a pair of advanced graphical user interfaces in GNOME and KDE. Many free software projects - notably OpenOffice.org, Firefox, GIMP and others - have found their place on other platforms as well. And Apple built its new-generation operating system, Mac OS X, on Darwin, a free variant of Unix.
The new millennium brought along many interesting developments – not only in software (OpenOffice.org, Ubuntu Linux) but even more in the extension of the hacker model into other fields. A good example is Wikipedia – a community-built, freely editable encyclopedia. There are music companies based on open models (Magnatune) as well as publishers (Lulu.com). In 2001 MIT, the original home of the hackers, launched the OpenCourseWare initiative to provide free access to learning materials. The hacker ethic keeps going strong in the 21st century.
The GNU General Public License
Developed by Richard Stallman and his colleagues in 1989 and refined in its second version two years later, the GPL forms the mainstay of free software, being used by approximately 70% of all free software projects. Version 3 will probably be officially announced later this year.
The GPL has authorship as its focus, stating the author's right to his/her work as the very first thing (the criticisms of it being anti-author are therefore unfounded). After this, however, the GPL takes a radically different path compared to proprietary licenses. Instead of dictating exact conditions of use, it proclaims the rights of the user:
- the right to use, copy and distribute the work for any purpose (including business)
- the right to study the work - which, for software, demands access to the source code
- the right to modify the work and develop new works based on it
- the right to distribute the derived works under the same conditions
The last right establishes the principle of copyleft. While initially coined by Stallman as a joke on intellectual property ("copyright, all rights reserved" became "copyleft, all rights reversed"), it has become a main idea behind free culture. And while different from the traditional approach, it is in fact based on the same idea of author's rights - therefore, violating the GPL should be treated no differently than typical 'piracy' cases.
In practice, it means that GPL-ed software can be freely copied, changed and redistributed provided that
- the same rights are extended to derivatives - therefore, it is not possible to use GPL-ed software to develop a proprietary product.
- the source code must be available - usually it is distributed with the work, but the license also permits on-demand distribution. In that case, the software must include clear instructions on how to obtain the source (e.g. by e-mail or web).
Some more remarks:
- if a GPL-ed product is changed only for 'in-house' use, publishing the changes is not mandatory. However, as soon as the derivative is distributed outside, the source must be made available.
- the GPL directly forbids any discrimination against fields of use - this also means that all commercial use is fully legal. Both original and derived software can be sold (as done by e.g. Red Hat). However, the most common business model for free software is probably "free product, commercial know-how" (support, consulting, warranty).
- the GPL has a clause stating that the work is distributed 'as is', i.e. with no warranty. Many superficial readers have therefore concluded that 'free software has no warranty' and dismissed it as unsuitable for 'serious business'. In reality, the market for such services is completely open (as opposed to e.g. MS Windows, where only certified partners can offer them) - anyone is fully entitled to offer services on free software, including warranty on his/her own terms. Thanks to this open competition, support contracts tend to be more affordable ("you want that much for supporting my server? I will go up the street to David's shop!").
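In practice, placing a program under the GPL mostly amounts to adding a copyright line and a short permission notice to each source file and shipping a copy of the license text alongside the code (the license's own appendix, "How to Apply These Terms", gives the recommended wording). A minimal sketch of what such a file might look like - the author name, year and program are placeholders, and the notice below is an abridged paraphrase rather than the exact official text:

```python
# hello.py - a trivial example program
#
# Copyright (C) 2007 Jane Hacker (placeholder name and year)
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

def greet(name):
    """Return a greeting; the code is deliberately trivial -
    the point is the license notice above."""
    return "Hello, %s!" % name

if __name__ == "__main__":
    print(greet("world"))
```

The program itself stays ordinary code; the notice in the header is what grants users the four rights listed above, and the full license text (typically a file named COPYING) travels with the distribution.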
The copyleft nature of the GPL is very unpleasant to Microsoft and other big players, who cannot use free software authors as an unpaid labour force for developing their proprietary applications, but it is a good guarantee of continuity. Therefore, the opinion "wait and you will see - they will soon ask money for Linux too!" is at best ill-informed. The GPL is also very resistant to a widespread but unethical practice named Embrace, Extend and Extinguish (the best example of such an attempt - fortunately quite unsuccessful - is probably the infamous MS-HTML).
GPL and the other free licenses
Richard Stallman and his Free Software Foundation use the GPL as a sort of measuring stick to evaluate other free licenses; their evaluations are published in the FSF's license list. The proponents of the more moderate Open Source movement tend to be a bit more liberal in license evaluation too; their approved license list is maintained by the Open Source Initiative.