What Have We Learned?
Paw Prints: Writings of the maddog
The Ohio Linux Fest runs September 25th to 27th, and one of its major themes is the 40th "birthday" of Unix. Two of its featured speakers, Dr. Peter Salus and Dr. M. Douglas McIlroy, will be addressing this topic. Why is forty years of Unix important?
During a chat session one night, one of the chat members said that they did not think the topic of the 40th year of Unix was relevant to anyone. I want to take this time to respectfully disagree.
We tend to take for granted various things around us. Things that surround us at birth we think of as "natural". We tend not to question running water, sewer systems, electric companies, railroad tracks or other things that have "always" been here. Yet at one time they were new and innovative technologies.
For example, at one time the railroad companies of the United States did not use rails that were a standard distance apart. Therefore each time a train came to the terminus of one railroad, they had to unload the cargo, carry it to the terminus of the other railroad, reload it onto the cars of the other railroad, and then travel on that set of tracks. An "innovation" of standard gauge rails allowed the cars of one railroad to travel on the tracks of another railroad.
Forty years ago many computers still did not run an operating system of any type. Many systems ran one program at a time, and the operators would have to load a stack of cards or a magnetic tape and "release" the program to be run. A plethora of different operating systems came from different manufacturers, with different programming interfaces.
Digital Equipment Corporation had over ten different operating systems for their line of PDP-11 computers, and there were another nineteen or so produced by other companies. Each operating system had different characteristics for different job loads: RT-11 was a single-user "real-time" system; RSX-11 was real-time but could also be multi-user, and evolved into the IAS timesharing system; RSTS/E was also a timesharing system.
Digital was not alone in this sea of operating systems. Many companies had several operating system offerings on their equipment.
In 1969 the development of Unix began in Bell Laboratories. Driven by some fairly basic guidelines (and perhaps somewhat guided by the available equipment), a relatively simple and clean kernel was developed. Above that was layered a series of libraries, relatively portable across various hardware architectures and crowned with a command interpreter called a "shell".
The Unix command line that we take for granted today was not simple to conceive. The concept of small programs, each doing one thing very well with a simple set of options, and using the output of one command to feed the input of another was something that needed to be explained and modeled; so much so that Doug McIlroy, the person who conceived of the idea, had to write the first few Unix commands to illustrate what became known as pipes and filters.
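The pipes-and-filters idea can be sketched in modern terms: each filter consumes a stream of lines, transforms it, and passes the result on, so a pipeline is just function composition over streams. Here is a minimal Python sketch of that model; the function names (`cat`, `grep`, `wc_l`) are chosen to echo the familiar Unix commands and are purely illustrative, not from any Unix source.

```python
# A toy model of Unix pipes and filters: each "command" is a generator
# that consumes a stream of lines and yields a new stream, so composing
# them mimics a shell pipeline such as:  cat file | grep pattern | wc -l

def cat(lines):
    """Source: emit each line unchanged (in the spirit of 'cat')."""
    for line in lines:
        yield line

def grep(pattern, lines):
    """Filter: keep only lines containing the pattern (like 'grep')."""
    for line in lines:
        if pattern in line:
            yield line

def wc_l(lines):
    """Sink: count the lines that reach it (like 'wc -l')."""
    return sum(1 for _ in lines)

if __name__ == "__main__":
    text = ["Unix is forty", "pipes and filters", "small is beautiful"]
    # Equivalent in spirit to:  cat text | grep "is" | wc -l
    print(wc_l(grep("is", cat(text))))  # two lines contain "is"
```

Because generators are lazy, each line flows through the whole chain one at a time, just as bytes flow through a real Unix pipe without any stage needing the entire input in memory.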
The need for multiple programs in the command line drove the need for multi-tasking and lightweight processes from the beginning. The fact that Unix systems were timesharing systems also drove the need for stability and security.
The movement of Unix to multiple hardware architectures drove the separation of the kernel into device-independent and device-dependent parts, which in turn allowed for more stability as the device-independent parts were re-used time and again on different hardware architectures. It also drove the need for a very simple, efficient and flexible language known as "C", which is itself a success story that has survived generations.
Forty years later we have an operating system that spans everything from very small embedded systems to very large mainframe computers, from tightly coupled SMP systems with monolithic memories to NUMA memories and loosely coupled systems.
While some may argue that the Linux kernel is not "simple" by the standards of forty years ago, a lot of the kernel interfaces remain the same, or have minimal changes.
And what we have learned is not limited to the technology. Close to forty years ago Ken Thompson decided to take his "fun project" and start sharing it with universities. People started researching and collaborating, exchanging source code and immediately making those changes useful to their "customers". Ken (coming from a research background) knew that you do not have to write all the code yourself to have things move forward.
One other thing we learned is that technology does not always win in the marketplace.
While there may have been other operating systems of the time that had more "functionality" or were more completely "architected", the ability to (relatively) easily license the Unix source code and to port that to your hardware is what allowed companies like Sun Microsystems to become (in a few short years) a threat to companies like Digital Equipment Corporation.
Today the same strategies are in place, on a larger scale than ever: eighteen years ago a young college student decided to do a project "just for fun" and collaborated with people to create the Linux kernel.
Linux threatens proprietary operating systems because of the "ease" of licensing it to various hardware platforms. In the beginning it might be argued that there were better operating systems, but the ease of licensing the software drove adoption, which drove cooperation, which drove technology, and so the same "Unix" collaboration that moved slowly with tape and serial line speeds now moves with Internet speed.
Android is a threat to Apple's iPhone, not only because it is good software, but because it is easily (and freely) licensable to many hardware manufacturers' phones, giving a larger market and (eventually) better access to applications.
At this year's Ohio Linux Fest, Doug McIlroy is giving a talk on why "small and simple" is still appropriate in the days of multi-gigabyte main memories on the desktop and multi-gigahertz processors in the system. Due to a prior commitment I cannot be there to hear him, and while I can guess at a lot of what he might say along these lines, I will miss the chance to hear a truly great mind talk about issues still relevant today (forty years later) in a way that most people will understand.
In 1993 a small group of Unix programmers met at an event at Disneyland. We started talking about why we liked to program on Unix, and in 1994 this became a book called "The Unix Philosophy" by Mike Gancarz (one of the people there that night). It is these philosophies that are relevant, and why celebrating forty years of Unix has meaning for all of us.
Finally, we need to keep Free and Open Source Software in front of everyone, all the time, and to celebrate its milestones. That is why Software Freedom Day (held Saturday, September 19th) is important, and why we need to keep talking about Free Software. Certainly the advocates of closed-source software are untiring, and have great resources to push their agenda. The unique opportunity this year to celebrate forty years of a truly great operating system, on which so much of our own software is modeled, is something to share with non-users.