Looking back on ten years of GNU/Linux
Off the Beat: Bruce Byfield's Blog
Last week, I suddenly realized that I had been using GNU/Linux for ten years. Both the operating system and I have seen some changes since then -- largely for the better, but one or two for the worse.
The exact date was July 5, 1999. That was the day I started work at Stormix Technologies, a pie-in-the-sky dot-com company that had so little chance of ever being profitable that I sometimes wonder if it was intended as a tax write-off. I was a marketing and communications consultant, and the first non-programmer hired by the company. I had tried GNU/Linux once or twice, but, in those far-off days, the Live CD was still a couple of years in the future. My main qualification was that, as a refugee from the collapse of OS/2, I had no fondness for Windows.
Looking back, I have to smile when I think of pundits and users who claim that GNU/Linux isn't ready for the desktop. They should have tried it ten years ago.
Back then, you couldn't take for granted that all the hardware at your workstation would work. If anything, the chances were that it wouldn't. To make matters worse, online resources to research what you needed beforehand were almost non-existent. If you were a newbie like me, you asked around to find what worked, and hoped for the best when you installed. If you wanted to dual-boot, you needed to use Partition Magic beforehand, although a young Australian student named Andrew Clausen was about to change that by releasing GNU Parted.
Back then, KDE was considered the best desktop for beginners (and GNOME was considered experimental and unfinished). Both were basic and blocky-looking. The more adventuresome graphical interface users might look at a window manager, and Enlightenment had a cult following among users.
As for basic desktop tools, forget it. The closest thing to an office suite was Applixware. KOffice was still struggling towards its first general release, and Abiword was a glorified text editor. When Sun Microsystems announced that it was making StarOffice free for download, people were so desperate that they ignored the non-free license (and old-timers complained about the introduction of bloatware to the system). In the same spirit, I was so desperate for software in which to write the Stormix manual that, when Adobe released a short-lived beta of FrameMaker about six months after I joined the company, I leapt at the chance to use it. When people suggested LaTeX, I seriously considered it despite the learning curve involved. At first, in the rush to get a product out the door, I relied on Windows products (and felt like a hypocrite, believe me).
The same lack of choice also existed for online tools. If you weren't interested in command line tools, the only choice was the Netscape suite, which was showing its age. Mozilla was gearing up, but it would be at least fifteen months before anyone considered using it for production.
All in all, so many basic applications were missing that, when I became product manager, it made sense to ship the first box set of the Stormix distribution with demos of Win4Lin and VMware. Our customers, we realized, were going to be pioneers, and were going to need some temporary assistance.
The truth is, in 1999, developers had barely started thinking of GNU/Linux as a desktop operating system -- and many preferred not to think of it as one at all. Shortly before, Caldera had surprised everyone by introducing the first graphical installer. The other major players were Red Hat, SuSE, and TurboLinux, and everyone was talking about the changes that desktop-oriented distributions like Stormix and Corel were planning.
What else? At Debian, a serious discussion was under way about whether or not to include KDE, because its Qt toolkit didn't have a free license at the time. The first free software-related IPOs were taking place, and the community was divided between those who saw the record first-day trading of companies like Red Hat and VA Linux as a major milestone in recognition and those who saw it as a sellout. The split between open source and free software was a recent memory, and, in many people's minds, still a raw one. When Richard Stallman arrived in town to speak, the local users' group had a ferocious debate about whether GNU/Linux should appear in its name (okay, some things don't change).
All this was confusing, overwhelming, complex -- and very intoxicating. At times, I despaired of learning all I needed to know, and of charting the narrow path between producing a commercial product, keeping the good will of the community, and trying to introduce GNU/Linux to a large audience.
But I never doubted that the atmosphere was exciting, and far more worthwhile than anything else I could be doing. Between the fast pace of change in the free software community and the sense that I was part of a pioneering movement, I never doubted that I had become part of something much larger than me.
Slowly, as I gained understanding, I found my perceptions changing. Even when running OS/2, I had accepted that the operating system would be customizable only in a very circumscribed way to a casual user like me. But, running GNU/Linux, I soon learned to take for granted that I should have the ability to edit configuration files. I learned that the command line, instead of being a mystery, could be a source of empowerment. I got used to assuming that, if I needed software, I could download it in a few minutes, and not have to run to the store and pay for it. Slowly, too, I moved from being an open source pragmatist to a free software idealist who would prefer doing without to using proprietary software. You might say that my whole idea of my relationship to a computer was changed.
On a personal level, too, I found myself spoiled by the excitement of the free software world. When I left Stormix (one of the first rats to desert a sinking ship), proprietary companies no longer interested me, and seemed faintly immoral as well. When I could, I worked for companies like Progeny Linux Systems and Maximum Linux. When I couldn't, I suffered, and plotted to move on -- surviving not very happily until Robin Miller at Newsforge and Linux.com agreed to take a chance on me, and I discovered that I could make a living writing about free and open source software.
Things have improved out of all recognition since the first day I booted GNU/Linux. Almost all the limitations I discovered ten years ago are gone, and thousands of people are using the operating system today who never would have considered it ten years ago. But that maturity and acceptance, I can't help thinking, have come with a lessening of passion and idealism. While some people a decade ago went to ridiculous lengths at times to prove their ideological purity, today, people seem too willing to compromise, using proprietary tools because they are easy to find rather than supporting the production of free alternatives.
Still, the cause remains the same, and today's disputes are not that different from those of a decade ago. Only the ground has shifted. What is being discussed today is not the upstart of ten years ago, but an operating system in first maturity and a movement that, despite some serious issues, has managed to keep much of its integrity and idealism.
Looking back, I sometimes think that free software and I have found our stride together. I don't know what the next decade holds for either of us, but I can't wait to find out.