300 issues of Linux Magazine
Doghouse – 300
Looking to the future of technology and expertise.
Two months ago, the staff at Linux New Media alerted me that this issue would be the 300th issue of Linux Magazine, a milestone in Linux journalism, and that my column for it was due in August, the month of my 75th birthday.
Of course, I am tempted to wail about the history of data processing (as we used to call it) as it changed from "computer black magic" to "computer science," or about how I never really got into computer games because writing every program and solving every problem was a game for me. However, I will put aside those temptations and instead write about the future as I see it.
I started programming on systems that were physically large and expensive yet logically small and slow relative to what we have today. Many systems had no operating system at all: Your code ran directly on the hardware, and the device drivers were linked into your program, not into an operating system. Networking was non-existent, and computer security meant locking the door at night.
Over the next 20 years, computers became physically smaller, faster, and cheaper, while growing logically larger and more complex, with multiple CPUs and cores. Operating systems became more capable and feature-rich.
Graphics processing units (GPUs) were also being developed, first for graphics workstations and then for personal computers.
Client-server computing arrived with local networking, followed by the evolution of clustering for high availability and scalability, as well as high-performance computing, which was helped considerably by the advent of Beowulf-style supercomputers.
Cloud computing came about (with many "old timers" grumbling that it was just client-server computing renamed), promising inexpensive, highly expandable computation and storage.
Of course, none of this development was completely linear. Many companies and groups developed many aspects of computer science and engineering in parallel, mixing and matching these technologies. New computing devices (tablets, cell phones, and "wearable" mixed-reality headsets) also changed the computing scene.
Now we are on the cusp of several new technologies, as well as a renewed emphasis on some technologies that have been around for a while.
The first of these is the RISC-V architecture: a clean Reduced Instruction Set Computer (RISC) design with a base set of instructions so small that it is reasonable to implement them in an emulator or an FPGA. With no licenses to pay and no huge sums owed for the right to implement the architecture, RISC-V shows much promise for new processor technologies.
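To illustrate just how small that base instruction set is, here is a minimal sketch (my own illustration, not any official reference implementation) of an RV32I interpreter in Python that handles only two instructions, ADDI and ADD, yet executes real, correctly encoded RISC-V machine code:

```python
# Toy RV32I interpreter handling only ADDI and ADD, to show how little
# machinery the base ISA requires. The instruction words below are real
# RV32I encodings; everything else about this sketch is illustrative.

def sext(val, bits):
    # Sign-extend a `bits`-wide immediate field.
    return val - (1 << bits) if val & (1 << (bits - 1)) else val

def step(regs, instr):
    opcode = instr & 0x7F
    rd = (instr >> 7) & 0x1F
    rs1 = (instr >> 15) & 0x1F
    if opcode == 0x13:                  # ADDI rd, rs1, imm (I-type)
        imm = sext(instr >> 20, 12)
        regs[rd] = (regs[rs1] + imm) & 0xFFFFFFFF
    elif opcode == 0x33:                # ADD rd, rs1, rs2 (R-type)
        rs2 = (instr >> 20) & 0x1F
        regs[rd] = (regs[rs1] + regs[rs2]) & 0xFFFFFFFF
    regs[0] = 0                         # x0 is hard-wired to zero

regs = [0] * 32
step(regs, 0x00500093)   # addi x1, x0, 5
step(regs, 0x00A00113)   # addi x2, x0, 10
step(regs, 0x002081B3)   # add  x3, x1, x2
print(regs[3])           # 15
```

A real implementation would add a program counter, memory, and the rest of the roughly 40 RV32I base instructions, but the decode-and-dispatch structure stays this simple, which is why hobbyist emulators and FPGA cores are so common.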
Security gets more concerning day by day. Coming from a time when operating systems didn't even support passwords, I find it somewhat breathtaking that the Internet (and remote hacking) now makes two-factor (or even multi-factor) authentication necessary, requiring ever more study of and work on security.
Along with security come changes in encryption and its cousin, authentication. From a time when networking was done in clear text across the Internet, we developed encrypted streams and then virtual private networks that use encryption and authentication to protect people's communications. This was helped dramatically by RSA encryption, popularized by Pretty Good Privacy (PGP) and its open source counterpart, the GNU Privacy Guard (GPG). These tools were fairly widely adopted, complete with key-signing ceremonies and key "fingerprints" printed on business cards. Over time, keys grew from 1024 bits to 2048 bits to 4096 bits as computers became powerful enough to break the weaker versions.
Unfortunately, another new technology, quantum computing, promises to break in minutes encryption that would take modern-day conventional supercomputers centuries. This is why entities are racing to create quantum-resistant algorithms and practices to strengthen networking and data storage, with existing data to be decrypted and re-encrypted using the stronger algorithms, which is a big task.
For all the threat it poses, quantum computing will also provide new computing capabilities to solve problems we would not even consider tackling with conventional computing.
Now the skeleton in the closet is what most people call artificial intelligence (AI). I have followed AI since the early days of MIT's CSAIL program. I admit to not spending huge amounts of time investigating AI, but I know enough about it to know that current technologies are far from being "cognizant" and capable of abstract thinking. To me they are very elegant, very complex, self-programming expert systems; I wrote a complex expert "systems administrator" program myself back in the 1990s.
Will even the current version of AI reduce the need for many white-collar jobs? Without a doubt. Will it become HAL from the movie 2001: A Space Odyssey? Not anytime soon, because the human mind is orders of magnitude more complex and magnificent.
In the meantime, continue learning and making yourself more valuable. For years I have told people that an expert is simply someone who knows more about a topic than you do. Be that expert.
Pax Vobiscum