The future of Linux updates
What to Do
If you only have one machine, there isn't a whole lot you can do to reduce the size of upgrades. If you have two or more machines running the same software, however, a number of easy tricks can reduce the number of downloads and the time spent on updates. Most update software grabs updates via HTTP, which means you can use a web proxy to handle requests and cache the data. Of course, you will have to change the proxy's default configuration significantly, allowing for files of 100MB or more and a cache size of several gigabytes. If you set up a transparent proxy server, such as Squid, you won't have to modify the configuration on any of the client systems.
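For Squid, the two settings that matter are the maximum cached object size and the on-disk cache size, both of which default to values far too small for package mirrors. A minimal sketch of the relevant squid.conf lines follows; the sizes and paths are examples, and the `intercept` keyword applies to Squid 3.x (older releases used `transparent`):

```
# Cache objects up to 150MB so large packages (kernels, office suites) are kept
maximum_object_size 150 MB

# 20GB on-disk cache under /var/spool/squid (16 first-level, 256 second-level dirs)
cache_dir ufs /var/spool/squid 20000 16 256

# Listen as a transparent (interception) proxy on the usual port
http_port 3128 intercept
```

With these settings, the first machine to fetch a package populates the cache, and every subsequent machine pulls it from the local network instead of the mirror.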
Another effective strategy is to mount your updates directory from a central server. With an RPM-based system, you must be careful to share only the directory containing the actual packages (i.e., /var/cache/yum/updates/packages/). If you share the /var/cache/yum/updates/ directory, for example, the various systems might get upset about sharing files like filelists.xml.gz.sqlite, because such files are not designed for concurrent access by multiple systems. On my main server, I simply have NFS enabled with an /etc/exports containing the following:
```
/var/cache/yum/base/packages *(rw,no_root_squash)
/var/cache/yum/updates/packages *(rw,no_root_squash)
```
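On each client, these exports can then be mounted over the local yum cache directories. A hypothetical /etc/fstab entry, assuming the server is reachable under the name nfs-server:

```
# Mount the server's package caches over the local yum cache directories
nfs-server:/var/cache/yum/base/packages     /var/cache/yum/base/packages     nfs  defaults  0 0
nfs-server:/var/cache/yum/updates/packages  /var/cache/yum/updates/packages  nfs  defaults  0 0
```

Because only the packages/ subdirectories are shared, each client still keeps its own metadata and SQLite caches locally, avoiding the concurrent-access problem described above.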
Now wait a minute: Anyone can mount these directories and write to them as root?
RPM provides end-to-end security in the form of signed packages, so if you have GPG checks enabled (gpgcheck=1) in yum.conf, you will find out quickly if anyone tampers with a package. The advantage of letting everyone write to this central directory is that if a client starts an update and downloads the packages, the packages are then available for all the other machines, including the server.
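Two yum.conf settings make this scheme work: GPG checking to catch tampering, and package caching so that downloaded packages stay in the shared directory for the other machines. A sketch of the relevant [main] section:

```
[main]
# Verify package signatures before install; a tampered package fails loudly
gpgcheck=1
# Keep downloaded packages in /var/cache/yum so other machines can reuse them
keepcache=1
```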
Finally, you can build slipstreamed installs. Rather than installing the operating system and then applying the updates, you can create custom install media with the updates already included. For RPM/Anaconda-based installs, you can accomplish this with the createrepo command, which generates new metadata files (usually contained in the repodata directory on your install CD or DVD). Simply copy the contents of the install .iso image, copy the new packages onto it (and get rid of the old ones, because you'll probably need the space), run createrepo, and burn a fresh CD or DVD so you have install media with up-to-date packages.
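The steps above can be sketched as a short shell session. The ISO name, package paths, and boot image locations are examples and vary by distribution; on Debian-derived systems the ISO builder is called genisoimage rather than mkisofs:

```shell
# Extract the original install media into a working tree
mkdir /tmp/iso /tmp/slipstream
mount -o loop,ro original-install.iso /tmp/iso
cp -a /tmp/iso/. /tmp/slipstream/
umount /tmp/iso

# Replace outdated packages with the updated ones from the yum cache
rm /tmp/slipstream/Packages/*.rpm
cp /var/cache/yum/updates/packages/*.rpm /tmp/slipstream/Packages/

# Regenerate the repodata directory to match the new package set
createrepo /tmp/slipstream/

# Build a fresh bootable ISO (boot image paths vary by distribution)
mkisofs -o updated-install.iso -R -J \
  -b isolinux/isolinux.bin -c isolinux/boot.cat \
  -no-emul-boot -boot-load-size 4 -boot-info-table \
  /tmp/slipstream/
```

The result installs with current packages from the start, so a freshly installed machine needs few or no updates on first boot.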
- Binary diff/patch utility: http://www.daemonology.net/bsdiff/
- Courgette: http://dev.chromium.org/developers/design-documents/software-updates-courgette
- Ksplice: http://www.ksplice.com/
- Ubuntu Ksplice package: http://packages.ubuntu.com/jaunty/ksplice
- Squid: http://www.squid-cache.org/
- createrepo: http://createrepo.baseurl.org/