Article from Issue 169/2014

Updates on Technologies, Trends, and Tools

More Online


Off the Beat * Bruce Byfield

Lennart Poettering and the Cause of Civility Just what free software needed: another discussion of civility in the community in which half-truth counters half-truth, and nothing gets resolved.

LibreOffice, OpenOffice, and Rumors of Unification According to Charles H. Schulz of The Document Foundation, the unification of LibreOffice and Apache OpenOffice is not currently on anyone's agenda. However, rumors persist that a change is about to happen.

The Thankless Jobs of Community Managers When Aaron Seigo criticized the role of community manager last week, I assume he meant that he prefers not to lead that way.

Productivity Sauce * Dmitri Popov

Transfer Files from the Command Line Need to share a file without leaving the convenience of the terminal? This service has you covered.

Prettify Wikipedia with WikiWand The WikiWand service provides an alternative Wikipedia interface which is both polished and streamlined.

Paw Prints * Jon "maddog" Hall

I Will Never Again Talk about the Benefits of Free Software I have been talking about using "Free Software" for the past 20 years, and many times I have had people ask me, "Why do you use Free Software?"


How the Spanning Tree Protocol Organizes an Ethernet Network * Hannes Kasparick

For your network to run smoothly, you might need to make important decisions about the Spanning Tree protocol.

ADMIN Online

Port Knocking * Chris Binnie

To ensure that the data on your computers remains accessible only to the right people, I look at combining TCP Wrappers and port knocking.

Save and Restore Linux Processes with CRIU * Tim Schürmann

With CRIU, you can freeze and save the current state of a process, then bring it back to life and continue on your way.

Nagios Passive Checks * Alessio Ciregia

Why spam yourself with useless notifications every time a script completes successfully? You can use Nagios to screen the notices and just send the ones that need action.

Live Kernel Update Tools * Martin Loschwitz

Two projects by Red Hat and SUSE – Kpatch and kGraft – attempt to patch the kernel with security updates on the fly. We look at features in these two tools and their suitability for production use.

Leading Open Source Coder Rants about Rants

Red Hat developer Lennart Poettering, who currently works on systemd but has also been involved with projects such as PulseAudio and the Avahi zeroconf implementation, has posted a scathing critique of the state of communication in the open source development community. Poettering denounces the caustic and disrespectful tone used by some open source developers, who are accustomed to trading insults and taunts in blog and newsgroup posts – remarks that could easily be called verbal abuse if delivered in person.

Poettering writes, "the Open Source community is full of assholes, and I probably more than most others am one of their most favourite targets. I get hate mail for hacking on open source. People have started multiple 'petitions' on petition web sites, asking me to stop working (google for it). Recently, people started collecting Bitcoins to hire a hitman for me (this really happened!). Just the other day, some idiot posted a 'song' on YouTube, a creepy work, filled with expletives about me and suggestions of violence. People post websites about boycotting my projects, containing pretty personal attacks."

Poettering blames Linus Torvalds for setting the caustic tone embraced by so many open source developers. Torvalds' insults and denunciations are legendary, and Poettering's blog post lists a few of the most extreme examples. Poettering engages in a bit of his own negative hyperbole when accusing Torvalds of setting a disrespectful tone: "A fish rots from the head down." Still, he admits it isn't all about one guy. "But it is not just Linus, it's a certain group of people around him who use the exact same style, some of which semi-publicly even phantasize about the best ways to … well, kill me."

As Poettering admits, he, and others who survive in the open source community, must develop a thick skin. Still, as he points out, the real problem is, "it's not an efficient way to run a community. If Linux had success, then that certainly happened despite, not because of this behavior."

Internet Giants Launch Collaboration to Improve Open Source

Several prominent open source companies have announced a new collaboration to improve and simplify open source software for corporate environments. The new group, which is called TODO (for Talk Openly, Develop Openly), says the challenges of working with open source are "ensuring high-quality and frequent releases, engaging with developer communities, and using and contributing back to other projects effectively."

Founding members for the organization include Box, Dropbox, Facebook, GitHub, Google, Khan Academy, Square, Stripe, Twitter, and Walmart Labs. The founders all work with open source tools individually and have founded TODO to share experiences, share code, and collaborate on new ideas for deploying open source tools on an enterprise scale. According to the website, the member organizations will be "sharing experiences, developing best practices, and working on common tooling."

Details on the focus and structure of the organization are scarce at this point. Members say more announcements will follow in the coming weeks. In true open source spirit, the TODO website invites other organizations to participate, stating: "We can't do this alone. If you are a company using or sharing open source, we welcome you to join us and help make this happen."

Bash Shellshock Bug Causes Attacks Around the World

The Internet community was shocked with the September 24 announcement of a major security bug affecting the Bash shell. The Shellshock bug (originally called Bashdoor) was first discovered on September 12 and was assigned the CVE identifier CVE-2014-6271, but the news was embargoed until the 24th so major OS vendors could prepare security patches.

The Shellshock bug causes Bash to execute commands stored in specially crafted environment variables. The widespread use of Bash as a command shell for Linux and other Unix-based systems, and the importance of Bash as a tool for managing server systems on the Internet, has caused some security experts to predict that Shellshock will cause far more damage than the much-analyzed Heartbleed bug that dominated the news earlier this year.
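The mechanism is easy to demonstrate: a vulnerable Bash, when importing a function definition from an environment variable, also executes any command appended after the function body. The widely circulated probe below (a harmless sketch; it only echoes strings) shows the idea:

```shell
# Classic CVE-2014-6271 probe: export a function definition in an
# environment variable with a trailing command. A vulnerable bash
# executes the trailing "echo vulnerable" while importing the
# function; a patched bash refuses and only runs the -c command.
probe=$(env x='() { :;}; echo vulnerable' bash -c 'echo completed' 2>/dev/null)

case "$probe" in
  *vulnerable*) echo "bash is VULNERABLE to CVE-2014-6271" ;;
  *)            echo "bash appears patched against CVE-2014-6271" ;;
esac
```

The danger comes from software such as CGI web scripts, which pass attacker-controlled data (HTTP headers, for example) to child processes through environment variables, handing remote users exactly this execution path.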

After the initial announcement, Shellshock continued to fill the tech headlines. Major Linux distros announced patches, and even Apple announced concerns for their Unix-like Mac OS X systems. More difficult to fix are the many network routers and appliances that are running some form of Linux with Bash installed for maintenance and configuration. Oracle and Cisco announced that dozens of appliances could potentially be compromised by Shellshock, and in many cases, the patches would need to be developed and installed individually. (Consult your vendor if you have a device that might be affected.)

By the end of the week, attacks were already appearing in the wild, and attackers deployed scanners to seek out systems that might have the Shellshock vulnerability. On September 29, security vendor Incapsula announced that it had deflected 217,089 exploit attempts on over 4,115 domains with new attacks arriving at a rate of 1,970 per hour.

What to do now? Patch your systems ASAP!

Mozilla Labs Shuts Down

Word has trickled out that Mozilla Labs, the think tank behind innovations such as the Mozilla Popcorn web video and social networking tool, has disbanded. The move apparently happened quietly earlier this year, but the story recently went public with the blog post of former Mozilla Labs developer Ian Bicking. Although the Mozilla Labs site is still live, the last post in the Mozilla Labs blog is dated December 9, 2013. The Labs group has long seemed a bit adrift from the greater Mozilla community. A search of the official Mozilla Project blog reveals that Mozilla Labs hasn't been mentioned in a post since June 1, 2010.

Mozilla has been embroiled in some controversy recently, with some Free Software advocates questioning its independence from donor corporations such as Google. In this case, however, it appears the closing of Mozilla Labs was not related to politics but was based more on technical and structural issues within the Mozilla community. Mozilla Labs was actually a separate organization that operated largely independently of the Mozilla project. This kind of independence, which is common in corporate labs, is intended to spur innovation and free, creative thinking, but it can also cause problems.

According to Bicking's blog post, "Labs groups are often criticized for being too separate from the companies they belong to. In Mozilla this was a problem because we had a hard time getting things done – for instance, if the success of a project depended on changes to Firefox, then it was hard to get those prioritized. By working separately, we also would often use patterns that weren't liked by other people. People in Labs would often come from the perspective of web developers, where much of Mozilla is focused on user agents, and this is a much bigger divide than I initially appreciated. But the technical problems were perhaps a symptom: integration with the rest of Mozilla was viewed as a problem, a late-stage effort, not part of the exploration and experiment itself."

As Bicking points out, Mozilla Labs should not be confused with Mozilla Research, which is still going strong. Mozilla Research is home to projects such as Mozilla Rust and ASM.js. The difference, according to Bicking, is that Research focuses on foundational web technologies and Labs was all about building new products.

The former head of Mozilla Labs, David Ascher, has moved to a role of Vice President for Products at Mozilla. The Mozilla Foundation appears to view this change as restructuring for greater efficiency and productivity, rather than as a refocus of the foundation's priorities.

HP Splits into Two Companies

HP CEO Meg Whitman has confirmed the rumor that the company is planning to split into two different companies. According to the announcement, the server, storage, software, and corporate services division will split into a separate company known as HP Enterprise. The PC and printer division, which will primarily serve the consumer sector, will also fend for itself and retain the name HP Inc. Whitman herself will stay on as CEO of HP Enterprise, and the current PC and printer exec Dion Weisler will become the CEO of HP Inc.

The announcement is especially surprising (or some might say ironic?) considering the row that started when former CEO Leo Apotheker announced plans to spin off the PC and printer business three years ago. At the time, the outrage resulting from the announcement of a split caused a board room coup that eventually brought Whitman to power.

Still, Whitman claims that her complete change in viewpoint results from real considerations. She says the company has "… considerably strengthened our core businesses to the point where we can more aggressively go after the opportunities created by rapidly changing market." What Whitman didn't say is that the PC market has deteriorated significantly since then, and HP knows it needs a bigger piece of the tablet and mobile business to survive in the consumer space.

Another phenomenon coming into the foreground is 3D printing. The company knows it needs a piece of the 3D printing action if it is going to prolong its reign as the king of printers. It is also possible that the board and shareholders like this deal better than the last one: Apotheker was talking about selling the PC and printer division, whereas Whitman has apparently worked out an arrangement in which the company will split like an amoeba, with current shareholders owning shares in both companies and both sides retaining their independence.

Apotheker's reflections on the plan are not known at this time. It would not be surprising if he feels a sense of vindication, having seen early on that the company was getting too big and ungainly to react to the changing market. Part of his problem was that he shocked everybody, and they freaked out; still, politics is part of the job description of any corporate CEO, and in that sense, the present proposal looks a lot better to the world than Apotheker's unexpected fire sale.

New Techniques Predict Malicious Domain Names

Researchers at Palo Alto Networks have devised a means for guessing malicious domain names before they are used in a malware attack. A paper titled "We Know It Before You Do: Predicting Malicious Domains," by Wei Xu, Kyle Sanders, and Yanxin Zhang, highlights a number of factors that play a role in how attackers acquire and deploy new domain names.

The authors point out that attackers don't keep a domain for long. Spam and antivirus blacklists quickly identify a malicious domain, rendering it ineffective. Attackers are therefore constantly registering new names to use temporarily. By the time the blacklists discover a name, the attackers are often ready to let the name go and move on to a new one. The overall effect is a revolving door of names that are acquired, exploited, and discarded. The authors describe a variety of techniques for predicting malicious names before they are used, including:

DGA: Attackers often use Domain Generation Algorithms (DGA) to create new domain names. The presence of DGA domain names in certain contexts indicates a possible malicious site.

Re-use of domain names: Attackers often abandon a name, then wait for it to fall off the blacklists, then re-register it at a later date.

Observing DNS queries: Certain specific patterns of DNS queries indicate the presence of an attacker searching for an available name.

Connections between malicious domains: A domain that is about to be used for an attack is often identifiable by its connections to other malicious domains.
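By way of illustration of the first technique (this is our own toy sketch, not the method from the paper), a crude character-entropy heuristic can often separate DGA-style labels from dictionary-like domain names:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character of the string."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_generated(domain: str, threshold: float = 3.5) -> bool:
    """Flag long, high-entropy second-level labels, which are
    typical of algorithmically generated (DGA) names."""
    label = domain.split(".")[0]
    return len(label) >= 10 and shannon_entropy(label) > threshold

print(looks_generated("x9f2kqo38vz1.com"))  # True: 12 distinct chars, entropy ~3.6
print(looks_generated("wikipedia.org"))     # False: short, dictionary-like label
```

A real predictor combines many such signals (registration timing, DNS query patterns, links to known-bad infrastructure); an entropy check alone misfires on legitimate hash-like hostnames.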

The authors tested their techniques and found that they predicted domains that were soon to be malicious with 83% accuracy. The study offers an illuminating glimpse into the strategies used by attackers to secure and exploit malicious domain names. The authors say they will continue to look for "more connections and evidence" that suggest a name will be used for malicious purposes.

Watson Takes on Science

IBM has announced advances in its Watson Discovery Advisor cloud service, which launched earlier this year to bring the power of Watson to research scientists. Watson is IBM's supercomputer system that became famous for winning Jeopardy. In January, IBM unveiled a US$ 1 billion investment to develop and extend Watson and find practical applications for its impressive capabilities. The Discovery Advisor is part of that effort.

According to IBM, "Building on Watson's ability to understand nuances in natural language, Watson Discovery Advisor can understand the language of science, such as how chemical compounds interact, making it a uniquely powerful tool for researchers in life sciences and other industries."

Scientists in academic or commercial research centers can deploy Watson to analyze and test hypotheses rapidly using data in millions of scientific papers. To win at Jeopardy, Watson read and organized data from thousands of books and other written documents, and programmers successfully taught it to extract useful information to address specific questions. Applying the same principle to science, the Watson team is focused on turning Watson loose on the thousands of academic papers written on a specific subject. More than a million scientific papers are published every year. IBM quotes the National Institutes of Health with the observation that "a typical researcher reads about 23 scientific papers per month, which translates to nearly 300 per year, making it humanly impossible to keep up with the ever-growing body of scientific material available."

Watson, on the other hand, can consume all the available information and look for connections and correlations that might not occur to a human. The press release quotes the example of a study by the Baylor College of Medicine and IBM, in which scientists used Watson technology embedded in the Baylor Knowledge Integration Toolkit to analyze 70,000 scientific papers on a specific protein related to many cancers and identify six proteins that appear promising for further research.

Berners-Lee Says the Web Was Built for Simplicity, not Security

In his keynote address at the IPExpo Europe conference, World Wide Web creator Tim Berners-Lee defended his decision to keep the web simple and not build in additional security features.

According to a report in The Register, Berners-Lee said he wanted to make the World Wide Web easy for developers to use and program for: "The Web might not have taken off if it had been too difficult."

The web began as an effort to merge hypertext technology with DNS-based Internet networking. Security depended on the underlying security of the network – much as it does to this day. The whole notion of a protocol stack is that upper layers rely on services provided by the layers below, and when Berners-Lee built the first web server and client, he had no way of imagining the web would one day need anything more.

The Reg (and other commentators) noted that Berners-Lee's comments contrast with those of Internet pioneer Vint Cerf, who has stated that he wishes security had received more attention during the development of the underlying TCP/IP protocol system.

Berners-Lee did say he supports always-on HTTPS cryptography for web connections, and he said he strongly supports the need for more privacy on the web, stating "The idea that privacy is dead is hopelessly sad."
