Consumer and productivity computing

Off the Beat: Bruce Byfield's Blog

Feb 28, 2013 GMT
Bruce Byfield

I'm not sure when I started, but in the last six months or so I've been making a distinction in my mind between consumer and productivity computing as a means of clarifying my thoughts about desktop interfaces.

This is a distinction that hardly needed to be made in the first twenty years of the personal computer. Each workstation was released with the largest hard drive, the fastest video card, and the most RAM available, and was used for every task that users had. For years, laptops were less powerful, but that had to do with convention and the limits of miniaturization more than anything else; besides, it was accepted that you gave up some power for the convenience of portability. But, mostly, computers were general-purpose machines.

However, in the last decade, that situation started to change. Games, which had always pushed the specs, started being marketed for dedicated consoles. Then, with the rise of smartphones and tablets, significant shares of the computer market came to consist of hardware that by any spec was well behind the latest available for workstations. With this change, the distinction between consumer and productivity computing was suddenly born.

By consumer computing, I mean personal tasks that usually involve the use of third-party services such as ISPs or social sites like Facebook and Twitter -- light tasks that don't require the latest computing power. You don't usually need a large screen for consumer computing, because most of what you enter is relatively small. Unless you choose to buy an external keyboard, you can make do with a touch-screen keypad, or else laboriously tap keys until you get the right character, because most of your messages aren't very long. Nor do you need a massive video card, because while you may send large graphics, you aren't likely to do much serious editing on a phone or tablet.

By contrast, productivity work is the sort you do in an office suite, in a graphics or sound editor, or when you are compiling software. Some productivity work can be done on a phone or tablet, but much of it is considerably more demanding. You need several terabytes of hard drive space, because you may be working with a great many large files. You need several gigabytes of RAM, so that rendering each of those files doesn't take ten minutes and you can open several of them at once. You need a large screen, or multiple ones, so that you can do detail work and save your eyes, because, unlike with consumer computing, you are likely to be staring at the screen for several hours at a time.

As in hardware, so in the interface
Neither consumer nor productivity computing is superior to the other -- they are simply a matter of intent. To some degree, you can even mix them, although being productive on a phone or tablet usually involves buying add-ons like USB keyboards, and playing the consumer on a productivity computer is ridiculous overkill -- if anything, phones and tablets have done millions of users a service by no longer requiring them to buy computers that were vastly overpowered (and over-priced) for their relatively simple purposes.

However, just as the hardware required differs for consumer and productivity computing, so should the interfaces. A user engaged in productivity computing usually prefers to concentrate on whatever they are doing. That means that, for productivity computing, the most efficient interface is one that allows users to start new applications or open new files with a minimum of clicks or screen changes.

By contrast, a user is less concerned with efficiency when engaged in consumer computing. Any addition of text or editing of graphics they do will typically be minimal and quick; they are unlikely to get into a creative fugue simply because they won't be working long enough to do so. More often, they are involved with moving files, downloading them or sharing them with others. As a result, clicks and screen changes are less likely to disturb them the way they might if they were being productive.

The myth of convergence
This distinction, I suspect, helps to explain much of what has happened on the free desktop in the last five years. In focusing on new users, interface designers have tended to assume that they would be primarily consumers, and to ignore the fact that some new users, at least some of the time, also want to be producers.

Just as importantly, the distinction explains why being inspired by mobile devices when designing workstation and laptop interfaces is so often a mixed blessing. GNOME 3's overview mode might be fine for consumer computing, but for productivity users, it simply draws them away from focusing on their work. The same holds true for the way that both GNOME and Unity open apps full-screen by default: what is a reasonable assumption for the relatively simple uses of consumer computing is awkward for the producer.

I hesitate to use the word "never," because events have a way of making me look like an opinionated blowhard when I do. However, if this distinction is in sync with reality, it suggests that convergence -- the common code base for all form factors that Ubuntu and others are pursuing -- is not an attainable goal in any practical sense.

Although a common code base might be imposed, in practice it is unlikely to be satisfactory everywhere it is used. Either consumption or productivity will be favored over the other, or, worse, so many compromises will be made that the interface is unsatisfactory to a greater or lesser degree on all form factors. The most that can be hoped for is to share as much code as possible.

For better or worse, modern computers are more specialized than the personal computers we knew until recently. However, until that change is recognized, usable interfaces are probably going to be more a matter of luck than design expertise.
