The Eyes Have It
Paw Prints: Writings of the maddog
The last couple of days the press has been reporting on the glasses from Google, and the press seems to think that this is something new. Far from it.
As early as 1980, “cyborgs” roamed the halls of MIT.
One of the more famous of them was Steve Mann, who now works at the University of Toronto. When Steve first started his research into “wearable computing”, he was a bit strange looking, wearing a hat with rabbit-ear antennas coming out of it, a large backpack to carry the equipment, and a huge “heads-up” display over one eye. Over time, as the equipment became smaller, lighter, and more powerful, he could eventually hide the equipment and all the wiring under a suit coat.
Another pioneer in this area was Thad Starner, currently at Georgia Tech. I was lucky enough to have Thad visit the Greater New Hampshire Linux User's Group (GNHLUG) in the early days of Linux to talk about his work at MIT, before he moved to Georgia Tech. At that time Thad's “heads-up display” and equipment were considerably smaller than Steve's early gear, and he could hide most of it in a side pouch or (during the winter) in a long trench coat. The use of a one-handed chorded keyboard called a “Twiddler” made the system less obtrusive, although when he had his hand in the pocket of the coat you could never tell if he was reading his email or doing something else with his hand, either of which would be a little annoying if you were trying to have a conversation....
In any case, both Steve and Thad had many interesting ideas about what their systems might eventually do.
Steve was actively interested in ethics and privacy concerns and would often confront employees in stores who would object to him capturing video of everything they did, while store security cameras captured video of everything that Steve did....
Thad was (and is) interested in the systems being able to be a link between people using American Sign Language and the hearing population.
Both of them talked about the various uses of information that was not only available all the time, but also context-sensitive: where you were and what you were doing would pull up the pieces of information you might need at that moment.
Personally, I have been wearing eyeglasses since the age of twelve. While they help me see better, they lack some of the capabilities I would hope for from this type of heads-up display.
For instance, the only time my glasses prescription is “perfect” is usually in the first couple of hours after I walk out of the store where I picked them up. Then my eyes start to change again, and sooner or later I need a new prescription and a new set of glasses. It would be great if “Google Glasses” could measure my eyes and adapt the image to fit my current “prescription”. I do not know if this is possible, but I would welcome it.
“Digital zoom” might give me both telescopic vision and microscopic vision. Infrared cameras might allow me to see things at night, or through rain and fog that I might not otherwise be able to see.
In construction, being able to overlay architectural drawings on a wall would let you see where the pipes and wiring were (and hopefully are) running through that wall.
Of course, there are privacy issues, as Steve was pointing out back in the 1980s. If I go into a toilet I might want to turn off the camera, and others might want to know that the camera is turned off. Perhaps a piece of black tape on one's glasses will come into vogue again, but this time not to repair a set of broken glasses, but to assure people that your camera is indeed not functioning.
People are understandably upset about the “creep factor” of having computers on all the time, running applications that might be able to tell you where others are, or tell others where you are. From my viewpoint, though, the glasses are not the issue, since those applications already exist in phones. The glasses are simply an easier way to see the information, and all the more reason to use Free and Open Source Software, so you know where the information is going and can install only the applications you feel comfortable with.
In the early 2000s, Thad was using a set of eyeglasses that had a small prism set into the lens over one eye, with an LCD screen at the side of the lens. The image from the LCD screen would be projected through the glass, hit the prism, and be reflected into the eye, creating the illusion of a translucent computer screen floating in front of your face. When the LCD screen was off, the glasses reverted to a “normal” set of prescription glasses, or in the case of people with perfect vision, a set of sunglasses.
Unfortunately, at that time the glasses were only a prototype and would have cost 500,000 dollars a pair. The inventor estimated that in mass production they probably would not have cost much more than a standard set of prescription glasses. I feel the true cost would have been somewhat higher than that, but not horribly so. The issue was the high volume needed to reduce the price, and he could not get any glasses company to manufacture them. All the companies kept saying, “We are not in the computer peripheral business.”
I told Thad that he should have asked the manufacturers, “How would you like to have even people with perfect vision and contact-lens wearers wearing your glasses?”...but by that time it was too late.
I am glad to see that Google is moving this forward. I am tired of using one of my two hands or my one lap to hold my display.
Occupare oculus!