"There's a lot that could be done better," said Steve Mann, a University of Toronto professor who has been developing his own high-tech glasses since the 1970s.
Last month, Google released a promotional video for Google Glass, a pair of glasses with a built-in semi-transparent computer screen in one corner. The video shows the device responding to voice commands to allow users to do things such as record videos while hang gliding, look up photos of tigers while sculpting ice into the shape of a tiger's head, or check the weather while walking the dog.
The winners of a Google contest will have the opportunity to be the first to buy the device for $1,500.
While Google Glass may look like brand new technology, Mann said he first developed computerized eyeglasses in the 1970s. Unlike Google's, they were designed to enhance vision, letting the user clearly see things that are normally too bright to look at while also making out things that are too dark to see.
"If you're in a dark alley and there's car headlights shining in your face, you still want to be able to see the face of the driver and recognize the licence plate number, for example," he said.
He sees many other uses for high-tech glasses in the future, such as:
- A phone that lets video callers see from the perspective of the person they are calling.
- A wearable face-recognizer that could help people with memory problems.
- A device that helps blind people see by mapping directly to their brains.
Isabel Pedersen, who researches wearable and mobile media culture at the University of Ontario Institute of Technology, acknowledged that wearable computing has existed for decades, but she said the arrival of Google Glass is still significant because it makes the technology available to consumers for the first time.
"It's the mainstreaming of this kind of device," she said. "I really think we're on the cusp of a big change with things like Google Glass."
That's because such devices will allow people to do things such as surreptitiously shoot video, which has implications for privacy, public safety and personal freedom, Pedersen said.
Mann's own experience illustrates a scenario that could arise while someone is wearing computerized glasses.
Like Google Glass, Mann's glasses contain a camera that feeds into a computer. The computer reprocesses the image and presents an enhanced version to the user's eyes. In the process, it records what the user sees.
Mann recalled one time when he was hit by a car while walking with his glasses on. The computer attached to the glasses was designed to overwrite older data as it recorded, but the collision damaged it and stopped the recording, preserving clear video of the driver, the car and the licence plate number.
"In some sense, the driver recorded himself," Mann said.
Certainly, it's not something that the driver would have expected.
That kind of recording "is something that really interests and scares all of us," Pedersen said.
But she noted that it's not such a big step from the almost ubiquitous recording done now with security cameras and smartphones.
Pedersen expects new policies to arise to deal with that issue and others raised by wearable computing.
Some of those issues are apt to be discussed at a conference Mann is organizing in Toronto in June called the "Social implications of wearable computing and augmediated reality."
Augmediated reality — a term used to describe some of the computer-enhanced experiences that wearable computing allows for — is in some ways similar to virtual reality, except that it theoretically doesn't interrupt real life, Pedersen said.
"In practice, it could be very distracting to have your contacts texting you while you're walking down the street," she said.
That remains to be seen and dealt with.
In the meantime, Pedersen said, "the task at hand for Google is to socialize the public to wearing this device before anybody sees it and rejects it."