Several researchers have, for quite a few years, bravely walked the walk and gone out into the world as cyborgs, wearing various generations of enhanced eyewear that lets them see a net-connected cyber-reality as they roam the physical world. Perhaps the first to do this with a vengeance was Steve Mann, now at the University of Toronto. At first his technology was large and ungainly, and in his gear he looked like one of the Borg from Star Trek. But through successive generations of refinement, his current eyewear is far sleeker:
In complementary work, Thad Starner at the Georgia Institute of Technology has for many years been conducting research on the idea of going about one’s daily life while wearing an augmented reality headset. Here is a picture of Thad in his gear:
Both Steve and Thad are brilliant researchers, well ahead of their time. While neither of them incorporates the real-time head position/orientation tracking that would make this the kind of thing we’ve been talking about, both are looking seriously at the sociology of wearing a display and integrating it into one’s daily life.
Yet the thing that strikes me about both of these set-ups is that they interfere with eye contact. In both cases, you cannot look directly into the pupil of the person wearing the head-mounted display; the pupil is hidden by the display mechanism, which is literally right in the way.
I could be wrong, but something tells me that this is a showstopper for widespread adoption. Most people, when looking at another person face to face, want to see their eyes. It may or may not be true that the eyes are the window into the soul, but I suspect that retaining the ability to see other people’s eyes will be necessary for widespread acceptance of an ambiscopic future.