One of the hallmarks of science fiction movies, one of the ways they signal that they take place "in the future," is the use of augmented reality (AR) interfaces. Minority Report is the classic illustration of the trope, but it's scarcely alone: in Iron Man 3, Tony Stark rotates complex diagrams and creates armor schematics with a few twists of his hands and artful zoom motions. Real-life AR setups have lagged behind substantially, in spite of the best efforts of Google and others.

Now a team from the University of North Carolina, led by Andrew Maimone and working in collaboration with Nvidia, thinks it's found solutions to some of the more pressing problems with current AR technologies. At present, cost, weight, and battery life have restricted AR devices to small screens with an FOV (field of view) of perhaps 40 degrees. The problem with a technology like Google Glass, assuming you want to use it for AR, is that the tiny LCD in one corner of your vision is terrible for accurately representing objects. If you want to see something you can manipulate, you want to see it across both eyes, at a wide enough angle that it looks natural rather than squashed.

What Nvidia and the UNC team have created is a pair of glasses that eschew complicated optics for a simpler solution. By placing transparent point light sources a minimal distance from the pupil, the rays of light that make up the display can be fired directly into the eye. One point light isn't enough to create a visible field, but a hexagonally tiled group of point lights can be used to create a superimposed visual image, as is shown above. The researchers call this a pinlight display. The pinlight array itself was fabricated by etching dots into acrylic with a needle attached to the arm of a 3D printer. The dots are visible from a distance when light shines across the acrylic, but invisible at the extreme close-up used for projection.
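A rough way to see why putting the light sources on the lens itself widens the field of view: the FOV of a near-eye display is roughly the angle the display surface subtends at the pupil, so a display built into an eyeglass lens, sitting very close to the eye, can span a much wider angle than a small combiner mounted farther away. The sketch below illustrates the geometry; the specific widths and eye-relief distances are assumed for illustration and are not taken from the paper.

```python
import math

def fov_degrees(display_width_cm: float, eye_relief_cm: float) -> float:
    """Angular field of view (degrees) subtended at the pupil by a flat
    display of the given width centered at the given distance from the eye."""
    return math.degrees(2 * math.atan((display_width_cm / 2) / eye_relief_cm))

# A small combiner sitting farther from the eye (assumed dimensions):
print(fov_degrees(2.0, 2.8))  # roughly 40 degrees, like current narrow-FOV devices

# A display the size of an eyeglass lens, very close to the eye (assumed dimensions):
print(fov_degrees(3.0, 1.5))  # 90 degrees
```

The point of the comparison is that the wide FOV comes from the geometry of the glasses form factor, not from more elaborate optics.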
There are still some difficulties to be worked out, and the system would function more effectively if eye tracking were incorporated, but the underlying premise is fascinating. By analyzing how the eye responds to various projector configurations, the UNC team was able to test which pinlight configuration would create the most cohesive overlay. As the team notes: "Our solution to create an evenly-toned image is to configure the pinlight projectors so that they minimally overlap to fill the focus plane and to encode a virtual aperture over the modulation plane so that the light from the overlapping regions does not reach the eye." A video demonstrating the technology, including actual results from the prototype, is shown below.
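The "minimally overlap to fill the focus plane" condition has a clean geometric interpretation: if each pinlight projector lights a roughly circular footprint on the focus plane, then on a hexagonal lattice with center spacing s, footprints of radius r just barely cover the plane when s = r·√3; tighter spacing wastes light on overlap, looser spacing leaves gaps. The following toy model (my own illustration, not the researchers' code) estimates coverage by random sampling:

```python
import math
import random

def hex_centers(rows: int, cols: int, s: float):
    """Centers of a hexagonal (triangular) lattice with nearest-neighbor spacing s."""
    return [(c * s + (r % 2) * s / 2, r * s * math.sqrt(3) / 2)
            for r in range(rows) for c in range(cols)]

def coverage(r: float, s: float, samples: int = 20000) -> float:
    """Fraction of a central patch of the focus plane covered by at least one
    circular projector footprint of radius r."""
    centers = hex_centers(12, 12, s)
    random.seed(0)
    hits = 0
    for _ in range(samples):
        # Sample well inside the lattice to avoid edge effects.
        x = random.uniform(3 * s, 8 * s)
        y = random.uniform(3 * s, 8 * s)
        if any((x - cx) ** 2 + (y - cy) ** 2 <= r * r for cx, cy in centers):
            hits += 1
    return hits / samples

print(coverage(1.0, math.sqrt(3)))        # minimal-overlap spacing: full coverage
print(coverage(1.0, math.sqrt(3) * 1.2))  # spacing too loose: gaps, coverage < 1
```

The virtual aperture the team describes then masks out whatever overlap remains, so each point on the focus plane is effectively lit by a single projector.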
The long-term promise
In the past, I've been openly dubious about the prospects or desirability of AR in a product like Google Glass. That doesn't, however, mean AR has no future. The ability to explore the intricacies of a product's design, or to walk through a 3D model of a patient's internal organs, could be enormously helpful to surgeons of the future. It's not too much of a stretch to think that technologies like this could revolutionize certain types of medicine: the combination of better sensors and augmented reality could allow for unprecedented levels of fine motor control and the ability to repair damage that currently requires open surgery. Offering a broad field of view might not seem like a major advance when products like the Oculus Rift have been promising something similar for years, but there's a huge difference between a VR headset with its own integrated display and an AR device that doubles as a pair of glasses. If this approach pans out, it'll give a much larger group of users a chance to explore what AR can be used for, and hopefully lead to something more appealing than Robert Scoble naked in the shower. The researchers will present their work, "Pinlight Displays: Wide-Field-of-View Augmented-Reality Eyeglasses Using Defocused Point-Light Sources," at SIGGRAPH 2014 next week.