Enhanced Vision Creates 'Sixth Sense'
Marlowe Hood - Agence France-Presse
April 09, 2010



The new technology could have a range of applications from personal navigation to corporate security. (Getty Images)
The Internet, GPS and state-of-the-art eye tracking technology could be combined to enhance everyday experiences.

Picture this: As your eyes alight for the first time on a skyscraper in a foreign cityscape, a disembodied voice whispers in your ear the phone number of a posh bar on the top floor.

Or this: You have been spotted on the street by an old friend whose name suddenly eludes you. But even before there is time to shake hands, a glance at your smartphone reveals her identity and the date of your last encounter.

Welcome to the world of augmented reality, the here-and-now enhancement of everyday experience through virtual, interactive technology.

Prototypes of both of these applications - based on the novel use of eye-tracking tools - were presented last weekend at the inaugural Augmented Human International Conference.

Over two days, engineers and scientists gathered at the French Alps ski resort of Megeve to unveil cutting-edge research on boosting human perception with information from the Internet, customized databases, or even biofeedback from our own brains.

A team of researchers from the Telecommunications Research Center in Vienna decided to take a state-of-the-art eye tracker designed for web-use analysis out of the laboratory and onto the street.

They hooked up the device - with one camera trained on the user's eye, and another on the scene being observed - to a smartphone with a built-in compass and global positioning system (GPS), to get a fix on the user's orientation and location.

They added sensors that show whether one was looking up or down, and attached the whole kit - designed to navigate urban landscapes - to a bicycle helmet.

Closing one's eyes for two seconds triggers a request for information about the building, bridge or monument in view.

A remotely accessed computer scans geo-referenced databases on the Internet such as Google Earth, and then forwards the result back to the user's cell phone, closing the loop.

"We wanted to make the system as non-intrusive as possible, so we used a text-to-speech engine. Data is received through an ear piece," explained Matthias Baldauf, one of the researchers. "It should be like a sixth sense."

A representative from a major international oil company, asking that he not be identified, said the application could be useful for security training or work on oil platforms.

"We consider this to be a transformative technology," he said.

Another "proof-of-concept" invention presented in Megeve - functional, but a long way from commercial development - adapted eye-tracking technology as a memory aide.

Rather than training a camera on the eye, the "Aided Eye" system developed by a team from the University of Tokyo uses tiny infrared sensors.

While less accurate, the system gathers additional data about eye movement and the frequency of blinking, making it possible to pinpoint a face or a book cover within the field of vision.

And rather than matching the object to content on the Internet, the program devised by the scientists draws from a hand-tailored database of images and files, sometimes called a personal lifelog.

"For the experiment, we registered 100 images for the database," explained Yoshio Ishiguro from the Graduate School of Interdisciplinary Information Studies.

"When the eye trained on an object, it was recognized by the computer and a corresponding file was extracted," he said.

The system is light enough to be mounted onto a pair of reading glasses, but researchers have still not figured out how to provide the wearer feedback.

A tiny screen embedded inside the glasses and an audio system are both options, Ishiguro said.



