Stanford Student Hones 'Gaze-Based' Computer System
- By Paul McCloskey
- 08/27/07
A Stanford University researcher has developed a system that advances "gaze-based" computing, enabling a person to use eye movements to interact with computers and surf the Web.
The EyePoint system, developed by Stanford doctoral researcher Manu Kumar, is considered an improvement over previous attempts to create an eye-movement-controlled computer. It combines hand and eye movements to eliminate the false positives that arise from using eye movements alone.
"Using gaze-based interaction techniques makes the system appear to be more intelligent and intuitive to use," Kumar said. "Several users have reported that it often felt like the system was reading their mind."
To follow a link, the user presses and holds a hot key on the keyboard, which magnifies the area being viewed. The user then looks at the link within the enlarged area and releases the hot key, thereby activating the link.
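A minimal sketch of that look-press-look-release sequence is shown below. The article does not describe EyePoint's actual implementation, hot key, or magnification factor, so the gaze source, function names, and values here are illustrative assumptions, with the screen actions reduced to print statements.

```python
# Sketch of the look-press-look-release interaction described above.
# All names, the magnification factor, and the gaze samples are assumptions
# for illustration; they are not EyePoint's actual implementation.

from dataclasses import dataclass


@dataclass
class GazePoint:
    x: float
    y: float


def magnify_region(center: GazePoint, factor: float = 4.0) -> str:
    """Simulate enlarging the screen region around the current gaze point."""
    return f"magnified {factor}x region around ({center.x:.0f}, {center.y:.0f})"


def activate_link_at(point: GazePoint) -> str:
    """Simulate clicking the link under the refined gaze point."""
    return f"activated link near ({point.x:.0f}, {point.y:.0f})"


def eyepoint_interaction(gaze_on_press: GazePoint, gaze_on_release: GazePoint) -> str:
    # 1. User presses and holds the hot key: magnify the area they are looking at.
    print(magnify_region(gaze_on_press))
    # 2. User looks at the intended link inside the enlarged area, then releases
    #    the hot key: the link under the refined gaze point is activated.
    return activate_link_at(gaze_on_release)


if __name__ == "__main__":
    # Simulated gaze samples captured at key-press and key-release time.
    print(eyepoint_interaction(GazePoint(512, 300), GazePoint(530, 310)))
```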
"What is really exciting is that the processing power of today's computers is completely changing the kinds of things we can use for computer interfaces," said Ted Selker, associate professor at the MIT Media and Arts Technology Lab, told Computerworld magazine. "Things like eye tracking are using channels of communication that literally were unavailable to interface designers even five years ago."
About the Author
Paul McCloskey is contributing editor of Syllabus.