Eye Tracking Discussion Points


This article investigates the possibility of using the movements of a user's eyes as an additional input medium. It claims, "While the technology for measuring eye movements and reporting them in real time has been improving, what is needed is appropriate interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way." It also discusses "some of the human factors and technical considerations that arise in trying to use eye movements as an input medium."

Discussion Points:

The article explains a variety of interaction techniques, such as object selection, continuous attribute display, moving an object, eye-controlled scrolling text, menu commands, and listener windows. A strong point of the article is that all of these are explained and discussed in detail: the author describes ways to implement each technique and provides observations for most of them.

Another strong point of the article is that it presents both sides of the story. It describes the potential benefits and uses of eye-tracking techniques but also acknowledges their current limitations (for example, the "Midas touch" problem, in which the system cannot tell whether the user is merely looking at an object or intends to act on it, so every glance risks triggering an action).
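The usual way around the "Midas touch" problem is dwell-time selection: an object is acted on only after the gaze has rested on it for some minimum time, so a passing glance does nothing. The sketch below is illustrative only; the function name, the sample format, and the 600 ms threshold are my own assumptions, not details from the article.

```python
DWELL_THRESHOLD_MS = 600  # assumed minimum fixation time before selecting

def select_by_dwell(gaze_samples, threshold_ms=DWELL_THRESHOLD_MS):
    """Return objects selected from a stream of (timestamp_ms, object_id)
    gaze samples. object_id is None when the gaze hits no object."""
    selections = []
    current = None       # object currently under the gaze
    dwell_start = None   # when the gaze first landed on it
    selected = False     # avoid re-selecting during one long fixation
    for t, obj in gaze_samples:
        if obj != current:
            # Gaze moved to a new object (or to empty space): restart the clock.
            current, dwell_start, selected = obj, t, False
        elif obj is not None and not selected and t - dwell_start >= threshold_ms:
            # Gaze has rested long enough on the same object: select it once.
            selections.append(obj)
            selected = True
    return selections

# A 200 ms glance at "menu" is ignored; an 800 ms fixation on "file"
# triggers exactly one selection.
samples = [(0, "menu"), (100, "menu"), (200, "file"), (500, "file"),
           (1000, "file"), (1100, None)]
```

Here `select_by_dwell(samples)` yields only `["file"]`, which is the point of the technique: looking is free, and only a deliberate, sustained fixation counts as a command.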

I once saw a video about tracking eye movements while testing video games; in particular, the game being tested was a racing game. The testers tracked a player's eye movements on the screen, recording what the player was looking at during the race. They were trying to determine whether their level design had features that might distract players from concentrating on the road. They counted how many times the player looked at the different display meters on the screen, how many times the player's attention was significantly diverted from the car, and how many times these distractions resulted in a crash. Eye-tracking techniques have the potential to be a great tool in the field of human-computer interaction and may prove very useful in designing and testing many applications in the future.
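The counting analysis described above can be sketched as a simple pass over a sequence of gaze regions, tallying each time the gaze leaves the road for some distractor. The region names and gaze trace below are invented for illustration; they are not from the video or the article.

```python
def count_glances(gaze_regions, target="road"):
    """Count transitions away from the target region, keyed by the
    distractor region the gaze moved to."""
    counts = {}
    prev = None
    for region in gaze_regions:
        # A "glance" is a transition from the target to some other region.
        if prev == target and region != target and region is not None:
            counts[region] = counts.get(region, 0) + 1
        prev = region
    return counts

# Hypothetical gaze trace sampled during a race.
trace = ["road", "road", "speedometer", "road", "minimap", "minimap", "road"]
```

Running `count_glances(trace)` gives `{"speedometer": 1, "minimap": 1}`; correlating such counts with crash timestamps would be the next step in the kind of study the testers described.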