The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look At is What You Get [1]


This paper reports the results of a study of eye tracking and of how eye movements can be used to interact with a computer.


  1. The eye-tracking mechanism used, while bulky, is unobtrusive: An infrared light source is shone on one eye. A camera monitors the light reflected off the cornea and detects the location of the pupil. From this information, the system calculates the direction the eye is looking (the “visual line of gaze”). A servo-controlled mirror keeps the eye in the camera’s view when the head moves.
  2. Since eye movements are naturally jittery even when we think our eyes are still, the eye-tracking system must smooth out this jitter to infer the “intent” of the gaze rather than report the raw instantaneous measurements.
  3. The “Midas-touch problem” occurs in eye-tracking systems that attempt to replace the mouse with the gaze location and execute a command on whatever is stared at: the user cannot look anywhere without risking an unintended action.
  4. A key observation is that eye tracking is best treated as an input to the system as a whole, rather than as a specific command by the user to take some action. For example, the gaze location could be used to determine which commands are currently active.
  5. Using eye tracking to select an object for information purposes (e.g., to display additional detail) works well, provided the selection does not trigger an irreversible (or not easily undone) action.
  6. Using eye tracking to move objects (e.g., ships) was surprisingly effective compared with dragging via the mouse, in part because the eye needed to target the destination position anyway; requiring the hand to then move the mouse to that position felt to users like extra work. Toggling “pick up” and “put down” with a mouse button (or, presumably, a key press) worked, or would work, well.
  7. Section 5, Observations, nicely summarizes the main results of the work.
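The smoothing described in point 2 can be sketched with a standard dispersion-threshold fixation detector (the thresholds and sample data below are illustrative assumptions, not the paper’s values):

```python
def _dispersion(window):
    """Sum of the bounding-box extents of a window of (x, y) gaze samples."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=1.0, min_samples=5):
    """Collapse jittery raw gaze samples into fixation centroids.

    samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns a list of (x, y) fixation centroids.
    """
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered.
            while j < n and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            window = samples[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((cx, cy))
            i = j
        else:
            i += 1  # saccade or noise: slide the window forward
    return fixations
```

The `min_samples` parameter doubles as a dwell threshold: a cluster must persist briefly before it counts as intentional, which is one half of the defense against the Midas-touch problem in point 3.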
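The “pick up / put down” scheme of point 6 can be sketched as gaze-plus-button selection: gaze chooses the target, and an explicit button press commits, so mere looking never triggers an action. The class, object names, and radius below are hypothetical:

```python
class GazeMover:
    """Move objects by gazing at them and toggling a button (sketch)."""

    def __init__(self, objects):
        self.objects = dict(objects)  # name -> (x, y) position
        self.held = None              # name of the object being carried

    def nearest_object(self, gaze, radius=2.0):
        """Return the object closest to the gaze point, within radius."""
        best, best_d = None, radius
        for name, (x, y) in self.objects.items():
            d = ((x - gaze[0]) ** 2 + (y - gaze[1]) ** 2) ** 0.5
            if d <= best_d:
                best, best_d = name, d
        return best

    def press_button(self, gaze):
        """Toggle: pick up the gazed-at object, or put the held one down."""
        if self.held is None:
            self.held = self.nearest_object(gaze)
        else:
            self.objects[self.held] = gaze  # drop at the current gaze point
            self.held = None
```

Because the destructive step requires the button, the eye only ever supplies position, consistent with point 4’s view of gaze as an input to the system rather than a command.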


  1. This paper was published in 1991. What has been done in the last 20 years? In particular, standard monitor sizes and configurations have grown larger (and with projection systems and video walls, larger still), which could make eye tracking more useful than in their examples with 19'' monitors (at unspecified resolutions).
  2. A strength of the paper, but a weakness of the experimental setup, is described in the “Observations” section: while the device is non-invasive, the servo motor that responds to head movements and the dim LED “gives one the eerie feeling of being watched”.
  3. The “listener window” approach (in which the gazed-at window gains input focus) does not allow the user to type in one window while reading from another.

What is the “Fitts’ Law relationship”?
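For context on this question: Fitts’ law relates pointing movement time to target distance D and width W. A minimal sketch of the common Shannon formulation, with the device-dependent constants a and b as purely illustrative values:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted pointing time (seconds) under Fitts' law.

    distance, width: target distance and width in the same units.
    a, b: device-dependent regression constants (hypothetical values here).
    """
    # Index of difficulty in bits: harder targets are farther or narrower.
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty
```

Doubling the distance or halving the width adds at most one bit of difficulty, so predicted time grows only logarithmically; this is the relationship the paper’s mouse-versus-eye pointing comparison implicitly invokes.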
What is “Brooks' Taxonomy”?