Imagine yourself as an intelligent, motivated, hard-working person in the fiercely competitive information technology market, but with one problem: you cannot use your hands. Or you cannot speak. How do you do your job? How do you stay employed? You can, thanks to a remarkable gift from the computer industry: the Eyegaze, a communication and control system you run with your eyes. The Eyegaze System is a direct-select, vision-controlled communication and control system. It was developed in Fairfax, Virginia, by LC Technologies, Inc.
Who's using the Eyegaze System?
The system was developed mainly for people who lack the use of their hands or voice. The only requirements for operating the Eyegaze System are control of at least one eye with good vision and the ability to keep the head fairly still. Eyegaze Systems are in use around the world. Their users are adults and children with cerebral palsy, spinal cord injuries, brain injuries, ALS, multiple sclerosis, brainstem strokes, muscular dystrophy, and Werdnig-Hoffman syndrome. Eyegaze Systems are being used in homes, offices, schools, hospitals, and long-term care facilities. By looking at control keys displayed on a screen, a user can synthesize speech, control the environment (lights, appliances, etc.), type, operate a telephone, run computer software, operate a computer mouse, and access the Internet and e-mail. Eyegaze Systems are being used to write books, attend school, and enhance the quality of life of people with disabilities all over the world.
How does the Eyegaze System work?
As a user sits in front of the Eyegaze monitor, a specialized video camera mounted below the monitor observes one of the user's eyes. Sophisticated image-processing software in the Eyegaze System's computer continually analyzes the video image of the eye and determines where the user is looking on the screen. Nothing is attached to the user's head or body.
In detail, the procedure can be described as follows. The Eyegaze System uses the pupil-center/corneal-reflection method to determine where the user is looking on the screen. An infrared-sensitive video camera, mounted beneath the System's monitor, takes 60 pictures per second of the user's eye. A low-power infrared light-emitting diode (LED), mounted in the center of the camera's lens, illuminates the eye. The LED reflects a small amount of light off the surface of the eye's cornea. The light also shines through the pupil and reflects off the retina, the back surface of the eye, causing the pupil to appear white. This bright-pupil effect enhances the camera's image of the pupil and makes it easier for the image-processing functions to locate its center. The computer calculates the person's gazepoint, i.e., the coordinates of where he is looking on the screen, from the relative positions of the pupil center and the corneal reflection within the video image of the eye. Typically the Eyegaze System predicts the gazepoint with an average accuracy of a quarter inch or better.
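The geometry behind this calculation can be sketched in a few lines of Python. This is only an illustration, not LC Technologies' actual algorithm: it assumes the gazepoint is obtained by mapping the vector from the corneal reflection (the "glint") to the pupil center through an affine model, and the coefficients below are hypothetical stand-ins for values a real system would obtain through calibration.

```python
# Sketch of the pupil-center/corneal-reflection method (illustrative only).
# The gaze is estimated from the vector between the corneal reflection
# (glint) and the pupil center in the camera image, mapped to screen
# coordinates by an affine model with hypothetical coefficients.

def gaze_vector(pupil_center, glint):
    """Vector from the corneal reflection to the pupil center (pixels)."""
    return (pupil_center[0] - glint[0], pupil_center[1] - glint[1])

def estimate_gazepoint(pupil_center, glint, coeffs):
    """Map the glint-to-pupil vector (vx, vy) to screen coordinates using
    an affine model: x = a0 + a1*vx + a2*vy, y = b0 + b1*vx + b2*vy."""
    vx, vy = gaze_vector(pupil_center, glint)
    (a0, a1, a2), (b0, b1, b2) = coeffs
    return (a0 + a1 * vx + a2 * vy, b0 + b1 * vx + b2 * vy)

# Hypothetical coefficients standing in for a calibrated model.
coeffs = ((640.0, 40.0, 0.0), (400.0, 0.0, 40.0))

# Pupil center two pixels right and one pixel below the glint.
print(estimate_gazepoint((322.0, 241.0), (320.0, 240.0), coeffs))
```

The key point the sketch captures is that only the *relative* positions of pupil center and glint matter, which is why small camera or head shifts that move both features together have little effect on the estimate.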
Before operating the eyetracking applications, the Eyegaze System must learn several physiological properties of a user's eye in order to predict his gazepoint accurately. The system learns these properties through a calibration procedure. The user calibrates the system by fixing his gaze on a small yellow circle displayed on the screen and following it as it moves around the screen. The calibration procedure usually takes about 15 seconds, and the user does not need to recalibrate if he moves away from the Eyegaze System and returns later.
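A minimal sketch of what such a calibration fit could look like, assuming the system maps a measured eye vector to screen coordinates with an affine model fitted by least squares. The eye vectors and target positions below are made up for illustration; a real system would record them while the user follows the moving circle.

```python
import numpy as np

# Hypothetical calibration data: for each fixation target shown on screen,
# the measured glint-to-pupil vector (vx, vy) and the known target (sx, sy).
eye_vectors = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 4.0], [5.0, 4.0]])
targets = np.array([[100.0, 100.0], [900.0, 100.0],
                    [100.0, 700.0], [900.0, 700.0]])

# Design matrix [1, vx, vy] for the affine model sx = a0 + a1*vx + a2*vy
# (and likewise for sy); least squares fits both columns at once.
A = np.column_stack([np.ones(len(eye_vectors)), eye_vectors])
coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)  # shape (3, 2)

def predict(v):
    """Predicted screen gazepoint for a new eye vector."""
    return tuple(np.array([1.0, *v]) @ coeffs)

print(predict((2.5, 2.0)))
```

Because the made-up data is exactly affine, the fit reproduces the four targets; with real, noisy measurements the least-squares solution would smooth over the measurement error, which is one reason calibration uses several points rather than the minimum of three.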
Today, the human eye-gaze can be recorded by relatively unremarkable techniques. This thesis argues that it is possible to use the eye-gaze of a computer user in the interface to aid the control of the application. Care must be taken, though, that eye-gaze tracking data is used in a sensible way, since the nature of human eye-movements is a combination of several voluntary and involuntary cognitive processes.
The main reason eye-gaze based user interfaces are attractive is that the direction of the eye-gaze can express the interests of the user (it is a potential porthole into the current cognitive processes), and communication through the direction of the eyes is faster than any other mode of human communication. It is argued that eye-gaze tracking data is best used in multimedia interfaces where the user interacts with the data instead of the interface, in so-called non-command user interfaces.