Adaptive Eye-Gaze Tracking Using Neural-Network-Based User Profiles to Assist People With Motor Disability
By Sesin, Anaelis; Adjouadi, Malek; Cabrerizo, Mercedes; Ayala, Melvin; Barreto, Armando
Journal of Rehabilitation Research and Development, Vol. 45, No. 6, pp. 801-818
Publication Date: 2008
This article describes the development of an adaptive real-time human-computer interface (HCI) that serves as an assistive-technology tool for people with severe motor disability. The HCI uses eye gaze as the primary computer-input device. Because of the abrupt-moving nature of the eye, controlling the mouse cursor with raw eye coordinates results in erratic motion of the pointer. This HCI system adapts to each user's distinct jitter characteristics through the configuration and training of an artificial neural network (ANN) structured to minimize mouse jitter. The ANN is fed the user's eye-gaze behavior, recorded during a 5-minute training session in which an embedded graphical interface generates the user profile that defines that user's unique ANN configuration. The interface was evaluated with two tests: Test 1, in which 12 participants followed a moving target, and Test 2, in which 9 participants followed the contour of a square object. Results showed an average jitter reduction of 35 percent for Test 1 and 53 percent for Test 2. In both cases, the outcomes were trajectories that were significantly smoother and well suited to reaching fixed or moving targets with relative ease. Results of a third test, in which 6 participants clicked a stationary button for one minute, revealed a 176-percent improvement in click efficiency, with significantly reduced target-selection time, when assisted by the personalized ANN.
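The article's core idea, smoothing jittery gaze coordinates by training a small network on a recording of the user's own eye-gaze behavior, can be illustrated with a minimal sketch. This is not the authors' actual architecture or training procedure; it is a stand-in in which a single linear neuron learns, by gradient descent, to map a sliding window of raw gaze samples to the jitter-free coordinate. The trajectory, noise level, window width, and learning rate are all illustrative assumptions, and the trace is 1-D for brevity (the real system operates on 2-D screen coordinates).

```python
import math
import random

random.seed(0)

# Synthetic "training session": a smooth target trajectory plus
# Gaussian jitter standing in for raw eye-tracker coordinates.
N = 500
target = [math.sin(2 * math.pi * t / N) for t in range(N)]
raw = [x + random.gauss(0, 0.15) for x in target]

W = 7  # sliding-window width fed to the neuron (hypothetical size)

def windows(sig):
    """All length-W sliding windows over a signal."""
    return [sig[i - W + 1:i + 1] for i in range(W - 1, len(sig))]

# Single linear neuron: prediction = w . window + b, trained with
# per-sample gradient descent on squared error against the target.
w = [1.0 / W] * W
b = 0.0
lr = 0.05
X = windows(raw)
Y = target[W - 1:]
for epoch in range(200):
    for x, y in zip(X, Y):
        pred = sum(wi * xi for wi, xi in zip(w, x)) + b
        err = pred - y
        for k in range(W):
            w[k] -= lr * err * x[k]
        b -= lr * err

smoothed = [sum(wi * xi for wi, xi in zip(w, x)) + b for x in X]

def jitter(sig):
    """Mean absolute frame-to-frame cursor displacement."""
    return sum(abs(sig[i] - sig[i - 1])
               for i in range(1, len(sig))) / (len(sig) - 1)

reduction = 100 * (1 - jitter(smoothed) / jitter(raw[W - 1:]))
print(f"jitter reduction: {reduction:.0f}%")
```

On this synthetic trace the learned filter substantially reduces frame-to-frame displacement, which is the same kind of metric improvement (35 to 53 percent in the article's tests) that the personalized ANN profiles deliver for real users.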
VA Rehabilitation Research & Development Service (Web Site: http://www.rehab.research.va.gov )
Link to text: http://www.rehab.research.va.gov/jour/08/45/6/sesin.html
This publication is included in the library of the National Rehabilitation Information Center (NARIC), accession number J55442