Human-Computer Interface for Persons with Severe Motor Disability
Anaelis Sesin; Advisor: Malek Adjouadi
Center for Advanced Technology and Education, Florida International University
Abstract
The objective of this study is to develop an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI design employs eye gazing as the primary computer input. Unfortunately, controlling the mouse cursor using raw eye coordinates results in sporadic motion of the pointer due to the saccadic nature of the eye. Even though these eye movements are subtle and completely imperceptible under normal circumstances, they considerably affect the accuracy of an eye gaze-based HCI. The novelty of the proposed HCI system is that it adapts to the different and potentially changing jitter characteristics of each specific user through the configuration and training of an artificial neural network (ANN) that minimizes the mouse jitter. This is accomplished by feeding the ANN eye-gazing behavior initially recorded from the user. The ANN learns the relationship between the gaze coordinates and the mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used to run training sessions and to generate the user profiles that make up these unique configurations. The results show an average jitter reduction of 35%, which is significant in relation to the ease of use of the eye gaze tracking (EGT) system.
Objective of this Study
The objective
of this research endeavor is to develop an eye
gaze-based HCI system that accommodates and
adapts to different users through design
customization and configuration. Generally
speaking, the methodology relies on a user
profile that customizes eye gaze tracking using
artificial neural networks (ANN). The user
profiling aspects facilitate universal access to
computing resources and, in particular, enable an
adaptable interface for a wide range of
individuals having severe motor
disabilities.
HCI System Configuration
The HCI
system as proposed is based on a benchtop eye
gaze setup. It consists of a CPU for raw eye
movement data acquisition, a CPU for user
interaction, an eye monitor, a scene monitor, an
eye imaging camera, and an infrared light source.
The EGT system used in this study is the ISCAN
ETL-500, as shown in Figure 1.
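As a rough sketch of this data flow (raw gaze samples streaming from the acquisition CPU, smoothed into cursor positions for the user-interaction side), the loop below shows one plausible structure. The names `gaze_source`, `smoother`, and `move_cursor` are hypothetical placeholders, not the actual ISCAN ETL-500 API; the 6-sample window matches the ANN input described under Implementation Steps.

```python
from collections import deque

WINDOW = 6  # gaze samples per ANN input window: 6 (x, y) points per 1/10 s


def cursor_loop(gaze_source, smoother, move_cursor):
    """Stream raw gaze points through a smoothing function to drive the cursor.

    gaze_source: iterable of raw (x, y) gaze coordinates (hypothetical).
    smoother:    maps a list of WINDOW (x, y) points to one (x, y) position,
                 standing in for the trained ANN.
    move_cursor: callback that positions the mouse pointer (hypothetical).
    """
    window = deque(maxlen=WINDOW)
    for x, y in gaze_source:          # raw coordinates from the acquisition CPU
        window.append((x, y))
        if len(window) == WINDOW:     # full window -> one smoothed cursor update
            cx, cy = smoother(list(window))
            move_cursor(cx, cy)
```

For instance, plugging in a simple moving-average `smoother` already damps point-to-point scatter; the ANN described below replaces that average with a learned, user-specific mapping.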
Experimental Results
Experiment 1: Jittering Reduction and Adaptation of the System
The results reveal an average 35.1% reduction in jittering error when the EGT system is supported with ANN intervention, which represents a substantial improvement in the use of eye gaze to control the mouse pointer.
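For concreteness, one way such a percentage could be computed is sketched below, assuming jitter is measured as the RMS deviation of the pointer from its local mean position; the poster does not spell out the exact metric, so this definition is an illustrative assumption.

```python
import math


def jitter(points):
    """RMS deviation of cursor points from their centroid.

    One plausible jitter measure (assumption); the underlying study does not
    define the exact metric used for its reported percentages.
    """
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return math.sqrt(
        sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / len(points)
    )


def percent_reduction(raw_points, ann_points):
    """Jitter reduction (%) of ANN-smoothed output relative to raw gaze."""
    raw, ann = jitter(raw_points), jitter(ann_points)
    return 100.0 * (raw - ann) / raw
```

With this metric, halving the spread of the cursor points around their centroid reports as a 50% jitter reduction.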
Furthermore, the results show that as a user profile is edited and the ANN is retrained for the same user, the system learns even further how to overcome the jitter behavior of that particular user. Comparing the initial degree of jitter, before training the system and without any ANN intervention (first trial), with the final degree of jitter, after training the system several times and with ANN intervention (last trial), yields reductions in jittering error of 75.9% and 85.4% for subjects 1 and 2, respectively.
Experiment 2: Graphical Visualization of Jittering Reduction
(Panels compare cursor traces without ANN and with ANN intervention.)
Acknowledgements
The
authors are grateful for the support provided by
the National Science Foundation under grants
HRD-0317692, CNS-0426125, the BPC program under
CNS-0540592, and the NSF graduate fellowship
program. Additional infrastructure support from
NSF grants CNS-0520811 and IIS-0308155 was very helpful in complementing this study.
Table 1 - Average jitter reduction for all users
Tables 2 and 3 - Jitter reduction for subjects 1 and 2 after 3 trials each
Figure 3 - Plots of jitter reduction for subjects 1 and 2 after 3 trials each
Figure 1 - The EGT interface and its use with and without the headrest
Implementation Steps
1. Analysis of the mouse cursor trajectory without ANN intervention.
2. Definition of a suitable ANN configuration.
3. Acquisition of data and extraction of training patterns for training the ANN.
4. Training of the ANN and saving the results (i.e., weights and biases) as a user profile.
In step (1), the mouse cursor trajectory was
subdivided into smaller sections, which allowed the trajectory of the mouse to be described linearly. Each subset contains 6 (x, y) points generated during 1/10 of a second and is used as input to the ANN. Therefore, the ANN contains 12 neurons (6 x-coordinates + 6 y-coordinates) in the input layer.
In step (2), the ANN configuration was defined as 12-24-2, using the backpropagation algorithm and the following default activation functions: (1) linear for the input layer, (2) logsig for the hidden layer, and (3) linear for the output layer, as shown in Figure 2.
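The 12-24-2 network just described can be sketched as a small NumPy program: linear input, logsig hidden layer, linear output, trained with plain batch backpropagation. The training data here is synthetic (6 jittered copies of a true position per sample, flattened as 6 x-coordinates then 6 y-coordinates); the real system instead trains on recorded gaze/cursor pairs, and the initialization scheme and learning rate below are assumptions, since the poster specifies only the topology and activations.

```python
import numpy as np

rng = np.random.default_rng(0)

# 12-24-2 weights with small random initialization (assumption).
W1 = rng.normal(0.0, 0.1, (12, 24)); b1 = np.zeros(24)
W2 = rng.normal(0.0, 0.1, (24, 2));  b2 = np.zeros(2)


def logsig(z):
    """Log-sigmoid activation used in the hidden layer."""
    return 1.0 / (1.0 + np.exp(-z))


def forward(X):
    h = logsig(X @ W1 + b1)    # hidden layer (logsig)
    return h, h @ W2 + b2      # output layer (linear)


# Synthetic stand-in for recorded data: each row is 6 jittered copies of a
# true (x, y) position, flattened as 6 x-coordinates then 6 y-coordinates.
targets = rng.uniform(0.0, 1.0, (200, 2))
X = np.hstack([np.repeat(targets[:, :1], 6, axis=1),
               np.repeat(targets[:, 1:], 6, axis=1)])
X += rng.normal(0.0, 0.05, X.shape)    # simulated saccadic jitter

lr = 0.05
for _ in range(5000):                   # plain batch backpropagation
    h, out = forward(X)
    err = out - targets
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1.0 - h)   # logsig derivative: h * (1 - h)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

After training, the weights and biases (W1, b1, W2, b2) are exactly the quantities that step (4) saves as a user profile, so a returning user can reload them without retraining from scratch.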
Figure 4 - Visualization of the jitter reduction under different tests
Figure 2 - The ANN configuration