ERCIM News No.46, July 2001
by Julie A. Jacko
As part of a larger research agenda aimed at empowering all citizens to access electronic information, a research project at the Georgia Institute of Technology, USA, is focusing on empirically linking clinical diagnoses of visual dysfunction with human-computer interaction. The project is funded through 2004.
Funded by awards from the National Science Foundation (Presidential Early Career Award for Scientists and Engineers) and the Intel Corporation, the project is motivated by the critical need for all citizens to be empowered to access information electronically. Visual impairment remains a major impediment to electronic access of information; in the United States alone, one in every six people will develop some type of uncorrectable visual impairment by the age of 45.
The research project investigates existing software technologies (eg, Windows and Windows-type platforms) by isolating specific features of computer interfaces such as visual icon design, the use of background and foreground colors, and the construction of menu systems. Elements of the interaction, such as a person's brain activity in the visual cortex and eye movement patterns on the computer screen, are measured. Researchers have shown that visually impaired computer users perform visual search more slowly than their fully sighted counterparts. However, little is known about the intermediate stages of visual search that lie between stimulus presentation and stimulus detection/identification. Therefore, the primary focus of the first research segment was to investigate two intermediate stages of visual search in visually impaired computer users: preattention and focal attention. This was accomplished through physiological measurements using electroencephalography (EEG). The EEG data showed that the additional time a visually impaired computer user requires to complete visual search is due to the extra time required for active search once the visual cortex has already been engaged. Thus, a person's visual limitations are concentrated in the second stage of the process, focal attention.
Subsequent research, centered on focal attention, has involved investigations of eye movements. These ongoing investigations use a remote-mounted infrared video eye gaze tracking system in concert with software that isolates specific interaction scenarios during use of a graphical user interface. The eye gaze control unit and software record the x and y coordinates of the participant's point-of-gaze at a rate of 60 samples per second. Such data enable the characterization of specific performance metrics such as fixations and saccades, both indicative of the strategies used during visual search and processing (see the figure for a photo taken during experimentation).
|Experimental paradigm involves tracking a human subject's eye movements (research led by J.A. Jacko, Ph.D., Principal Investigator).|
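To illustrate how fixations and saccades can be derived from a stream of 60 Hz point-of-gaze coordinates, the sketch below applies a simple velocity-threshold classification. This is not the project's actual analysis pipeline; the velocity threshold and pixel coordinates are hypothetical, and only the 60 samples-per-second rate comes from the article.

```python
# Illustrative velocity-threshold classification of gaze samples into
# fixation and saccade intervals. The 60 Hz rate matches the article;
# the threshold value and coordinates are hypothetical.
from math import hypot

SAMPLE_RATE_HZ = 60          # gaze samples per second, as in the article
VELOCITY_THRESHOLD = 100.0   # pixels/second; hypothetical cut-off

def classify_gaze(samples):
    """Label each inter-sample interval as 'fixation' or 'saccade'.

    samples: list of (x, y) point-of-gaze coordinates in pixels.
    Returns one label per consecutive pair of samples.
    """
    labels = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        # distance moved between consecutive samples, scaled to pixels/second
        velocity = hypot(x1 - x0, y1 - y0) * SAMPLE_RATE_HZ
        labels.append("saccade" if velocity > VELOCITY_THRESHOLD else "fixation")
    return labels

def fixation_count(labels):
    """Count runs of consecutive 'fixation' labels; each run is one fixation."""
    count, prev = 0, None
    for label in labels:
        if label == "fixation" and prev != "fixation":
            count += 1
        prev = label
    return count

# Example: two clusters of nearby samples joined by one large jump,
# ie, two fixations separated by a saccade.
gaze = [(100, 100), (101, 100), (100, 101), (300, 250), (301, 251), (300, 250)]
labels = classify_gaze(gaze)
```

Longer fixation durations or more numerous saccades in such a record would be one way to quantify the slower, less direct search strategies the project observes in low vision users.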
The advances already made establish a research basis for this field of inquiry, offering well-grounded, empirical findings that, in some cases, support what had previously been intuition and ad hoc solutions. This has been accomplished by laying the groundwork for inquiries into how basic human-computer interaction is linked to the visual capabilities of the low vision user. The research is uniquely collaborative, coupling the ophthalmologic expertise of Dr. Ingrid U. Scott at the Bascom Palmer Eye Institute (BPEI) of the University of Miami School of Medicine, USA, with Dr. Jacko's expertise in human-computer interaction. This collaboration also enables the engagement of people with impaired vision from the BPEI Low Vision Clinic in the research. Additional collaborators include Dr. Armando B. Barreto of the Department of Electrical and Computer Engineering of Florida International University (FIU), USA, who is Director of the Digital Signal Processing Laboratory at FIU.
To achieve universal access to electronic information technologies, designs must overcome barriers perpetuated by traditional one-size-fits-all philosophies in order to accommodate people with disabilities, the elderly, and technologically unsophisticated individuals. For information technologies to be universally accessible, there must be a paradigm shift in human-computer interaction (HCI) that moves the burden of interpreting behavior from the human to the computer. One ever-growing population of users who will benefit tremendously from this shift is those with impaired vision.
To facilitate shifting the burden of interpreting behavior from the human to the computer, the notion of adaptive interfaces has emerged. Ongoing and future work of this research team involves developing the methodologies and tools necessary to implement adaptive, multimodal human-computer interfaces that are personalized for individual users representing a full spectrum of visual capabilities.
Julie A. Jacko Georgia Institute of Technology, USA
Tel: +1 404 894 2342