Human Computer Interaction
ERCIM News No. 46, July 2001

Interface Development Toolkits for Non-visual and Switch-based Interaction

by Constantine Stephanidis and Anthony Savidis

Commercial user interface development toolkits do not support interaction techniques other than mouse- and keyboard-based graphical interaction. To fill this gap, two toolkits have been developed at ICS-FORTH: the HAWK toolkit for non-visual interaction and the SCANLIB toolkit for switch-based interaction.

The HAWK toolkit provides a set of standard non-visual interaction objects and interaction techniques that have been specifically designed to support high-quality non-visual interaction. HAWK is appropriate for developing not only interfaces targeted to blind users, but also interfaces for a variety of situations in which dialogues not relying on visual communication and standard input devices are required (eg, driving, telephone-based applications, auditory interaction for home control). A key notion in HAWK is that of a container interaction object. The HAWK toolkit has a single generic container class that does not provide any pre-designed interaction metaphor, but supplies appropriate presentation attributes through which alternative representations can be created. The container class has four attributes that enable each distinct container instance to be given a metaphoric substance by appropriately combining messages and sound feedback (both names and sound effects can carry metaphoric meaning). In addition to generic containers, HAWK provides a comprehensive collection of conventional interaction objects directly supporting non-visual dialogue, namely menus (exclusive choice selector objects), lists (multiple choice selector objects), buttons (push-button analogy for direct command execution), toggles (radio-button analogy for on/off state control), edit fields (single-line text input) and text reviewers (multi-line read-only text editors with mark-up facilities).
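The generic container notion can be illustrated with a small sketch. The attribute names below (entry/exit message and sound) are hypothetical stand-ins for the four presentation attributes mentioned above, not the actual HAWK API:

```python
# Hypothetical sketch of HAWK's generic container: a single class with no
# built-in metaphor, given "metaphoric substance" by combining spoken
# messages and sound effects. Attribute names are illustrative only.

class NonVisualContainer:
    def __init__(self, entry_message, entry_sound, exit_message, exit_sound):
        # Four presentation attributes: message/sound pairs announced
        # when the user's dialogue focus enters or leaves the container.
        self.entry_message = entry_message
        self.entry_sound = entry_sound
        self.exit_message = exit_message
        self.exit_sound = exit_sound
        self.children = []          # contained interaction objects

    def add(self, obj):
        self.children.append(obj)
        return obj

    def on_enter(self, speech, audio):
        # A "room" metaphor and a "book" metaphor differ only in the
        # messages and sounds attached to the same generic class.
        audio.play(self.entry_sound)
        speech.say(self.entry_message)

    def on_exit(self, speech, audio):
        audio.play(self.exit_sound)
        speech.say(self.exit_message)
```

For instance, a container representing a "room" might combine a door-opening sound effect with the message "entering the main room", while the same class, with different attribute values, could represent a page of a book.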

HAWK also supports a variety of interaction techniques: for output, synthesized speech, Braille (2-cell transitory Braille and 40-cell Braille) and digitised audio; for input, the standard keyboard, a joystick used for gestures independently of visual interaction, a touch-tablet (for programmable commands via associated regions) and voice recognition. The HAWK toolkit provides all the programming features met in currently available toolkits, such as hierarchical object composition, dynamic instantiation, callback registration, and event handling. The navigation dialogue enables the blind user to move easily within the interface structure, composed of organizations of containers and contained objects, through the provision of multi-modal control facilities. For instance, visiting contained objects is possible through joystick-based gestures, voice commands, keyboard shortcuts, or by pressing specific regions of the touch-tablet. Container objects may contain other container objects realizing different metaphoric representations, thus supporting the fusion of different metaphors in the context of non-visual interactive applications.
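The multi-modal navigation control described above can be sketched as an event-dispatch mechanism in which several input events from different modalities are bound to the same navigation action, registered via callbacks. All names here are illustrative assumptions, not the HAWK API:

```python
# Hypothetical sketch of multi-modal navigation control in the spirit of
# HAWK's callback registration: joystick gestures, voice commands,
# keyboard shortcuts and touch-tablet regions all trigger the same
# navigation action. Event and action names are illustrative only.

class NavigationDialogue:
    def __init__(self):
        self.handlers = {}          # action name -> list of callbacks
        self.bindings = {}          # (modality, event) -> action name

    def register(self, action, callback):
        self.handlers.setdefault(action, []).append(callback)

    def bind(self, modality, event, action):
        self.bindings[(modality, event)] = action

    def dispatch(self, modality, event):
        action = self.bindings.get((modality, event))
        if action is None:
            return None             # unbound event: ignored
        for cb in self.handlers.get(action, []):
            cb()
        return action

nav = NavigationDialogue()
# The same "visit next object" action is reachable from four modalities.
for modality, event in [("joystick", "push-right"),
                        ("voice", "next"),
                        ("keyboard", "Ctrl+N"),
                        ("tablet", "region-3")]:
    nav.bind(modality, event, "visit-next")
```

The point of the sketch is that the dialogue logic registers a callback once, while any number of modality-specific bindings can route to it.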

The HAWK toolkit has been used in the development of the AVANTI browser (see ERCIM News no. 41), of the NAUTILUS information kiosk (Project Nautilus, funded by the Hellenic Ministry of Development) and of a non-visual digital library for the Hellenic Blind Association.

Switch-based Interaction
In the SCANLIB interface development toolkit, the basic Windows object library has been augmented with scanning interaction techniques. Interfaces implemented through SCANLIB directly support access by motor-impaired users, as well as access in other situations in which the keyboard and mouse input devices cannot be used. Apart from enabling intra-application interaction control (eg, providing switch-based access to all interface elements of any interactive application), SCANLIB also supports inter-application interaction control (eg, enabling users to move across different applications). In SCANLIB, basic object classes are classified into five categories, each requiring a different dialogue policy to be designed.

The scanning interaction techniques are based on two fundamental actions: SELECT and NEXT. Depending on the type of switch equipment required, four scanning modes are supported.
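The two fundamental actions can be sketched as a minimal scanning loop over a group of interface objects: NEXT moves the highlighter to the next object, SELECT activates the highlighted one. In a time-scanning mode, NEXT events would be generated automatically at a fixed interval rather than by a switch press. This is an illustrative sketch, not the SCANLIB implementation:

```python
# Hypothetical sketch of switch-based scanning: NEXT advances the
# highlighter across a group of objects (wrapping around), SELECT
# returns the currently highlighted object for activation.

class ScanningGroup:
    def __init__(self, objects):
        if not objects:
            raise ValueError("scanning group must contain objects")
        self.objects = objects
        self.index = 0              # currently highlighted object

    @property
    def highlighted(self):
        return self.objects[self.index]

    def next(self):
        # Highlighter wraps around so the user can keep cycling
        # with a single switch.
        self.index = (self.index + 1) % len(self.objects)
        return self.highlighted

    def select(self):
        return self.highlighted
```

With one switch and time scanning, the toolkit would call next() on a timer and map the single switch to select(); with two switches, each action gets its own switch.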

Programming control is provided for the extra attributes introduced by the augmented versions of the Windows objects, enabling objects to differentiate their scanning style according to scanning mode (one of the supported alternatives), time interval (when time scanning is supported), and highlighter presentation parameters.
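Programming control over these extra attributes might look like the following sketch, where an augmented object exposes its scanning mode, time interval, and highlighter parameters for configuration. Attribute and mode names are assumptions for illustration, not the SCANLIB API:

```python
# Hypothetical sketch of the per-object scanning attributes a
# SCANLIB-style augmented Windows object could expose. Names are
# illustrative only.

class AugmentedObject:
    def __init__(self, name):
        self.name = name
        self.scanning_mode = None       # one of the toolkit's scanning modes
        self.time_interval_ms = None    # used only when time scanning applies
        self.highlighter = {}           # highlighter presentation parameters

    def configure_scanning(self, mode, time_interval_ms=None, **highlighter):
        # Objects can differentiate their scanning style individually:
        # mode, timing and highlighter appearance are all per-object.
        self.scanning_mode = mode
        self.time_interval_ms = time_interval_ms
        self.highlighter.update(highlighter)
        return self
```

A button scanned on a timer might then be configured as, say, configure_scanning("time-scanning", time_interval_ms=800, border_color="red"), while a neighbouring object uses a two-switch mode with a different highlighter.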

The SCANLIB toolkit has been used in the development of the AVANTI browser (see ERCIM News no. 41), and of the GRAFIS word processor for people with motor impairments (see ERCIM News no. 38).

Please contact:
Constantine Stephanidis — ICS-FORTH
Tel: +30 81 391741