ERCIM News No.47, October 2001

Ambient Interfaces for Distributed Work Groups

by Tom Gross

The Computer-Supported Cooperative Work group of the Fraunhofer Institute for Applied Information Technology develops ambient interfaces that bridge the gap between the physical and the electronic world. These interfaces support distributed work groups with a shared environment of mutual information and orientation, providing a basis for smooth coordination.

Knowledge workers typically work in both the physical world with physical artefacts and in the electronic world with electronic artefacts. At the same time, they work at various places and cooperate with colleagues at the same or at distant places. As knowledge workers’ activities and plans span media, locations and work groups, they need to spend a considerable amount of time and energy on orientation and coordination.

We are developing a hybrid setting that bridges the gap between the physical and electronic worlds and provides a framework for orientation in both. In this hybrid setting, ambient interfaces are the basis for a shared environment of mutual information, fostering intuitive and adequate behaviour of distributed participants in a cooperative setting. The ambient interfaces act as sensors in the physical world, capturing various types of information such as the position and orientation of a user, noise in a coffee room, or movement in the hallway. They also act as indicators, presenting digital information in the physical environment of the user by means of projections on the wall and physical artefacts such as fish tanks, propellers, or robots (see Figure).
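The sensor-to-indicator flow described above can be pictured as a simple publish/subscribe event pipeline. The following Python sketch is purely illustrative; the class names, event kinds, and handler shown here are invented for this example and do not reflect the actual TOWER infrastructure:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Event:
    kind: str     # eg 'position', 'noise', 'movement'
    source: str   # which sensor raised the event
    value: float  # measured intensity

class EventBus:
    """Minimal notification hub: sensors publish, indicators subscribe."""
    def __init__(self):
        self.subscribers: dict[str, list[Callable[[Event], None]]] = {}

    def subscribe(self, kind: str, handler: Callable[[Event], None]) -> None:
        self.subscribers.setdefault(kind, []).append(handler)

    def publish(self, event: Event) -> None:
        for handler in self.subscribers.get(event.kind, []):
            handler(event)

# An indicator reacts in the physical world; here it only records its action.
log = []
bus = EventBus()
bus.subscribe("movement", lambda e: log.append(f"release bubbles (intensity {e.value})"))
bus.publish(Event("movement", "hallway-cam", 0.8))
# log now holds ["release bubbles (intensity 0.8)"]
```

In the real setting the subscribed handler would drive physical hardware, such as the fish tank's bubble motor, rather than append to a list.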

In the first generation of ambient interfaces, the sensors were mainly built from existing technology. The MOVY system was used to capture the position and orientation of a user (see ERCIM News No. 36, January 1999); a WebCam with microphones and standard software was used to capture noise; and LEGO’s ‘Vision Command’ was used to capture movement. To present the information, some simple binary ambient interfaces were developed, which can be switched on and off. Relay boards with eight relays each made it possible to control eight binary ambient interfaces. For instance, the fish tank starts releasing bubbles, or a propeller starts blowing air into the user’s face, when certain events occur. Depending on the frequency of the releases, the bubbles start to aggregate: when interesting events occur at high frequency, bubbles from former releases are still rising, and the bubbling as a whole gets rougher, indicating that a lot is going on. Additionally, the release of the bubbles produces subtle sounds from the motor and the bubbles themselves, so the user can perceive the information both visually and acoustically. The propeller addresses the user’s haptic sense by blowing air gently into the user’s face.
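A relay board of this kind amounts to eight independent on/off channels. The sketch below models the board state as a single bitmask byte; this is an assumption for illustration only, and the board's actual wire protocol is not shown (the `_write` method is a stub):

```python
class RelayBoard:
    """Eight binary channels, modelled as one bitmask byte.

    Bit i == 1 means relay i is closed, ie the attached
    binary ambient interface (bubbles, propeller, ...) is on.
    """
    def __init__(self):
        self.state = 0

    def switch(self, channel: int, on: bool) -> None:
        if not 0 <= channel < 8:
            raise ValueError("board has channels 0-7")
        if on:
            self.state |= 1 << channel
        else:
            self.state &= ~(1 << channel)
        self._write(self.state)

    def _write(self, byte: int) -> None:
        pass  # stub: would transmit `byte` to the hardware

board = RelayBoard()
board.switch(0, True)   # fish tank: start releasing bubbles
board.switch(3, True)   # propeller: start blowing air
board.switch(0, False)  # fish tank off again
# board.state is now 0b00001000: only channel 3 is active
```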

User with TOWER world; fish tank; propeller; AwareBot.

This first generation was deployed in users’ offices and in a public coffee room. From the users’ feedback, three major requirements for the second generation of ambient interfaces were derived: customisability (ie, developers and users should be able to build and adapt ambient interfaces easily and rapidly); aesthetics (ie, users should like and appreciate the presence of ambient interfaces); and low cost (ie, it should be possible to deploy large numbers of ambient interfaces in individual offices and public places in office buildings).

In the second generation we developed AwareBots, ambient interfaces that present awareness information in the gestalt of robots. Using LEGO’s Mindstorms Robotics Invention System, project members designed and built a broad variety of AwareBots. For instance, the AwareBot in the right picture of the figure can indicate the login of a colleague by lifting its hat, and the manipulation of a document by rotating its body. To log into the system and set the status to ‘Available’, the user simply presses the arm of the AwareBot. Other AwareBots can roll their eyes when the user’s Web pages are accessed, or capture movement outside the owner’s office. Additionally, a WAP interface was developed that allows mobile users to query for information and enter information manually while on the move.
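Conceptually, each AwareBot maps electronic events to physical gestures, and a physical input (pressing the arm) back to an electronic event. A hypothetical sketch of that two-way mapping; all event names and gestures here are illustrative, not taken from the actual system:

```python
# Electronic world -> physical world: events trigger robot gestures.
GESTURES = {
    "user.login": "lift hat",
    "document.modified": "rotate body",
    "webpage.accessed": "roll eyes",
}

def react(event_kind: str) -> str:
    """Return the gesture an AwareBot performs for an incoming event."""
    return GESTURES.get(event_kind, "idle")

# Physical world -> electronic world: pressing the robot's arm
# generates a login event and sets the user's status.
def arm_pressed(user: str) -> dict:
    return {"kind": "user.login", "user": user, "status": "Available"}

print(react("user.login"))          # lift hat
print(react("coffee.brewed"))       # idle (unknown events do nothing)
print(arm_pressed("tom")["status"]) # Available
```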

For the design of the third generation, the existing designs will be evaluated and possibly consolidated. As an overall output, we plan to produce basic guidelines for the design of ambient interfaces that capture and present information in the user’s physical environment in order to provide shared guidance. The ambient interfaces described here are based on the TOWER system and its event and notification infrastructure (cf. ERCIM News No. 42, July 2000). They are being developed in the IST-10846 project TOWER, partly funded by the EU, with the partners Aixonix, blaxxun interactive AG, FhG, UCL, and WS Atkins.


Please contact:
Tom Gross - Fraunhofer Institute for Applied Information Technology
Tel: +49 2241 14 2091
E-mail: tom.gross@fit.fraunhofer.de