Wearable Systems for Everyday Use
by Spyros Lalis, Anthony Savidis and Constantine Stephanidis
You have just landed in Heraklion. As you leave the airport you are handed a CityPass, a small gadget hosting a map application for the city, which you touch with your ID-Key and put away in your pocket. A few seconds later your wristwatch beeps with a message: "Welcome to Heraklion - the MapGuide can be controlled using your PDA or your wristwatch".
As you stroll downtown, you shoot several pictures, which are tagged with time and location information retrieved from the GPS on your wristwatch. Storage in the camera begins to run out, and the camera informs you that photos are now being stored in the Assistant in your backpack. At the same time, the Assistant silently uploads the photos to your home repository whenever you pass near a public network access point. You meet a friend at a café and together you browse through the photo collection. You decide to have dinner at an old tavern displayed in one of the pictures. You pull the Notepad out of the backpack and use the CityMap application to display the location of the tavern, using location information from the photo coordinates and your current position provided by the GPS on your wristwatch.
This brief scenario gives a flavour of the 2WEAR project and the vision shared by FORTH, ETHZ (Switzerland), MA Systems & Control (UK), and NOKIA/NRC (Finland) of the future personal computer as an ensemble of small, wearable and physically distributed devices. 2WEAR is a three-year EC-funded project that started on January 1, 2001 with the objective of building such a system, both in terms of software architecture and devices that communicate using short-range radio technology. The main characteristic of the envisioned system is its dynamic extensibility and adaptation, which would allow users to combine devices in a flexible manner according to their current needs, and to exploit the potential of surrounding infrastructure within buildings, vehicles, etc. The 2WEAR system is exploring the following directions in parallel.
Core Runtime Mechanisms
In this new paradigm, the personal computer, rather than being fixed in a box with a well-known amount of hardware resources and a given number of peripherals attached to it, is actually a system that is dynamically composed by bringing together different devices. To convert this physical versatility into tangible flexibility for the application and user, several problems must be addressed at the system level. Besides being a provider and manager of local resources, the runtime now becomes responsible for detecting and exploiting resources that become available via other devices that are added to the system. It must also take corrective actions should such resources become unavailable, due to failures or the user removing devices from the system.
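The runtime behaviour described above can be illustrated with a minimal sketch of a resource registry that tracks devices as they join and leave the ensemble, and falls back to an alternative provider when a resource disappears. All class and device names here are hypothetical, chosen for illustration; they are not the project's actual API.

```python
class ResourceRegistry:
    """Tracks which discovered devices provide which resource kinds."""

    def __init__(self):
        # resource kind -> ordered list of providing devices
        self._providers = {}

    def device_joined(self, device, kinds):
        # Ad-hoc discovery: a new device advertises the resources it offers.
        for kind in kinds:
            self._providers.setdefault(kind, []).append(device)

    def device_left(self, device):
        # Corrective action: forget all resources of a vanished device.
        for providers in self._providers.values():
            if device in providers:
                providers.remove(device)

    def acquire(self, kind):
        # Return the first available provider, or None if the
        # resource kind is currently absent from the ensemble.
        providers = self._providers.get(kind, [])
        return providers[0] if providers else None


registry = ResourceRegistry()
registry.device_joined("camera", ["storage"])
registry.device_joined("backpack-assistant", ["storage", "network"])

print(registry.acquire("storage"))   # prints "camera"
registry.device_left("camera")       # eg the camera's storage is removed
print(registry.acquire("storage"))   # prints "backpack-assistant"
```

The key design point is that applications ask only for a resource *kind*; which physical device answers can change from one call to the next as the ensemble evolves.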
The main system aspects that are being researched to achieve the desired functionality are ad-hoc discovery, flexible remote communication in the form of dialogue channels, and adaptive system services. Adaptive system services correspond to high-level resource abstractions provided to the application programmer, while internally handling low-level runtime issues such as resource discovery, remote resource access, resource switching and compensation in the case of resource loss.
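As a concrete illustration of an adaptive system service, consider the camera scenario from the introduction: a storage abstraction that hides switching and compensation from the application, so that saving a photo succeeds as long as *some* discovered store has room. This is a hedged sketch under assumed names, not the project's actual service interface.

```python
class StoreFullError(Exception):
    """Raised when a physical store has no remaining capacity."""


class Store:
    """A single physical storage device with fixed capacity."""

    def __init__(self, name, capacity):
        self.name, self.capacity, self.items = name, capacity, []

    def write(self, item):
        if len(self.items) >= self.capacity:
            raise StoreFullError(self.name)
        self.items.append(item)


class AdaptiveStorage:
    """Application-facing facade: save() compensates for full or
    missing stores by falling through to the next available one."""

    def __init__(self, stores):
        self.stores = list(stores)

    def save(self, item):
        for store in self.stores:
            try:
                store.write(item)
                return store.name          # report where the item landed
            except StoreFullError:
                continue                   # compensate: try the next store
        raise StoreFullError("no store available")


camera = Store("camera", capacity=2)
assistant = Store("assistant", capacity=10)
storage = AdaptiveStorage([camera, assistant])

# First two photos land on the camera; once it is full, the
# service silently switches over to the backpack Assistant.
locations = [storage.save(f"photo-{i}") for i in range(4)]
```

The application never handles the switch itself; like the camera in the opening scenario, it only learns after the fact that its photos are now being stored elsewhere.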
In dynamic, transient environments, the main architectural objective is to maintain high-quality interaction between users and applications. This principle of continuity emphasises the uninterrupted sequence of dialogue activities and ensures that the human-computer interaction dialogues transform gracefully from the user's perspective. To that end, the User Interaction Framework specifies: (a) how applications communicate with the user, and (b) the mechanisms allowing for the abstraction of input/output (I/O) devices as well as for concurrently running applications to share I/O hardware resources. In practice, this means that framework services specific to human-computer interaction dynamically allocate 'pooled' system resources to user activities according to user preferences.
These mechanisms operate at a level between applications and the runtime, defining an infrastructure of services that take over all decisions relating to the administration of I/O resources and their allocation to active applications. As a result, application developers are able to produce user interfaces that remotely utilise multiple I/O resources (eg audio devices, different types of displays, input devices) over a wireless communication medium. When mobility causes the loss of connection to some I/O resources, or the discovery of new ones, running interfaces dynamically employ the new resources and, where necessary, reallocate existing ones to ensure interaction continuity. In addition, applications continuously inform the user of changes in the interactive space, such as the detection of newly available computing units and/or the loss of connection with devices. In this way, the user can engage or disengage computing units on a task-oriented basis.
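The reallocation behaviour described above can be sketched as a framework that pools output devices and migrates an application's dialogue when its display drops out of radio range. The class and device names are illustrative assumptions, not the framework's real interface.

```python
class InteractionFramework:
    """Allocates pooled displays to application dialogues and
    reallocates them when a display joins or leaves the ensemble."""

    def __init__(self):
        self.displays = []      # pooled output resources, in preference order
        self.assignment = {}    # application -> display currently in use

    def display_found(self, display):
        self.displays.append(display)

    def display_lost(self, display):
        self.displays.remove(display)
        # Interaction continuity: migrate every dialogue that was
        # rendered on the lost display to another available one.
        for app, current in list(self.assignment.items()):
            if current == display:
                self.assignment[app] = self._allocate()

    def open_dialogue(self, app):
        self.assignment[app] = self._allocate()

    def _allocate(self):
        # Simple policy: first display in the pool; a real framework
        # would weigh user preferences and device capabilities.
        return self.displays[0] if self.displays else None


fw = InteractionFramework()
fw.display_found("wristwatch")
fw.display_found("notepad")
fw.open_dialogue("MapGuide")      # dialogue rendered on the wristwatch
fw.display_lost("wristwatch")     # eg the watch moves out of radio range
print(fw.assignment["MapGuide"])  # prints "notepad"
```

From the application's point of view the dialogue simply continues; only the framework knows that it is now being rendered on a different device.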
Project Web Site: http://2wear.ics.forth.gr
Spyros Lalis, ICS-FORTH
Tel: +30 2810 391746