ERCIM News No. 57, April 2004
SPECIAL THEME
 

An Intuitive Game in an Intelligent Ubiquitous Environment

by Jaana Leikas, Hanna Strömberg, Antti Väätänen and Luc Cluitmans


A new group game, developed for an intelligent environment by VTT Information Technology, explores how people move as they interact with the application and with each other during play. The user interface blends into the surroundings, engages several human senses and offers a rich experience.

Conventional virtual reality games try to make the user interface transparent or invisible through the use of head-mounted displays and data gloves. We have created a new kind of group game that involves not only human-computer interaction, but also human-human interaction in an interactive virtual space. In our game, Nautilus, the player's point of view represents a step forward in game design. The solution offers players a highly personal experience by letting them control the game with their own body movements. The user interface blends into the surroundings and integrates the players' movements into the game, thus weakening the boundaries between the room and the interactive virtual space. The players do not wear any virtual reality devices.

Designing the Intelligent Environment
The Nautilus game is designed for location-based entertainment (LBE) venues such as amusement parks. A game session is expected to last only five minutes so that a large number of groups can experience it each day, and the game is aimed at small groups of players aged between 8 and 13 years, with no previous experience of computer games required.

The environment is based on intelligent movement sensing. The metaphor for controlling the user interface is simple and the interaction is familiar to everyone: a pressure-sensitive floor allows the users to control a virtual vehicle (a submarine) both by making movements with their body and by moving in different directions in physical space. The intelligent game environment combines the pressure-sensitive floor system, a real-time 3D-graphics engine and special-effects devices within a teamwork application.

The intelligent environment allows several players to move freely on the floor at the same time and to work as a group. The players have no predetermined roles in the game or specific positions in the room, and they need no special clothing or virtual reality devices. Moving as a group in different directions on the floor moves the vehicle in the corresponding directions in the virtual world. Waving arms up and down or jumping rapidly makes the submarine ascend, while standing still makes it descend. Together these movements allow the players to steer in any direction in the virtual world.
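As a rough illustration of this control scheme, the following Python sketch shows how aggregate floor-pressure data might be turned into submarine commands. It is not the Nautilus implementation: the function name, the thresholds and the assumed 7-by-7 tile layout are invented purely for the example.

# Minimal sketch, not the actual Nautilus code. Names, thresholds and the
# assumed 7x7 tile layout are for illustration only.
def submarine_command(tile_pressures, previous_pressures, activity_threshold=0.3):
    """tile_pressures / previous_pressures: dicts mapping (row, col) -> load in [0, 1]."""
    total = sum(tile_pressures.values()) or 1.0

    # Horizontal steering: the group's centre of pressure, relative to the
    # middle of the floor, gives the direction the submarine moves in.
    mean_row = sum(r * p for (r, c), p in tile_pressures.items()) / total
    mean_col = sum(c * p for (r, c), p in tile_pressures.items()) / total
    heading = (mean_col - 3.0, mean_row - 3.0)   # centre of an assumed 7x7 grid

    # Vertical control: large pressure changes between two floor scans
    # (jumping, shifting weight while waving) mean ascend; standing still means descend.
    activity = sum(abs(p - previous_pressures.get(pos, 0.0))
                   for pos, p in tile_pressures.items()) / max(len(tile_pressures), 1)
    vertical = "ascend" if activity > activity_threshold else "descend"

    return heading, vertical

In this sketch the group's centre of pressure steers the vehicle horizontally, while the amount of pressure change between two consecutive floor scans decides between ascending and descending.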

The development work relied heavily on a User-Centred Design (UCD) approach. The scenarios created during the design characterised the players, the desired activities, events and effects of the virtual world, and the intelligent environment for intuitive movement sensing. The design solutions were evaluated with users at every step of the iterative process, using qualitative methods that included interviews, observation and video recordings. Usability experts also carried out inspections of the prototypes during different phases of the design. Finally, a test game was created and tested with end-users to verify the functionality and relevance of the floor system and the user interface of the application.

Natural and Intuitive User Interface
Through the user interface, users can interact intuitively and naturally with the environment and with each other. The interface solution comprises a pressure-sensitive floor system, a real-time 3D-graphics engine, and special-effects devices (sounds and lights). The system is based on 49 pressure-sensitive floor tiles and one host computer (a PC, which also runs the application software). Each of the floor tiles contains a microcontroller board and four sensors (one at each corner), giving 196 sensors in total.

Figure 1: The intelligent environment for intuitive movement sensing. In this environment it is possible to test human-machine as well as human-human interaction in a group by utilising movement detection.
Figure 2: Observing a group of children playing the game.

Since the tile controllers do not take any action unless explicitly requested by the host, the host is responsible for timing the sampling. The driver software that retrieves the data from the controllers has several tasks. Its primary task is to retrieve the sensor data and make it available to the other software components running on the host PC. Several additional issues require special care, however: accurate timing, controller-malfunction checking, timeout checking, preventing different controllers from sending data simultaneously, defining an API through which programs can access the driver, and so on.
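The following Python sketch illustrates such a host-driven polling loop. It is not the actual VTT driver: read_tile, the tile identifiers and the timing constants are assumptions made for this example.

import time

# Sketch of a host-driven polling loop; not the actual VTT driver.
TILE_IDS = range(49)     # one controller per floor tile
SAMPLE_PERIOD = 0.02     # seconds per full floor scan (assumed value)
TIMEOUT = 0.002          # per-controller response timeout (assumed value)

def poll_floor(read_tile, deliver_frame):
    """read_tile(tile_id, timeout) is assumed to return the tile's four
    corner-sensor values or raise TimeoutError; deliver_frame hands the
    collected data to the rest of the application."""
    while True:
        start = time.monotonic()
        frame, faulty = {}, []
        # Controllers answer only when asked, so querying them one at a time
        # prevents simultaneous transmissions and lets the host own the timing.
        for tile_id in TILE_IDS:
            try:
                frame[tile_id] = read_tile(tile_id, TIMEOUT)
            except TimeoutError:
                faulty.append(tile_id)   # note controllers that did not respond
        deliver_frame(frame, faulty)
        # Sleep off whatever is left of the sample period to keep the rate steady.
        time.sleep(max(0.0, SAMPLE_PERIOD - (time.monotonic() - start)))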

Along with the rest of the system, a program was developed that displays a map of the floor, allowing one to track people moving on it. The map shows each sensor represented by a coloured square, where the colour indicates the pressure change measured by that sensor.
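A minimal version of such a floor map could look like the following Python sketch, which maps each sensor's pressure change to a colour intensity. The 14-by-14 sensor grid (49 tiles with four corner sensors each) and the colour scale are assumptions for illustration, not details of the VTT tool.

# Illustrative sketch only: mapping each sensor's pressure change to a colour.
def floor_map_colours(sensor_deltas, grid_size=14):
    """sensor_deltas: dict mapping (row, col) -> pressure change in [0, 1]."""
    rows = []
    for r in range(grid_size):
        row = []
        for c in range(grid_size):
            delta = min(max(sensor_deltas.get((r, c), 0.0), 0.0), 1.0)
            intensity = int(255 * delta)
            row.append((intensity, 0, 255 - intensity))  # blue (idle) -> red (loaded)
        rows.append(row)
    return rows  # per-square RGB fills, ready for any simple GUI or image library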

Figure 3: Visualisation of floor tile activity.

Future Work
The intelligent environment developed is suitable for testing applications that support interactions in a group. Compared with other methods of human-machine interaction, this intelligent movement-sensing environment allows the user to use his or her whole body in an original but natural way. This kind of user interface is a step forward in the challenging work of creating systems and applications with hidden user interfaces.

The environment will be further developed and tested with different applications by human-factors experts at VTT Information Technology's virtual reality studio. Future development will draw on both virtual reality and ubiquitous computing design ideas, with the aim of creating an interactive and natural environment with a shared player experience. Several business opportunities can be envisaged in this area, such as solutions for intuitive learning and for evaluating teamwork capabilities.

Please contact:
Jaana Leikas, VTT Information Technology
E-mail: jaana.leikas@vtt.fi

 
