ERCIM News No. 61, April 2005

W3C Seminar on Multimodal Web Applications for Embedded Systems

W3C is developing standards that support multiple modes of interaction: aural, visual and tactile. The Web then becomes accessible by voice, or by hand via a keypad, keyboard, mouse or stylus. Users can also listen to spoken prompts and audio, and view information on graphical displays.

The multimodal Web transforms the way people interact with applications:

  • In your hand: portable access to multimedia communication, news and entertainment services
  • In your car: an integrated dashboard system offering hands-free navigation and infotainment services
  • In your home: remote control of your everyday appliances, including television, video recorder, fridge, etc.
  • In your office: choose how you interact with your computer, using a pen, keyboard or spoken commands.

W3C aims to bring Web technologies to new environments such as mobile devices, automotive telematics and ambient intelligence. Many innovative multimodal Web applications have already been developed, some of which will be showcased at the W3C seminar on multimodal Web applications for embedded systems in Toulouse on 21 June 2005.

This seminar is funded by the Multimodal Web Interaction (MWeb) project, which is financed by the European Commission's FP6 IST Programme (unit INFSO-E1: Interfaces). Attendance at the seminar is free and open to the public.

Toulouse MWeb seminar page: