W3C Seminar on Multimodal Web Applications for Embedded Systems
W3C is developing standards that support multiple modes of interaction: aural, visual and tactile. Users can then access the Web by voice, or by hand via a keypad, keyboard, mouse or stylus; they can also listen to spoken prompts and audio, and view information on graphical displays.
The multimodal Web transforms how people interact with applications:
- In your hand: portable access to multimedia communication, news and entertainment services
- In your car: an integrated dashboard system offering hands-free navigation and infotainment services
- In your home: remote control of everyday appliances, including the television, video recorder and fridge
- In your office: a choice of how you interact with your computer, using a pen, keyboard or spoken commands
W3C aims to bring Web technologies to new environments such as mobile devices, automotive telematics and ambient intelligence. Many innovative multimodal Web applications have already been developed, some of which will be showcased at the W3C seminar on multimodal Web applications for embedded systems in Toulouse on 21 June 2005.
This seminar is funded by the Multimodal Web Interaction (MWeb) project, financed by the European Commission's FP6 IST Programme (unit INFSO-E1: Interfaces). Attendance at the seminar is free and open to the public.
Link:
Toulouse MWeb seminar page: http://www.w3.org/2005/03/MWeb-seminar.html