A multipurpose test-bed for integrating user interface and sensor technologies has been developed, based on a client-server architecture. Various interaction modalities (speech recognition, 3-D audio, pointing, wireless handheld-PC-based control and interaction, sensor interaction, etc.) are implemented as servers that encapsulate and expose commercial and research software packages. The system allows integrated user interaction with large and small displays using speech commands combined with pointing, spatialized audio, and other modalities. Simultaneous, independent speech recognition for two users is supported; users may be equipped with conventional acoustic microphones or new body-coupled microphones.
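The abstract describes the architecture only in prose; the sketch below is a minimal, hypothetical illustration of the "modalities as servers" idea: each interaction modality (speech, pointing, ...) runs as a small server that streams events, and an integration client connects to all of them and merges the streams. The port numbers, message fields, and JSON-over-TCP framing are illustrative assumptions, not the paper's actual protocol.

```python
# Hypothetical sketch (not the paper's implementation): modalities as servers
# streaming JSON events over TCP, fused by a single integration client.
import json
import socket
import threading
import time


def modality_server(name, port, events):
    """Serve one modality: accept one client and stream its events as JSON lines."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        for ev in events:
            conn.sendall((json.dumps({"modality": name, **ev}) + "\n").encode())
            time.sleep(0.05)
    srv.close()


def integration_client(ports):
    """Connect to every modality server and merge their event streams."""
    fused, lock = [], threading.Lock()

    def reader(port):
        with socket.create_connection(("127.0.0.1", port)) as s:
            for line in s.makefile():
                with lock:
                    fused.append(json.loads(line))

    threads = [threading.Thread(target=reader, args=(p,)) for p in ports]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return fused


if __name__ == "__main__":
    # Two example modalities: speech commands from two users, plus pointing.
    speech = [{"user": 1, "command": "select"}, {"user": 2, "command": "zoom"}]
    pointing = [{"user": 1, "x": 0.42, "y": 0.17}]

    servers = [
        threading.Thread(target=modality_server, args=("speech", 9001, speech)),
        threading.Thread(target=modality_server, args=("pointing", 9002, pointing)),
    ]
    for s in servers:
        s.start()
    time.sleep(0.2)  # give the servers a moment to start listening

    for event in integration_client([9001, 9002]):
        print(event)
```

In this reading, wrapping each commercial or research package behind its own server lets the integration client combine, say, a speech command with a concurrent pointing event without depending on any package's internal API.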
Multimodal HCI Integration
SAE Technical Papers
World Aviation Congress & Exposition; 1999
19.10.1999
Conference paper
English
British Library Conference Proceedings | 1999
Intelligent Human Tracking Based on Multimodal Integration
British Library Online Contents | 2012
Multimodal integration in passengers transportation: a user-oriented development
British Library Conference Proceedings | 1994
Integration of Intermodal and Multimodal Considerations into the Planning Process
British Library Conference Proceedings | 2000