Hanheide, Marc and Bauckhage, Christian and Sagerer, Gerhard (2005) Combining environmental cues & head gestures to interact with wearable devices. In: 7th International Conference on Multimodal Interfaces, October 4–6, 2005, Trento, Italy.
Item Type: Conference or Workshop Item (Paper)
Full text: Hanheide2005-Combining_environmental_cues__head_gestures_to_interact_with_wearable_devices.pdf - Whole Document (restricted to repository staff only)
Divisions: College of Science > School of Computer Science
Abstract: As wearable sensors and computing hardware are becoming a reality, new and unorthodox approaches to seamless human-computer interaction can be explored. This paper presents the prototype of a wearable, head-mounted device for advanced human-machine interaction that integrates speech recognition and computer vision with head gesture analysis based on inertial sensor data. We focus on the innovative idea of integrating visual and inertial data processing for interaction. Fusing head gestures with results from visual analysis of the environment provides rich vocabularies for human-machine communication because it turns the environment itself into an interface: if objects or items in the surroundings are associated with system activities, head gestures can trigger commands whenever the corresponding object is being looked at. We explain the algorithmic approaches applied in our prototype and present experiments that highlight its potential for assistive technology. Apart from pointing out a new direction for seamless interaction in general, our approach provides a new and easy-to-use interface for disabled and paralyzed users in particular.
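The fusion idea described in the abstract — binding (gazed-at object, head gesture) pairs to system commands — can be illustrated with a minimal sketch. This is not the authors' implementation; the object names, gestures, and commands below are invented purely for illustration.

```python
# Hypothetical sketch of gaze/gesture fusion as described in the abstract.
# An object recognized in the wearer's view, combined with a head gesture
# detected from inertial data, selects a command; all bindings are invented.

# Environment objects associated with system activities
OBJECT_COMMANDS = {
    "lamp": {"nod": "toggle_light"},
    "door": {"nod": "open_door", "shake": "lock_door"},
}

def fuse(gazed_object, head_gesture):
    """Return the command bound to (object, gesture), or None if unbound."""
    return OBJECT_COMMANDS.get(gazed_object, {}).get(head_gesture)

print(fuse("door", "shake"))  # -> lock_door
print(fuse("lamp", "shake"))  # -> None (no binding for this pair)
```

The point of the sketch is only that the environment acts as the interface: the same gesture means different things depending on what is being looked at.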
Date Deposited: 30 Nov 2012 10:53