An Evaluation of an Augmented Reality Multimodal Interface Using Speech and Paddle Gestures (2006)
Type of Content: Conference Contributions - Published
Publisher: University of Canterbury. Human Interface Technology Laboratory.
This paper discusses an evaluation of an augmented reality (AR) multimodal interface that uses combined speech and paddle gestures to interact with virtual objects in the real world. We briefly describe our AR multimodal interface architecture and our multimodal fusion strategies, which combine time-based and domain semantics. We then present results from a user study comparing multimodal input with gesture-only input. The results show that combining speech and paddle gestures improves the efficiency of user interaction. Finally, we offer design recommendations for developing other multimodal AR interfaces.
Citation: Irawati, S., Green, S., Billinghurst, M., Duenser, A., Ko, H. (2006) An Evaluation of an Augmented Reality Multimodal Interface Using Speech and Paddle Gestures. Hangzhou, China: 16th International Conference on Artificial Reality and Telexistence (ICAT 2006), 29 Nov-2 Dec 2006. Lecture Notes in Computer Science (LNCS), 4282, Advances in Artificial Reality and Tele-Existence, 272-283.
Keywords: multimodal interaction; paddle gestures; augmented reality; speech input; gesture input; evaluation
Related items (by title, author, creator and subject):
"Move the Couch Where?": Developing an Augmented Reality Multimodal Interface
Irawati, S.; Green, S.; Billinghurst, Mark; Duenser, A.; Ko, H. (University of Canterbury. Human Interface Technology Laboratory, 2006) This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtual furniture in a virtual room using a ...

Freeze view touch and finger gesture based interaction methods for handheld augmented reality interfaces
Bai, H.; Lee, G.A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012) Interaction techniques for handheld mobile Augmented Reality (AR) often focus on device-centric methods based around touch input. However, users may not be able to easily interact with virtual objects in mobile AR scenes ...

Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality
Piumsomboon, T.; Altimira, D.; Kim, H.; Clark, A.; Lee, G.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2014) In order for natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used need to be shown to support precise interaction, and the gestures used proven to be easy to understand and perform. ...