"Move the Couch Where?": Developing an Augmented Reality Multimodal Interface (2006)
Type of Content: Conference Contributions - Published
Publisher: University of Canterbury. Human Interface Technology Laboratory.
This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtual furniture in a virtual room using a combination of speech and gestures made with a real paddle. Unlike other multimodal AR applications, the multimodal fusion is based on a combination of time-based and semantic techniques to disambiguate a user's speech and gesture input. We describe our AR multimodal interface architecture and discuss how the multimodal inputs are semantically integrated into a single interpretation by considering the input time stamps, the object properties, and the user context.
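As an illustration of the kind of fusion the abstract describes, the sketch below pairs a speech command with a paddle gesture only when the two arrive within a time window (the time-based check) and the referenced object supports the requested action (the semantic check on object properties). This is a hypothetical minimal sketch, not the paper's actual implementation; the event tuples, the `FUSION_WINDOW` threshold, and the property table are all assumptions.

```python
# Hypothetical sketch of time- plus semantics-based multimodal fusion.
# Events are assumed to be (timestamp_seconds, payload) tuples.

FUSION_WINDOW = 1.5  # seconds; assumed pairing threshold, not from the paper


def fuse(speech_event, gesture_event, object_properties):
    """Pair a speech command with a gesture if they are close in time
    and the referenced object supports the requested action."""
    t_speech, command = speech_event
    t_gesture, target = gesture_event
    # Time-based check: inputs must occur within the fusion window.
    if abs(t_speech - t_gesture) > FUSION_WINDOW:
        return None
    # Semantic check: e.g. "move" only applies to movable objects.
    allowed_actions = object_properties.get(target, set())
    if command["action"] not in allowed_actions:
        return None
    return {"action": command["action"], "object": target}


# Example: "move the couch" spoken at t=10.0 s while the paddle points
# at the couch at t=10.4 s; the couch is movable, so the inputs fuse.
props = {"couch": {"move", "rotate"}, "window": {"open"}}
result = fuse((10.0, {"action": "move"}), (10.4, "couch"), props)
print(result)  # {'action': 'move', 'object': 'couch'}
```

Inputs that miss the time window, or that pair an action with an object lacking that property (e.g. "move the window"), are rejected and would fall back to further disambiguation, such as the user-context cues the abstract mentions.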
Citation: Irawati, S., Green, S., Billinghurst, M., Duenser, A., Ko, H. (2006) "Move the Couch Where?": Developing an Augmented Reality Multimodal Interface. Santa Barbara, CA, USA: 5th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 06), 22-25 Oct 2006. Proceedings ISMAR International Symposium on Mixed and Augmented Reality, 183-186.
This citation is automatically generated and may be unreliable. Use as a guide only.
Keywords: multimodal interaction; paddle gestures; augmented reality; speech input; gesture input
Showing items related by title, author, creator and subject.
Irawati, S.; Green, S.; Billinghurst, Mark; Duenser, A.; Ko, H. (University of Canterbury. Human Interface Technology Laboratory, 2006) This paper discusses an evaluation of an augmented reality (AR) multimodal interface that uses combined speech and paddle gestures for interaction with virtual objects in the real world. We briefly describe our AR ...
Lee, Minkyung (University of Canterbury. Department of Computer Science and Software Engineering, 2010) Augmented Reality (AR) offers the possibility of interacting with virtual objects and real objects at the same time, since it combines the real world with computer-generated content seamlessly. However, most AR interface ...
Freeze view touch and finger gesture based interaction methods for handheld augmented reality interfaces. Bai, H.; Lee, G.A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012) Interaction techniques for handheld mobile Augmented Reality (AR) often focus on device-centric methods based around touch input. However, users may not be able to easily interact with virtual objects in mobile AR scenes ...