Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality (2014)
Type of Content: Conference Contributions - Other
Publisher: University of Canterbury. Human Interface Technology Laboratory
For natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used must be shown to support precise interaction, and the gestures used must be easy to understand and perform. Recent research has explored free-hand gesture interaction with AR interfaces, but few formal evaluations of such systems have been conducted. In this paper we introduce and evaluate two natural interaction techniques: the free-hand gesture based Grasp-Shell, which provides direct physical manipulation of virtual content; and the multi-modal Gesture-Speech, which combines speech and gesture for indirect natural interaction. These techniques support object selection, 6 degree of freedom movement, uniform scaling, as well as physics-based interaction such as pushing and flinging. We conducted a study evaluating and comparing Grasp-Shell and Gesture-Speech on fundamental manipulation tasks. The results show that Grasp-Shell outperforms Gesture-Speech in both efficiency and user preference for translation and rotation tasks, while Gesture-Speech is better for uniform scaling. The two techniques could serve as complementary interaction methods in a physics-enabled AR environment, as their combination potentially provides both control and interactivity in one interface. We conclude by discussing implications and future directions of this research.
Citation: Piumsomboon, T., Altimira, D., Kim, H., Clark, A., Lee, G., Billinghurst, M. (2014) Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality. In 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 10-12 Sep 2014, pp. 73-82.
Keywords: Augmented reality; natural interaction; multi-modal interface
ANZSRC Fields of Research: 08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing
08 - Information and Computing Sciences::0806 - Information Systems::080602 - Computer-Human Interaction
Related items (by title, author, creator and subject):
Poster: Physically-Based Natural Hand and Tangible AR Interaction for Face-to-Face Collaboration on a Tabletop. Piumsomboon, T.; Clark, A.; Umakatsu, A.; Billinghurst, M. (University of Canterbury. Human Interface Technology Laboratory, 2012) In this paper, we present an AR framework that allows natural hand and tangible AR interaction for physically-based interaction and environment awareness to support face-to-face collaboration using Microsoft Kinect. Our ...
A Comparison of Surface and Motion User-Defined Gestures for Mobile Augmented Reality. Dong Z; Piumsomboon T; Zhang J; Clark AJ; Bai H; Lindeman RW (ACM, 2020)
Real-time Visual Representations for Mixed Reality Remote Collaboration. Gao, L.; Bai, H.; Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, M. (2017) We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale work space. By combining a low-resolution 3D point-cloud ...