    Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality (2014)

View/Open
12652683_paper138-cr.pdf (1.175 MB)
    Type of Content
    Conference Contributions - Other
    UC Permalink
    http://hdl.handle.net/10092/11090
    
    Publisher's DOI/URI
    https://doi.org/10.1109/ISMAR.2014.6948411
    
    Publisher
    University of Canterbury. Human Interface Technology Laboratory
    Collections
    • Engineering: Conference Contributions [2341]
    Authors
    Piumsomboon, T.
    Altimira, D.
    Kim, H.
    Clark, A.
    Lee, G.
Billinghurst, Mark
    Abstract

    In order for natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used need to be shown to support precise interaction, and the gestures used must be easy to understand and perform. Recent research has explored free-hand gesture interaction with AR interfaces, but few formal evaluations have been conducted with such systems. In this paper we introduce and evaluate two natural interaction techniques: the free-hand gesture based Grasp-Shell, which provides direct physical manipulation of virtual content; and the multi-modal Gesture-Speech, which combines speech and gesture for indirect natural interaction. These techniques support object selection, six degree-of-freedom movement, and uniform scaling, as well as physics-based interaction such as pushing and flinging. We conducted a study evaluating and comparing Grasp-Shell and Gesture-Speech on fundamental manipulation tasks. The results show that Grasp-Shell outperforms Gesture-Speech in both efficiency and user preference for translation and rotation tasks, while Gesture-Speech is better for uniform scaling. The two techniques could be complementary interaction methods in a physics-enabled AR environment, as this combination potentially provides both control and interactivity in one interface. We conclude by discussing implications and future directions of this research.

    Citation
    Piumsomboon, T., Altimira, D., Kim, H., Clark, A., Lee, G., & Billinghurst, M. (2014). Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality. In 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 10-12 Sep 2014, pp. 73-82.
    This citation is automatically generated and may be unreliable. Use as a guide only.
    Keywords
    Augmented reality; natural interaction; multi-modal interface
    ANZSRC Fields of Research
    08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing
    08 - Information and Computing Sciences::0806 - Information Systems::080602 - Computer-Human Interaction
    Rights
    https://hdl.handle.net/10092/17651

    Related items

    Showing items related by title, author, creator and subject.

    • Poster: Physically-Based Natural Hand and Tangible AR Interaction for Face-to-Face Collaboration on a Tabletop 

      Piumsomboon, T.; Clark, A.; Umakatsu, A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012)
      In this paper, we present an AR framework that allows natural hand and tangible AR interaction for physically-based interaction and environment awareness to support face-to-face collaboration using Microsoft Kinect. Our ...
    • A Comparison of Surface and Motion User-Defined Gestures for Mobile Augmented Reality. 

      Dong Z; Piumsomboon T; Zhang J; Clark AJ; Bai H; Lindeman RW (ACM, 2020)
    • Real-time Visual Representations for Mixed Reality Remote Collaboration 

      Gao L; Bai H; Piumsomboon T; Lee G; Lindeman RW; Billinghurst, Mark (2017)
      We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale work space. By combining a low-resolution 3D point-cloud ...
    • SUBMISSIONS
    • Research Outputs
    • UC Theses
    • CONTACTS
    • Send Feedback
    • +64 3 369 3853
    • ucresearchrepository@canterbury.ac.nz
    • ABOUT
    • UC Research Repository Guide
    • Copyright and Disclaimer