    Multimodal Speech-Gesture Interaction with 3D Objects in Augmented Reality Environments (2010)

    Full text: thesis_fulltext.pdf (1.661 MB)
    Type of Content
    Theses / Dissertations
    UC Permalink
    http://hdl.handle.net/10092/4094
    http://dx.doi.org/10.26021/2223
    
    Thesis Discipline
    Computer Science
    Degree Name
    Doctor of Philosophy
    Publisher
    University of Canterbury. Department of Computer Science and Software Engineering
    Collections
    • Engineering: Theses and Dissertations [2949]
    Authors
    Lee, Minkyung
    Abstract

    Augmented Reality (AR) makes it possible to interact with virtual and real objects at the same time, since it seamlessly combines the real world with computer-generated content. However, most AR interface research applies general Virtual Reality (VR) interaction techniques without modification. In this research we develop a multimodal interface (MMI) for AR with speech and 3D hand gesture input, together with a multimodal signal fusion architecture, based on observed user behaviour, that provides more effective and natural fusion of the multimodal signals. Speech and 3D vision-based free-hand gestures are used as the multimodal input channels. Two user observation studies were conducted: (1) a Wizard of Oz study, in which we observed how users interacted with our MMI, and (2) gesture modelling, undertaken to explore whether different types of gestures can be described by pattern curves. Based on these observations, we designed our own multimodal fusion architecture and developed an MMI. User evaluations were conducted to assess its usability; we found that the MMI is more efficient than the unimodal interfaces, and that users are more satisfied with it. We also present design guidelines derived from the findings of our user studies.
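    The thesis itself details the fusion architecture; purely as an illustration of the general idea of speech-gesture fusion, one common baseline is to pair each recognized speech command with the gesture closest to it in time within a fixed window. The sketch below is hypothetical (the event names, `fuse` function, and 1.5-second window are illustrative assumptions, not taken from the thesis):

    ```python
    from dataclasses import dataclass

    @dataclass
    class InputEvent:
        modality: str   # "speech" or "gesture" (hypothetical labels)
        value: str      # recognized word or gesture label
        t: float        # timestamp in seconds

    def fuse(events, window=1.5):
        """Pair each speech command with the gesture closest in time,
        provided the two fall within `window` seconds of each other."""
        speech = [e for e in events if e.modality == "speech"]
        gestures = [e for e in events if e.modality == "gesture"]
        commands = []
        for s in speech:
            nearby = [g for g in gestures if abs(g.t - s.t) <= window]
            if nearby:
                g = min(nearby, key=lambda g: abs(g.t - s.t))
                commands.append((s.value, g.value))
        return commands

    events = [
        InputEvent("gesture", "point:objectA", 0.9),
        InputEvent("speech", "move", 1.2),
        InputEvent("gesture", "drag:left", 1.4),
        InputEvent("speech", "delete", 5.0),  # no gesture within the window
    ]
    print(fuse(events))
    ```

    A fixed time window is only the simplest possible alignment rule; the thesis instead derives its fusion design from how users actually interleave speech and gesture, which the Wizard of Oz study was designed to observe.
    
    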

    Keywords
    augmented reality; multimodal interface; natural hand gesture; gesture-speech input; multimodal fusion
    Rights
    Copyright Min Kyung Lee
    https://canterbury.libguides.com/rights/theses

    Related items

    Showing items related by title, author, creator and subject.

    • An Evaluation of an Augmented Reality Multimodal Interface Using Speech and Paddle Gestures 

      Irawati, S.; Green, S.; Billinghurst, Mark; Duenser, A.; Ko, H. (University of Canterbury. Human Interface Technology Laboratory., 2006)
      This paper discusses an evaluation of an augmented reality (AR) multimodal interface that uses combined speech and paddle gestures for interaction with virtual objects in the real world. We briefly describe our AR ...
    • Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality 

      Piumsomboon, T.; Altimira, D.; Kim, H.; Clark, A.; Lee, G.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2014)
      In order for natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used need to be shown to support precise interaction, and the gestures used proven to be easy to understand and perform. ...
    • Freeze view touch and finger gesture based interaction methods for handheld augmented reality interfaces 

      Bai, H.; Lee, G.A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012)
      Interaction techniques for handheld mobile Augmented Reality (AR) often focus on device-centric methods based around touch input. However, users may not be able to easily interact with virtual objects in mobile AR scenes ...
    • SUBMISSIONS
    • Research Outputs
    • UC Theses
    • CONTACTS
    • Send Feedback
    • +64 3 369 3853
    • ucresearchrepository@canterbury.ac.nz
    • ABOUT
    • UC Research Repository Guide
    • Copyright and Disclaimer