
    Automatic zooming interface for tangible augmented reality applications (2012)

    View/Open
    12642824_vrcai2012-lee_a-autozoom.pdf (1.377Mb)
    Type of Content
    Conference Contributions - Published
    UC Permalink
    http://hdl.handle.net/10092/8835
    
    Publisher's DOI/URI
    https://doi.org/10.1145/2407516.2407518
    
    Publisher
    University of Canterbury. Human Interface Technology Laboratory
    Collections
    • Engineering: Conference Contributions [2338]
    Authors
    Lee, G.A.
    Bai, H.
    Billinghurst, Mark
    Abstract

    Tangible Augmented Reality (AR) interfaces use physical objects as a medium for interacting with virtual objects. In many cases, they track physical objects using computer vision techniques to attach corresponding virtual objects to them. However, when a user tries to have a closer look at the virtual content, the tracking can fail as the viewpoint gets too close to the physical object. To prevent this, we propose an automatic zooming method that helps users achieve a closer view of the scene without losing tracking. By updating the zoom factor based on the distance between the viewpoint and the target object, a natural and intuitive zooming interaction is achieved. In a user study evaluating the technique, we found that the proposed method is not only effective but also easy and natural to use.
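
    The abstract states the rule only at a high level: drive the zoom factor from the distance between the viewpoint and the tracked physical object. The paper's actual mapping and parameters are not reproduced on this page, so the Python sketch below is just one plausible reading of that rule; the distance thresholds (near_dist, far_dist), the zoom cap (max_zoom), the linear interpolation, and applying the factor by narrowing the virtual camera's field of view are all illustrative assumptions, not the authors' implementation.

from math import dist

def auto_zoom_factor(camera_pos, target_pos,
                     near_dist=0.15, far_dist=0.40, max_zoom=3.0):
    """Map the camera-to-marker distance (metres) to a zoom factor.

    Assumed behaviour, not taken from the paper: no zoom beyond
    far_dist, zoom rising linearly to max_zoom as the viewpoint
    approaches near_dist, so the user never has to bring the camera
    close enough for marker tracking to fail.
    """
    d = dist(camera_pos, target_pos)
    if d >= far_dist:
        return 1.0            # far away: render the scene unmagnified
    if d <= near_dist:
        return max_zoom       # clamp so the magnification stays bounded
    # Linear interpolation between 1.0 (at far_dist) and max_zoom (at near_dist).
    t = (far_dist - d) / (far_dist - near_dist)
    return 1.0 + t * (max_zoom - 1.0)

def zoomed_fov(base_fov_deg, zoom):
    # One common way to apply the factor: narrow the virtual camera's field of view.
    return base_fov_deg / zoom

if __name__ == "__main__":
    camera = (0.0, 0.0, 0.25)   # viewpoint 25 cm from the tracked marker
    marker = (0.0, 0.0, 0.0)
    z = auto_zoom_factor(camera, marker)
    print(f"zoom factor {z:.2f}, field of view {zoomed_fov(60.0, z):.1f} degrees")

    In an AR renderer such a function would be re-evaluated every frame from the pose reported by the marker tracker, which is what makes the zooming interaction automatic rather than requiring explicit user input.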

    Citation
    Lee, G.A., Bai, H., Billinghurst, M. (2012) Automatic zooming interface for tangible augmented reality applications. Singapore: 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry (VRCAI'12), 2-4 Dec 2012. Proceedings, 9-12.
    This citation is automatically generated and may be unreliable. Use as a guide only.
    Keywords
    tangible augmented reality; interaction method; zooming interface
    ANZSRC Fields of Research
    08 - Information and Computing Sciences::0806 - Information Systems::080602 - Computer-Human Interaction
    08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing
    Rights
    “Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.”
    https://hdl.handle.net/10092/17651

    Related items

    Showing items related by title, author, creator and subject.

    • Poster: Physically-Based Natural Hand and Tangible AR Interaction for Face-to-Face Collaboration on a Tabletop 

      Piumsomboon, T.; Clark, A.; Umakatsu, A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012)
      In this paper, we present an AR framework that allows natural hand and tangible AR interaction for physically-based interaction and environment awareness to support face-to-face collaboration using Microsoft Kinect. Our ...
    • Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality 

      Piumsomboon, T.; Altimira, D.; Kim, H.; Clark, A.; Lee, G.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2014)
      In order for natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used need to be shown to support precise interaction, and the gestures used proven to be easy to understand and perform. ...
    • Real-time Visual Representations for Mixed Reality Remote Collaboration 

      Gao L; Bai H; Piumsomboon T; Lee G; Lindeman RW; Billinghurst, Mark (2017)
      We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale work space. By combining a low-resolution 3D point-cloud ...