Automatic zooming interface for tangible augmented reality applications (2012)

Type of Content
Conference Contributions - Published
Publisher
University of Canterbury. Human Interface Technology Laboratory
Collections
Abstract
Tangible Augmented Reality (AR) interfaces use physical objects as a medium for interacting with virtual objects. In many cases, they track physical objects using computer vision techniques in order to attach corresponding virtual objects to them. However, when a user tries to take a closer look at the virtual content, tracking can fail as the viewpoint gets too close to the physical object. To prevent this, we propose an automatic zooming method that gives users a closer view of the scene without losing tracking. The method updates the zoom factor based on the distance between the viewpoint and the target object, yielding a natural and intuitive zooming interaction. In a user study evaluating the technique, we found that the proposed method is not only effective but also easy and natural to use.
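As a rough illustration of the distance-based zoom update described in the abstract, the sketch below maps the camera-to-marker distance to a zoom factor that ramps up as the viewpoint nears the distance at which vision-based tracking tends to fail. This is a minimal sketch under assumed parameters; the names (zoom_factor, MIN_TRACK_DIST, COMFORT_DIST, MAX_ZOOM) and the linear mapping are hypothetical and are not the paper's actual formulation.

```python
import numpy as np

# Hypothetical parameters; the paper's actual mapping is not reproduced here.
MIN_TRACK_DIST = 0.15   # metres: closer than this, marker tracking tends to fail
COMFORT_DIST   = 0.40   # metres: at or beyond this, no extra zoom is applied
MAX_ZOOM       = 4.0    # upper bound on the virtual zoom factor

def camera_to_target_distance(camera_pos, target_pos):
    """Euclidean distance between the tracked viewpoint and the target object."""
    return float(np.linalg.norm(np.asarray(camera_pos) - np.asarray(target_pos)))

def zoom_factor(distance):
    """Map viewpoint-target distance to a zoom factor.

    At COMFORT_DIST or farther, the scene renders at natural scale (1.0).
    As the camera approaches MIN_TRACK_DIST, the zoom ramps linearly toward
    MAX_ZOOM, giving a close-up view of the virtual content without the
    camera physically crossing the distance at which tracking breaks down.
    """
    if distance >= COMFORT_DIST:
        return 1.0
    # Normalised proximity in [0, 1]: 0 at COMFORT_DIST, 1 at MIN_TRACK_DIST.
    t = (COMFORT_DIST - max(distance, MIN_TRACK_DIST)) / (COMFORT_DIST - MIN_TRACK_DIST)
    return 1.0 + t * (MAX_ZOOM - 1.0)

# Example: each frame, scale the rendered virtual content by the computed factor.
d = camera_to_target_distance([0.0, 0.0, 0.25], [0.0, 0.0, 0.0])
print(f"distance = {d:.2f} m -> zoom = {zoom_factor(d):.2f}x")
```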
Citation
Lee, G.A., Bai, H., & Billinghurst, M. (2012). Automatic zooming interface for tangible augmented reality applications. In Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry (VRCAI '12), Singapore, 2-4 Dec 2012, pp. 9-12.
Keywords
tangible augmented reality; interaction method; zooming interface
ANZSRC Fields of Research
08 - Information and Computing Sciences::0806 - Information Systems::080602 - Computer-Human Interaction
08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing
Rights
“Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.”
Related items
Showing items related by title, author, creator and subject.
- Poster: Physically-Based Natural Hand and Tangible AR Interaction for Face-to-Face Collaboration on a Tabletop
  Piumsomboon, T.; Clark, A.; Umakatsu, A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012) In this paper, we present an AR framework that allows natural hand and tangible AR interaction for physically-based interaction and environment awareness to support face-to-face collaboration using Microsoft Kinect. Our ...
- Grasp-Shell vs Gesture-Speech: A Comparison of Direct and Indirect Natural Interaction Techniques in Augmented Reality
  Piumsomboon, T.; Altimira, D.; Kim, H.; Clark, A.; Lee, G.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2014) In order for natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used need to be shown to support precise interaction, and the gestures used proven to be easy to understand and perform. ...
- Real-time Visual Representations for Mixed Reality Remote Collaboration
  Gao, L.; Bai, H.; Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, Mark (2017) We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale work space. By combining a low-resolution 3D point-cloud ...