Tag it!: AR annotation using wearable sensors

Type of content
Conference Contributions - Published
Publisher
University of Canterbury. Human Interface Technology Laboratory
Date
2015
Authors
Nassani, A.
Bai, H.
Lee, G.
Billinghurst, Mark
Abstract

In this paper we describe a wearable system that allows people to place and interact with 3D virtual tags in the space around them. The system combines two wearable technologies: a head-worn wearable computer (Google Glass) and a chest-worn depth-sensing device (Google Tango). The Google Glass generates and displays virtual information to the user, while the Tango provides robust indoor position tracking for the Glass. The Tango enables spatial awareness of the surrounding world through a combination of sensors, including a 3D depth sensor, an accelerometer, and a motion-tracking camera. Used together, these devices allow users to create a virtual tag via voice input and register it to a physical object or position in 3D space as an augmented annotation. We describe the design and implementation of the system, user feedback, research implications, and directions for future work.

Citation
Nassani, A., Bai, H., Lee, G., &amp; Billinghurst, M. (2015). Tag it!: AR annotation using wearable sensors. SIGGRAPH Asia 2015 Symposium on Mobile Graphics and Interactive Applications, Kobe, Japan, 2-5 Nov 2015. Article No. 12, 1-4.
Keywords
Augmented reality, virtual tag, wearable technology
ANZSRC fields of research
Field of Research::08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing::080111 - Virtual Reality and Related Simulation