Tag it!: AR annotation using wearable sensors (2015)
Type of Content: Conference Contributions - Published
Publisher: University of Canterbury. Human Interface Technology Laboratory
Authors: Nassani, A., Bai, H., Lee, G., Billinghurst, M.
In this paper we describe a wearable system that allows people to place 3D virtual tags in the space around them and interact with those tags. The system combines two wearable technologies: a head-worn wearable computer (Google Glass) and a chest-worn depth sensor (Google Project Tango). The Google Glass generates and displays virtual information to the user, while the Tango provides robust indoor position tracking for the Glass. The Tango enables spatial awareness of the surrounding world using several motion sensors, including a 3D depth sensor, an accelerometer, and a motion-tracking camera. Together, these devices allow users to create a virtual tag via voice input and then register it to a physical object or position in 3D space as an augmented annotation. We describe the design and implementation of the system, user feedback, research implications, and directions for future work.
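The core registration step described above — fixing a voice-created tag at a position in 3D space using the device's tracked pose — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the 4x4 pose matrix, the `AnnotationStore` class, and the method names are assumptions made for the sake of the example.

```python
# Hypothetical sketch of world-anchored AR annotation (assumed API, not
# the paper's code). A position tracker such as Tango supplies a 4x4
# device-to-world pose; a tag selected in the device frame is converted
# to world coordinates once, so it stays fixed as the user moves.

def transform_point(pose, point):
    """Apply a 4x4 row-major pose matrix to a 3D point (homogeneous w=1)."""
    x, y, z = point
    return tuple(
        pose[r][0] * x + pose[r][1] * y + pose[r][2] * z + pose[r][3]
        for r in range(3)
    )

class AnnotationStore:
    """Holds world-anchored tags: once registered, a tag's position is
    independent of any later device motion."""

    def __init__(self):
        self.tags = []

    def place_tag(self, text, device_pose, point_in_device_frame):
        # Convert the selected point from the device frame to the world
        # frame at the moment of creation, then store it.
        world_point = transform_point(device_pose, point_in_device_frame)
        self.tags.append({"text": text, "position": world_point})
        return world_point

# Example: device translated 1 m along x; tag selected 2 m ahead (+z).
pose = [
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]
store = AnnotationStore()
pos = store.place_tag("meeting here", pose, (0, 0, 2))
print(pos)  # (1, 0, 2)
```

In a real system the pose would come from the tracker's API each frame, and rendering would transform stored world positions back into the current view; the one-time device-to-world conversion at creation time is what makes the annotation "stick" to the physical location.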
Citation: Nassani, A., Bai, H., Lee, G., Billinghurst, M. (2015) Tag it!: AR annotation using wearable sensors. Kobe, Japan: SIGGRAPH Asia 2015 Symposium on Mobile Graphics and Interactive Applications, 2-5 Nov 2015. Article No. 12, 1-4.