3D Natural Hand Interaction for AR Applications

Type of content
Conference Contributions - Published
Publisher
University of Canterbury. Human Interface Technology Laboratory.
Date
2008
Authors
Lee, M.
Green, R.
Billinghurst, M.
Abstract

In this paper we describe a computer vision-based 3D hand tracking system for a multimodal Augmented Reality (AR) interface. The natural hand interaction method consists of four steps: (1) skin colour segmentation, (2) feature point finding, (3) hand direction calculation, and (4) simple collision detection, based on a short finger ray, between the user's hand and augmented objects. The resulting fingertip tracking accuracy varied from 3 mm to 20 mm depending on the distance between the user's hand and the stereo camera. We describe how this hand tracking is applied in three AR applications that merge gesture and speech input.
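Step (4), the short-finger-ray collision test, can be illustrated with a minimal geometric sketch: cast a short ray from the tracked fingertip along the estimated hand direction and test it against a sphere-shaped virtual object. The function name, the 5 cm default ray length, and the sphere approximation are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def finger_ray_hits_sphere(fingertip, direction, center, radius, ray_len=0.05):
    """Return True if a short ray from the fingertip along the hand
    direction intersects a sphere-shaped virtual object.

    Names and the 5 cm ray length are illustrative assumptions;
    all quantities are in metres.
    """
    d = direction / np.linalg.norm(direction)          # unit hand direction
    # Parameter of the ray point closest to the sphere centre,
    # clamped to the short ray's extent [0, ray_len].
    t = np.clip(np.dot(center - fingertip, d), 0.0, ray_len)
    closest = fingertip + t * d
    return bool(np.linalg.norm(center - closest) <= radius)

# Usage: fingertip 4 cm in front of a 3 cm-radius object, pointing at it
tip = np.array([0.0, 0.0, 0.0])
obj = np.array([0.0, 0.0, 0.04])
print(finger_ray_hits_sphere(tip, np.array([0.0, 0.0, 1.0]), obj, 0.03))  # True
```

Clamping the ray parameter keeps the interaction volume small, so only objects near the fingertip register as touched, which matches the paper's use of a *short* finger ray rather than an unbounded pointing ray.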

Citation
Lee, M., Green, R., Billinghurst, M. (2008) 3D Natural Hand Interaction for AR Applications. Lincoln, New Zealand: 23rd International Conference Image and Vision Computing New Zealand, 26-28 Nov 2008.
Keywords
Natural hand interaction, augmented reality, multimodal interface, vision-based interaction