3D Natural Hand Interaction for AR Applications (2008)
Type of Content: Conference Contributions - Published
Publisher: University of Canterbury. Human Interface Technology Laboratory.
Authors: Lee, M., Green, R., Billinghurst, M.
In this paper we describe a computer vision-based 3D hand tracking system for a multimodal Augmented Reality (AR) interface. We have developed a 3D vision-based natural hand interaction method consisting of four steps: (1) skin colour segmentation, (2) feature point finding, (3) hand direction calculation, and (4) simple collision detection, based on a short finger ray, for interaction between the user’s hand and augmented objects. The resulting fingertip tracking accuracy varied from 3 mm to 20 mm depending on the distance between the user’s hand and the stereo camera. We describe how this hand tracking is applied in three AR applications that combine gesture and speech input.
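The four-step pipeline in the abstract can be illustrated with a minimal sketch of two of its stages: skin colour segmentation (step 1) and short-finger-ray collision detection (step 4). This is a hypothetical NumPy sketch for illustration only; the thresholds, function names, and the spherical object bound are assumptions, not the authors' implementation.

```python
import numpy as np

def skin_mask_hsv(hsv_image, lower=(0, 40, 60), upper=(25, 180, 255)):
    """Binary skin mask from an HSV image via simple per-channel
    thresholding (illustrative bounds, not the paper's values)."""
    lower = np.array(lower, dtype=np.uint8)
    upper = np.array(upper, dtype=np.uint8)
    return np.all((hsv_image >= lower) & (hsv_image <= upper), axis=-1)

def finger_ray_hits_sphere(fingertip, direction, center, radius, ray_len=0.02):
    """Collision test between a short ray cast from the fingertip along
    the estimated hand direction and a sphere bounding an augmented
    object (a 20 mm ray length is an assumed example value)."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    # Sample points along the short ray and check each against the sphere.
    for t in np.linspace(0.0, ray_len, 10):
        p = np.asarray(fingertip, dtype=float) + t * d
        if np.linalg.norm(p - np.asarray(center, dtype=float)) <= radius:
            return True
    return False
```

For example, a fingertip at the origin pointing along +z would register a collision with a 5 mm sphere centred 15 mm ahead, but not with one 100 mm away, since the ray only extends 20 mm.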
Citation: Lee, M., Green, R., Billinghurst, M. (2008) 3D Natural Hand Interaction for AR Applications. Lincoln, New Zealand: 23rd International Conference Image and Vision Computing New Zealand, 26-28 Nov 2008.
This citation is automatically generated and may be unreliable. Use as a guide only.