Robust Upper Body Pose Recognition in Unconstrained Environments Using Haar-Disparity

Type of content
Theses / Dissertations
Thesis discipline
Computer Science
Degree name
Master of Science
Publisher
University of Canterbury. Computer Science and Software Engineering
Date
2008
Authors
Chu, Cheng-Tse
Abstract

In this research, an approach is proposed for the robust tracking of upper body movement in unconstrained environments using a Haar-Disparity algorithm together with a novel 2D silhouette projection algorithm. A cascade of boosted Haar classifiers is used to detect human faces in video images, and a disparity map is then used to establish the 3D locations of the detected faces. Based on this information, anthropometric constraints are used to define a semi-spherical interaction space for upper body poses. This constrained region serves to prune the search space as well as to validate user poses. Haar-Disparity improves on traditional skin-manifold tracking by relaxing constraints on clothing, background and illumination. The 2D silhouette projection algorithm provides three orthogonal views of the 3D objects, allowing upper limbs to be tracked in 2D space rather than by manipulating noisy 3D data directly. This thesis also proposes a complete optimal set of interactions for very large interactive displays. Experimental evaluation covers the performance of alternative camera positions and orientations, pointing accuracy, direct manipulative gestures, flag semaphore emulation, and principal axes. As a minor part of this research, the usability of interacting using only arm gestures is also evaluated against the ISO 9241-9 standard. The results suggest that the proposed algorithm and optimal set of interactions are useful for interacting with large displays.
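
A minimal sketch of the Haar-Disparity step described above, assuming OpenCV, a rectified stereo pair, and hypothetical camera parameters (focal_px, baseline_m) and reach constraint. It illustrates the general technique, not the thesis's implementation:

    import cv2
    import numpy as np

    ARM_REACH_M = 0.8  # hypothetical anthropometric reach constraint

    def locate_face_3d(left_bgr, right_bgr, focal_px, baseline_m):
        """Detect a face with a boosted Haar cascade, then triangulate
        its 3D position from a block-matching disparity map."""
        left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
        right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = cascade.detectMultiScale(left, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None

        # StereoBM returns fixed-point disparities scaled by 16.
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = stereo.compute(left, right).astype(np.float32) / 16.0

        x, y, w, h = faces[0]
        patch = disparity[y:y + h, x:x + w]
        valid = patch[patch > 0]
        if valid.size == 0:
            return None
        d = float(np.median(valid))  # robust against disparity outliers

        # Pinhole triangulation: Z = f * B / d, with X and Y taken from
        # the face centre relative to the image centre.
        z = focal_px * baseline_m / d
        cx, cy = x + w / 2.0, y + h / 2.0
        return np.array([(cx - left.shape[1] / 2.0) * z / focal_px,
                         (cy - left.shape[0] / 2.0) * z / focal_px,
                         z])

    def in_interaction_space(point_3d, face_3d, reach=ARM_REACH_M):
        """Validate a candidate limb point against the semi-spherical
        interaction space centred on the face (prunes the search space).
        Assumes the camera looks along +z, so points in front of the
        user have a smaller z than the face."""
        offset = np.asarray(point_3d) - np.asarray(face_3d)
        return offset[2] <= 0 and np.linalg.norm(offset) <= reach

A similar sketch of the 2D silhouette projection step, binning the 3D points that survive the interaction-space check into three orthogonal binary views; the grid bounds and resolution here are illustrative assumptions:

    def silhouette_projections(points, lo, hi, res=64):
        """Project 3D points onto three orthogonal planes, yielding
        front (x-y), top (x-z) and side (z-y) binary silhouettes."""
        points = np.asarray(points, dtype=np.float32)
        lo = np.asarray(lo, dtype=np.float32)
        hi = np.asarray(hi, dtype=np.float32)
        cells = np.clip(((points - lo) / (hi - lo) * (res - 1)).astype(int),
                        0, res - 1)
        views = {v: np.zeros((res, res), dtype=np.uint8)
                 for v in ("front", "top", "side")}
        views["front"][cells[:, 1], cells[:, 0]] = 255  # project out z
        views["top"][cells[:, 2], cells[:, 0]] = 255    # project out y
        views["side"][cells[:, 1], cells[:, 2]] = 255   # project out x
        return views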

Keywords
computer vision, gesture recognition, human-computer interaction
Rights
Copyright Cheng-Tse Chu