Mobile Augmented Reality: Free-hand Gesture-based Interaction
Thesis Discipline: Computer Science
Degree Grantor: University of Canterbury
Degree Name: Doctor of Philosophy
The primary goal of this thesis is to design, implement, and evaluate novel interaction techniques for enhancing Augmented Reality (AR) on mobile platforms. The motivation for this research comes from the need for more efficient and intuitive interaction approaches for mobile AR systems. Using a mobile AR framework that includes natural feature tracking and natural gesture interfaces, developed for mobile and wearable devices, we discuss the advantages and limitations of free-hand gestures for mobile AR interaction, and evaluate all of our designs with users on common AR manipulation tasks.
In this thesis, we apply established Human-Computer Interaction (HCI) methodologies to address these questions. In particular, we developed two types of markerless gesture-based input technology, in self-contained and client-server designs, for handheld mobile AR implementations, and one multi-modal interface for a wearable mobile AR implementation. We studied each interface separately through user studies on our mobile AR systems, analyzing the usability of the proposed free-hand gesture-based interfaces and investigating how the interaction could be improved.
The results of our evaluations show that natural gesture interaction offers intuitive interaction metaphors for manipulating mobile AR scenes while being as easy to learn and use as traditional touch-based methods. Although free-hand gesture-based interaction is not well suited to long periods of use because of physical fatigue, it provides a more fun and engaging user experience. Its performance can also be improved by combining it with touch-based input for more precise operation during task refinement. Our contribution supports interface designers in making informed design decisions for mobile AR systems with free-hand gesture-based interfaces.