Natural hand interaction for augmented reality.
Thesis Discipline: Computer Science
Degree Grantor: University of Canterbury
Degree Name: Doctor of Philosophy
Despite the increasing prevalence of Augmented Reality (AR) interfaces, there is still a lack of interaction techniques that allow full utilization of the medium. Natural hand interaction has the potential to offer these affordances; however, it has not yet been well explored. The aim of this thesis is to improve the understanding of natural hand interaction and ultimately to create a novel natural hand interaction technique that enhances the user experience of AR. To better understand natural hand interaction, two prototype AR systems featuring environmental awareness and physical simulation were developed, one for interaction on a tabletop and the other for a mobile tablet setting. Observations and feedback collected from public demonstrations of these systems showed that users felt that interacting physically using their hands and other tangible objects was natural and intuitive. Following this, a guessability study was conducted to elicit hand gestures for AR and to obtain qualitative feedback from users wearing a video see-through head-mounted display (HMD). From the results, a user-defined gesture set was created to guide the design of natural hand interaction for AR. Building on this deeper understanding and set of design guidelines, a gesture interface was developed that enabled hand tracking and gesture recognition from depth-sensing input. An AR framework that supports natural interaction as the primary input, called G-SIAR, was created, and a novel direct-manipulation natural hand interaction technique, Grasp-Shell (G-Shell), was developed. This interaction technique was validated in a usability study by comparing it to a traditional indirect-manipulation gesture-and-speech interaction technique, Gesture-Speech (G-Speech). The study yielded insights into the strengths and weaknesses of each interaction technique.
Depending on the task, we found differences in performance, usability, and user preference between G-Shell's direct interaction, in which the user physically manipulates the object they are interacting with, and G-Speech's indirect interaction, in which the user manipulates the object remotely using gesture and speech commands. We concluded that the two interaction techniques complement each other and should be offered together. The primary contributions of this thesis include a literature review of AR and its interaction techniques, the implementation of two AR systems and findings from their public demonstrations, findings from a guessability study on hand gestures for AR, the design and development of a gesture interface and a multimodal AR framework, and the design and evaluation of two natural interaction techniques, G-Shell and G-Speech. This research contributes knowledge of natural hand interaction for AR and lays a new foundation for research into AR interaction techniques.