Collaborating with a Mobile Robot: An Augmented Reality Multimodal Interface (2008)
Type of Content: Conference Contributions - Published
Publisher: University of Canterbury. Human Interface Technology Laboratory.
University of Canterbury. Mechanical Engineering.
Authors: Green, S.A., Chen, X.Q., Billinghurst, M., Chase, J.G.
We have created an infrastructure that allows a human to collaborate in a natural manner with a robotic system. In this paper we describe our system and its implementation with a mobile robot. In our prototype the human communicates with the mobile robot using natural speech and gestures, for example, by selecting a point in 3D space and saying “go here” or “go behind that”. The robot responds using speech so that the human is able to understand its intentions and beliefs. Augmented Reality (AR) technology is used to facilitate natural use of gestures and to provide a common 3D spatial reference for both the robot and the human, thus grounding communication and maintaining spatial awareness. This paper first discusses related work, then gives a brief overview of AR and its capabilities. The architectural design we have developed is outlined, and a case study is discussed.
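The interaction the abstract describes, combining a spoken deictic command with a gesture-selected 3D point, can be sketched in code. This is a minimal illustrative sketch, not the authors' implementation; all names (`fuse`, `Point3D`, `RobotGoal`) and the 0.5 m clearance for "behind" are assumptions introduced here.

```python
# Hypothetical sketch of fusing a spoken command with a gesture-selected
# 3D point, in the spirit of the multimodal interface described above.
# All names and parameters are illustrative assumptions, not the authors' API.
from dataclasses import dataclass


@dataclass
class Point3D:
    x: float
    y: float
    z: float


@dataclass
class RobotGoal:
    target: Point3D
    relation: str  # "at" for "go here", "behind" for "go behind that"


def fuse(utterance: str, gesture_point: Point3D, robot_pos: Point3D) -> RobotGoal:
    """Resolve a deictic utterance against the 3D point the user selected."""
    if "behind" in utterance.lower():
        # Place the goal on the far side of the referenced point, along the
        # ray from the robot through the selected point (assumed heuristic).
        dx = gesture_point.x - robot_pos.x
        dy = gesture_point.y - robot_pos.y
        norm = (dx * dx + dy * dy) ** 0.5 or 1.0
        offset = 0.5  # assumed clearance in metres
        return RobotGoal(
            Point3D(gesture_point.x + offset * dx / norm,
                    gesture_point.y + offset * dy / norm,
                    gesture_point.z),
            "behind")
    # Default: "go here" -> navigate to the selected point itself.
    return RobotGoal(gesture_point, "at")


goal = fuse("go behind that", Point3D(2.0, 0.0, 0.0), Point3D(0.0, 0.0, 0.0))
print(goal.relation, round(goal.target.x, 2))  # → behind 2.5
```

The shared 3D point is what AR supplies in the paper's system: both parties refer to the same spatial location, so a phrase like "behind that" can be grounded unambiguously.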
Citation: Green, S.A., Chen, X.Q., Billinghurst, M., Chase, J.G. (2008) Collaborating with a Mobile Robot: An Augmented Reality Multimodal Interface. Seoul, Korea: 17th International Federation of Automatic Control World Congress 2008, 6-11 Jul 2008.