Comparing Pointing and Drawing for Remote Collaboration (2013)
Type of Content: Conference Contributions - Published
Publisher: University of Canterbury. Human Interface Technology Laboratory
In this research, we explore using pointing and drawing in a remote collaboration system. Our application allows a local user with a tablet to communicate with a remote expert on a desktop computer. We compared performance in four conditions: (1) Pointers on Still Image, (2) Pointers on Live Video, (3) Annotation on Still Image, and (4) Annotation on Live Video. We found that drawing annotations required fewer inputs on the expert's side and imposed less cognitive load on the local worker. In a follow-on study we compared conditions (2) and (4) using a more complicated task. We found that pointing input requires good verbal communication to be effective, and that drawing annotations need to be erased after each step of a task is completed.
Citation: Kim, S., Lee, G., Sakata, N., Vartiainen, E., Billinghurst, M. (2013) Comparing Pointing and Drawing for Remote Collaboration. Adelaide, Australia: International Symposium on Mixed and Augmented Reality (ISMAR), 1-4 Oct 2013. 1-6.
Keywords: video conferencing; augmented reality
ANZSRC Fields of Research: 10 - Technology::1005 - Communications Technologies::100509 - Video Communications
08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing::080111 - Virtual Reality and Related Simulation
Related items (by title, author, creator and subject):
Gao, L.; Bai, H.; Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, Mark (2017) We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale work space. By combining a low-resolution 3D point-cloud ...
Tait, M.; Tsai, T.; Sakata, N.; Billinghurst, Mark; Vartiainen, E. (University of Canterbury. Electrical and Computer Engineering; University of Canterbury. Human Interface Technology Laboratory, 2013) This paper describes an AR system for remote collaboration using a captured 3D model of the local user's scene. In the system a remote user can manipulate the scene independently of the view of the local user and add ...
Kim, S.; Lee, G.; Sakata, N.; Duenser, A.; Vartiainen, E.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2013) In this research, we explore how different types of augmented gesture communication cues can be used under different view sharing techniques in a remote collaboration system. In a pilot study, we compared four conditions: ...