Real-time Visual Representations for Mixed Reality Remote Collaboration (2017)

Type of Content
Conference Contributions - Published
Abstract
We present a prototype Mixed Reality (MR) system with a hybrid interface to support remote collaboration between a local worker and a remote expert in a large-scale workspace. By combining a low-resolution 3D point cloud of the environment surrounding the local worker with a high-resolution real-time view of small focused details, the remote expert can see a virtual copy of the local workspace with independent viewpoint control. Meanwhile, the expert can also check the current actions of the local worker through a real-time feedback view. We conducted a pilot study to evaluate the usability of our system by comparing the performance of three different interface designs, which show the real-time view as a 2D first-person view, a 2D third-person view, or a 3D point-cloud view. We found no difference in average task performance time between the three interfaces, but there was a difference in user preference.
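The entry does not include the authors' implementation, so the following is only a minimal sketch of the hybrid-interface idea: a 3D point-cloud window the expert can orbit independently, alongside a 2D real-time feedback window. It assumes Open3D and OpenCV (neither is named in the paper), and synthesizes both the point cloud and the video frames in place of the live capture and streaming pipeline.

```python
# Illustrative sketch only, not the authors' system. Assumes Open3D for the
# expert's independently controllable point-cloud view and OpenCV for the
# 2D real-time feedback view; data is synthesized so the sketch runs standalone.
import numpy as np
import open3d as o3d
import cv2

# Low-resolution point cloud of the local workspace (in the real system,
# refreshed from depth-sensor scans of the environment around the worker).
points = np.random.uniform(-1.0, 1.0, size=(5000, 3))
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(points)

# The expert's viewpoint is decoupled from the worker's camera: this window
# allows free orbit/zoom over the virtual copy of the workspace.
vis = o3d.visualization.Visualizer()
vis.create_window(window_name="Remote expert: workspace point cloud")
vis.add_geometry(pcd)

try:
    while True:
        # High-resolution real-time feedback view of the worker's focus area
        # (stand-in frame; a real system would stream this from the worker).
        frame = (np.random.rand(240, 320, 3) * 255).astype(np.uint8)
        cv2.imshow("Remote expert: real-time feedback view", frame)

        vis.poll_events()
        vis.update_renderer()
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    vis.destroy_window()
    cv2.destroyAllWindows()
```

The two windows stand in for the two halves of the hybrid interface: the point-cloud view updates only when a new scan arrives, while the feedback view refreshes every frame, which is what lets the expert inspect the scene freely without losing sight of the worker's current actions.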
Citation
Gao L, Bai H, Piumsomboon T, Lee G, Lindeman RW, Billinghurst M (2017). Real-time Visual Representations for Mixed Reality Remote Collaboration. ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence & Eurographics Symposium on Virtual Environments.
Keywords
Human-centered computing; Mixed / augmented reality; Virtual reality; Collaborative and social computing design and evaluation methods
ANZSRC Fields of Research
08 - Information and Computing Sciences::0801 - Artificial Intelligence and Image Processing::080111 - Virtual Reality and Related Simulation
08 - Information and Computing Sciences::0806 - Information Systems::080602 - Computer-Human Interaction
47 - Language, communication and culture::4701 - Communication and media studies::470102 - Communication technology and digital media studies
Related items
Showing items related by title, author, creator and subject.
- Poster: Physically-Based Natural Hand and Tangible AR Interaction for Face-to-Face Collaboration on a Tabletop
  Piumsomboon, T.; Clark, A.; Umakatsu, A.; Billinghurst, Mark (University of Canterbury. Human Interface Technology Laboratory, 2012) In this paper, we present an AR framework that allows natural hand and tangible AR interaction for physically-based interaction and environment awareness to support face-to-face collaboration using Microsoft Kinect. Our ...
- A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014
  Dey A; Billinghurst, Mark; Lindeman RW; Swan E (2018) Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review 10 years of the most influential AR user ...
- A Comparison of Surface and Motion User-Defined Gestures for Mobile Augmented Reality.
  Dong Z; Piumsomboon T; Zhang J; Clark AJ; Bai H; Lindeman RW (ACM, 2020)