    Evaluating the Augmented Reality Human-Robot Collaboration System

    File
    12611910_Eval AR human robot collab - PRINTED paper.pdf (724.0Kb)
    Author
    Green, S.A.
    Chase, J.G.
    Chen, X.Q.
    Billinghurst, M.
    Date
    2008
    Permanent Link
    http://hdl.handle.net/10092/2210

    Abstract
    This paper presents an experimental comparison of three user interface techniques for interacting with a mobile robot located remotely from the user. A typical means of operating a robot in such a situation is to teleoperate it using visual cues from a camera that displays the robot’s view of its work environment. With only this single ego-centric view, however, the operator often has difficulty maintaining awareness of the robot in its surroundings. A multi-modal system has therefore been developed that allows the remote human operator to view the robot in its work environment through an Augmented Reality (AR) interface. The operator can use spoken dialog, reach into the 3D graphical representation of the work environment, and discuss the robot’s intended actions to create a true collaboration. This study compares the typical ego-centric view with two versions of an AR interaction system in an experiment remotely operating a simulated mobile robot. One AR interface provides an immediate response from the remotely located robot; in contrast, the Augmented Reality Human-Robot Collaboration (AR-HRC) System interface enables the user to discuss and review a plan with the robot prior to execution. The AR-HRC interface was the most effective, increasing accuracy by 30% with tighter variation and reducing the number of close calls in operating the robot by a factor of roughly three. It thus provides the means to maintain spatial awareness and gives users the feeling of working in a true collaborative environment.
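    As a purely illustrative aside (not taken from the paper), the sketch below contrasts the two interaction styles the abstract describes: an immediate-response interface that executes each command as soon as it is issued, and an AR-HRC-style interface in which the user proposes a plan, reviews it in dialog with the robot, and only then triggers execution. All class and method names (SimulatedRobot, propose_plan, review) are hypothetical and do not reflect the authors' implementation.

        class SimulatedRobot:
            """Hypothetical stand-in for the simulated mobile robot used in the study."""

            def __init__(self):
                self.position = (0.0, 0.0)

            def drive_to(self, waypoint):
                # Move directly to the waypoint; an immediate-response interface
                # would call this for every user command as it arrives.
                self.position = waypoint
                return self.position


        class ARHRCInterface:
            """Sketch of the review-before-execute flow described in the abstract."""

            def __init__(self, robot):
                self.robot = robot
                self.pending_plan = []

            def propose_plan(self, waypoints):
                # The user reaches into the 3D AR scene and lays out intended actions.
                self.pending_plan = list(waypoints)
                return self.pending_plan

            def review(self, approved):
                # Spoken-dialog step: the plan is discussed and either approved or
                # discarded before the robot moves at all.
                if not approved:
                    self.pending_plan = []
                    return None
                for waypoint in self.pending_plan:
                    self.robot.drive_to(waypoint)
                self.pending_plan = []
                return self.robot.position


        robot = SimulatedRobot()
        interface = ARHRCInterface(robot)
        interface.propose_plan([(1.0, 0.0), (1.0, 2.0)])
        print(interface.review(approved=True))   # -> (1.0, 2.0)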

    Subjects
    • Fields of Research::290000 Engineering and Technology::290500 Mechanical and Industrial Engineering::290501 Mechanical engineering
    • Fields of Research::280000 Information, Computing and Communication Sciences::280100 Information Systems::280105 Interfaces and presentation (excl. 280104)
    • Fields of Research::280000 Information, Computing and Communication Sciences::280200 Artificial Intelligence and Signal and Image Processing::280209 Intelligent robotics
    Collections
    • Engineering: Conference Contributions [2012]
    Rights
    https://hdl.handle.net/10092/17651
