Abstract
During robot teleoperation, misalignments occur when the coordinates of the manipulated end effector, viewed through a remote camera, are incongruent with the fixed coordinates of the hand controller. Such display-control misalignments are frequently encountered in present-day manually controlled robotic systems, such as space station manipulators, mobile manipulators for military operations, and surgical robots, in which the remote worksite is viewed from an exocentric camera. Misalignments are known to degrade operator performance and thus far have been dealt with by extensive operator training or by system-specific compensatory algorithms. The results presented demonstrate that the time to produce translational and rotational movements of an end effector viewed through an exocentric camera varies significantly with misalignment and follows patterns similar to those found in mental rotation studies of misaligned shapes. The authors show that overlaying color-coded augmented-reality movement cues on the end effector and mapping them to a color-coded hand controller significantly shortens the processing time required to translate or rotate a misaligned end effector across a range of angular misalignments. Furthermore, the cues make response times invariant to misalignment for both translational and rotational movements. Although these cues were intended for space station manipulators, they could also be applied in other teleoperation platforms, such as surgical robots or remote ground vehicles.
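To make the notion of display-control misalignment concrete, the sketch below shows the kind of compensatory mapping the abstract alludes to: a hand-controller input is rotated by the camera's angular offset so the commanded motion matches what the operator sees. This is a minimal illustrative example, not the paper's method; the function name, planar (2-D) simplification, and angle convention are assumptions.

```python
import math

def align_input(dx, dy, misalignment_deg):
    """Rotate a planar hand-controller deflection (dx, dy) by the
    display-control misalignment angle, so the end effector moves in the
    direction the operator perceives through the exocentric camera.
    Hypothetical helper for illustration only."""
    theta = math.radians(misalignment_deg)
    cx = dx * math.cos(theta) - dy * math.sin(theta)
    cy = dx * math.sin(theta) + dy * math.cos(theta)
    return cx, cy

# Under a 90-degree camera misalignment, a rightward stick push (1, 0)
# must be remapped to an upward motion (0, 1) in the camera frame.
vx, vy = align_input(1.0, 0.0, 90.0)
```

Without such a compensatory mapping (or the color-coded cues the authors propose), the operator must perform this rotation mentally, which is the processing cost the response-time results measure.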