This paper introduces a novel interface, the '3D head pointer', for operating a wearable robotic arm in 3D space. The developed system is intended to assist its user in executing routine tasks while operating the robotic arm. Previous studies have demonstrated the difficulty a user faces in simultaneously controlling a robotic arm and their own hands. The proposed method combines a head-based pointing device with voice recognition to manipulate both position and orientation, and to switch between these two modes. In a virtual reality environment, the position instructions of the proposed system and its usefulness were evaluated by measuring instruction accuracy and the time required, using a fully immersive head-mounted display (HMD). In addition, the entire system, including posture instructions with two switching methods (voice recognition and head gestures), was evaluated using an optical see-through HMD. The results showed an accuracy of 1.25 cm and 3.56°, with a 20-s time span required to communicate an instruction. These results also demonstrate that voice recognition is a more effective switching method than head gestures.