Cooperative work with wearable robotics designed as an “extended body” for the wearer has the potential to improve individual productivity regardless of context. The ultimate purpose of this research is to design a new communication method between the wearer and a wearable robot arm as they perform daily chores simultaneously. Among previous studies on wearable robot arms, very few quantify the impact of robot operation on the user's attention distribution and psychological burden. This paper presents an approach based on the idea that the robot arm can infer human intentions by reading implicit instruction cues nested in the natural motion flow of the operator performing a task. To enable the robot arm to learn these cues, we combine Inertial Measurement Unit (IMU) sensor data with deep learning. The validity of the method was evaluated on three indices: implicit instruction estimation accuracy, secondary task completion quality, and cognitive burden on the wearer. Results showed considerable improvement on all three indices compared with explicit operation methods (such as voice instructions), as well as better results than similar implicit-instruction-based studies.