TY - GEN
T1 - Dynamic hand gesture recognition for robot arm teaching based on improved LRCN model
AU - Luan, Kaixiang
AU - Matsumaru, Takafumi
N1 - Funding Information:
This research was supported by the Japan Society for the Promotion of Science (KAKENHI-PROJECT-17K06277), to which we would like to express our sincere gratitude.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - In this research, we focus on a new method of human-robot interaction in an industrial environment. A vision-based dynamic hand gesture recognition system is proposed for a robot arm picking task. Eight dynamic hand gestures are captured for this task with a 100 fps high-speed camera. Based on the LRCN model, we combine MobileNetV2 and LSTM: MobileNetV2 extracts the image features, and a Long Short-Term Memory (LSTM) architecture interprets the features across time steps. Around 100 samples are collected per gesture for training, then augmented to 200 samples per gesture by data augmentation. Results show that the model is able to learn gestures varying in duration and complexity, and gestures are recognized in 88 ms with 90.62% accuracy in experiments on our hand gesture dataset.
AB - In this research, we focus on a new method of human-robot interaction in an industrial environment. A vision-based dynamic hand gesture recognition system is proposed for a robot arm picking task. Eight dynamic hand gestures are captured for this task with a 100 fps high-speed camera. Based on the LRCN model, we combine MobileNetV2 and LSTM: MobileNetV2 extracts the image features, and a Long Short-Term Memory (LSTM) architecture interprets the features across time steps. Around 100 samples are collected per gesture for training, then augmented to 200 samples per gesture by data augmentation. Results show that the model is able to learn gestures varying in duration and complexity, and gestures are recognized in 88 ms with 90.62% accuracy in experiments on our hand gesture dataset.
KW - Deep Learning
KW - Gesture recognition
KW - LSTM
KW - Robot teaching
KW - Robotics picking
UR - http://www.scopus.com/inward/record.url?scp=85079029658&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85079029658&partnerID=8YFLogxK
U2 - 10.1109/ROBIO49542.2019.8961787
DO - 10.1109/ROBIO49542.2019.8961787
M3 - Conference contribution
AN - SCOPUS:85079029658
T3 - IEEE International Conference on Robotics and Biomimetics, ROBIO 2019
SP - 1269
EP - 1274
BT - IEEE International Conference on Robotics and Biomimetics, ROBIO 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE International Conference on Robotics and Biomimetics, ROBIO 2019
Y2 - 6 December 2019 through 8 December 2019
ER -