Dynamic hand gesture recognition for robot arm teaching based on improved LRCN model

Kaixiang Luan, Takafumi Matsumaru

Research output: Conference contribution

1 Citation (Scopus)

Abstract

In this research, we focus on finding a new method of human-robot interaction in industrial environments. A vision-based dynamic hand gesture recognition system is proposed for a robot arm picking task. Eight dynamic hand gestures are captured for this task with a 100 fps high-speed camera. Based on the LRCN model, we combine MobileNets (V2) and LSTM: MobileNets (V2) extracts the image features from each frame, and a Long Short-Term Memory (LSTM) architecture then interprets the features across time steps to recognize the gestures. Around 100 samples are collected for each gesture for training; these are then expanded to 200 samples per gesture by data augmentation. Results show that the model is able to learn gestures varying in duration and complexity, and that gestures can be recognized in 88 ms with 90.62% accuracy in experiments on our hand gesture dataset.
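For illustration, the LRCN-style architecture the abstract describes (a per-frame CNN feature extractor feeding an LSTM) can be sketched as follows. This is a minimal sketch assuming PyTorch/torchvision, not the authors' code: the hidden size, clip length, and input resolution are illustrative assumptions, and only the 8-class output comes from the abstract.

# Minimal LRCN-style sketch: MobileNetV2 extracts per-frame features,
# an LSTM interprets them across time steps (assumptions noted above).
import torch
import torch.nn as nn
from torchvision import models

class LRCN(nn.Module):
    def __init__(self, num_classes=8, hidden_size=256):  # 8 gestures per the abstract; hidden size assumed
        super().__init__()
        backbone = models.mobilenet_v2(weights=None)
        self.features = backbone.features            # convolutional feature extractor
        self.pool = nn.AdaptiveAvgPool2d(1)          # global average pool -> 1280-d vector per frame
        self.lstm = nn.LSTM(input_size=1280, hidden_size=hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, clips):                        # clips: (batch, time, 3, H, W)
        b, t = clips.shape[:2]
        x = clips.flatten(0, 1)                      # merge batch and time so the CNN sees single frames
        x = self.pool(self.features(x)).flatten(1)   # (b*t, 1280) per-frame features
        x = x.reshape(b, t, -1)                      # restore the time dimension
        out, _ = self.lstm(x)                        # model the temporal dynamics of the gesture
        return self.classifier(out[:, -1])           # classify from the last time step

model = LRCN()
logits = model(torch.randn(2, 16, 3, 224, 224))      # two 16-frame clips (clip length assumed)
print(logits.shape)                                  # torch.Size([2, 8])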

Original language: English
Title of host publication: IEEE International Conference on Robotics and Biomimetics, ROBIO 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1269-1274
Number of pages: 6
ISBN (Electronic): 9781728163215
DOI
Publication status: Published - Dec 2019
Event: 2019 IEEE International Conference on Robotics and Biomimetics, ROBIO 2019 - Dali, China
Duration: 6 Dec 2019 – 8 Dec 2019

Publication series

Name: IEEE International Conference on Robotics and Biomimetics, ROBIO 2019

Conference

Conference: 2019 IEEE International Conference on Robotics and Biomimetics, ROBIO 2019
Country: China
City: Dali
Period: 19/12/6 – 19/12/8

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Hardware and Architecture
  • Mechanical Engineering
  • Control and Optimization
