Detecting features of tools, objects, and actions from effects in a robot using deep learning

Namiko Saito, Kitae Kim, Shingo Murata, Tetsuya Ogata, Shigeki Sugano

Research output: Conference contribution

Abstract

We propose a tool-use model that can detect the features of tools, target objects, and actions from the provided effects of object manipulation. Taking infant learning as a conceptual basis, we construct a model that enables robots to manipulate objects with tools. To realize this, we train a deep learning model on sensory-motor data recorded while a robot performs a tool-use task. The experiments involve four factors, which the model considers simultaneously: (1) tools, (2) objects, (3) actions, and (4) effects. For evaluation, the robot generates predicted images and motions given information about the effects of using unknown tools and objects. We confirm that the robot is capable of detecting the features of tools, objects, and actions by learning the effects and executing the task.
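The abstract does not specify the network architecture or training code, so the following is only a minimal, hypothetical sketch of the general idea it describes: a recurrent network (here an LSTM written in PyTorch) that predicts future sensory-motor states (compressed image features and joint angles) conditioned on a code describing the desired effect. The class name EffectConditionedPredictor, all dimensions, and the one-step-ahead prediction loss are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch only: the exact architecture is not given in this record.
# It illustrates predicting sensory-motor sequences conditioned on a target effect.
import torch
import torch.nn as nn

class EffectConditionedPredictor(nn.Module):
    def __init__(self, image_feat_dim=30, motor_dim=7, effect_dim=3, hidden_dim=100):
        super().__init__()
        # Encode the current sensory-motor state together with the target effect code.
        self.rnn = nn.LSTM(image_feat_dim + motor_dim + effect_dim, hidden_dim,
                           batch_first=True)
        # Decode the hidden state into predicted next image features and motor command.
        self.image_head = nn.Linear(hidden_dim, image_feat_dim)
        self.motor_head = nn.Linear(hidden_dim, motor_dim)

    def forward(self, image_feats, motors, effect):
        # image_feats: (batch, time, image_feat_dim), motors: (batch, time, motor_dim)
        # effect: (batch, effect_dim), broadcast over time as a constant condition.
        effect_seq = effect.unsqueeze(1).expand(-1, image_feats.size(1), -1)
        x = torch.cat([image_feats, motors, effect_seq], dim=-1)
        h, _ = self.rnn(x)
        return self.image_head(h), self.motor_head(h)

# Toy usage: one-step-ahead prediction over a batch of random sequences.
model = EffectConditionedPredictor()
img = torch.randn(4, 20, 30)   # compressed image features over 20 time steps
mot = torch.randn(4, 20, 7)    # joint angles over 20 time steps
eff = torch.randn(4, 3)        # code describing the desired manipulation effect
pred_img, pred_mot = model(img, mot, eff)
loss = nn.functional.mse_loss(pred_img[:, :-1], img[:, 1:]) + \
       nn.functional.mse_loss(pred_mot[:, :-1], mot[:, 1:])
loss.backward()

In such a setup, feeding the effect code as an input lets the same network infer which tool, object, and action combination would produce that effect, which is the behavior the abstract evaluates with unknown tools and objects.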

Original language: English
Host publication title: 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 91-96
Number of pages: 6
ISBN (electronic): 9781538661109
DOI
Publication status: Published - Sep 2018
Event: Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018 - Tokyo, Japan
Duration: 16 Sep 2018 - 20 Sep 2018

Publication series

Name: 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018

Conference

Conference: Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Country: Japan
City: Tokyo
Period: 18/9/16 - 18/9/20

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Control and Optimization
  • Behavioral Neuroscience
  • Developmental Neuroscience
  • Artificial Intelligence
