Detecting features of tools, objects, and actions from effects in a robot using deep learning

Namiko Saito, Kitae Kim, Shingo Murata, Tetsuya Ogata, Shigeki Sugano

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a tool-use model that can detect the features of tools, target objects, and actions from the provided effects of object manipulation. Taking infant learning as its guiding concept, the model enables robots to manipulate objects with tools. To realize this, we use deep learning to train on sensory-motor data recorded while a robot performs a tool-use task. The experiments involve four factors, which the model considers simultaneously: (1) tools, (2) objects, (3) actions, and (4) effects. For evaluation, the robot generates predicted images and motions given information about the effects of using unknown tools and objects. We confirm that the robot is capable of detecting features of tools, objects, and actions by learning the effects and executing the task.
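
The record does not specify the network architecture, but the abstract's core idea — a learned sensory-motor predictor conditioned on the desired effect, which can then generate predicted images and motions for unseen tools and objects — can be illustrated with a minimal sketch. The PyTorch code below is an assumption-laden illustration, not the authors' implementation: the recurrent-network choice, the module names, and all dimensions are placeholders.

import torch
import torch.nn as nn

# Hypothetical recurrent sensory-motor predictor (illustrative only).
# Inputs per time step: visual features, motor state, and an "effect" vector;
# outputs per time step: predicted next visual features and motor command.
class SensoryMotorPredictor(nn.Module):
    def __init__(self, img_dim=30, motor_dim=8, effect_dim=4, hidden_dim=100):
        super().__init__()
        self.rnn = nn.LSTM(img_dim + motor_dim + effect_dim, hidden_dim,
                           batch_first=True)
        self.head = nn.Linear(hidden_dim, img_dim + motor_dim)

    def forward(self, img, motor, effect):
        # img: (B, T, img_dim), motor: (B, T, motor_dim), effect: (B, T, effect_dim)
        x = torch.cat([img, motor, effect], dim=-1)
        h, _ = self.rnn(x)
        return self.head(h)

# One training step on placeholder data: predict the next sensory-motor state
# from the current one, conditioned on the effect information.
model = SensoryMotorPredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

img = torch.randn(16, 50, 30)      # dummy image-feature sequences
motor = torch.randn(16, 50, 8)     # dummy joint-angle sequences
effect = torch.randn(16, 50, 4)    # dummy effect descriptors

pred = model(img[:, :-1], motor[:, :-1], effect[:, :-1])
target = torch.cat([img[:, 1:], motor[:, 1:]], dim=-1)
loss = nn.functional.mse_loss(pred, target)

optimizer.zero_grad()
loss.backward()
optimizer.step()

In this sketch, training minimizes next-step prediction error over recorded sequences; at evaluation time one would supply an effect vector for an unknown tool-object pair and roll the model forward to generate predicted image features and motor commands, in the spirit of the evaluation described in the abstract.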

Original language: English
Title of host publication: 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 91-96
Number of pages: 6
ISBN (Electronic): 9781538661109
DOIs: https://doi.org/10.1109/DEVLRN.2018.8761029
Publication status: Published - 2018 Sep 1
Event: Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018 - Tokyo, Japan
Duration: 2018 Sep 16 - 2018 Sep 20

Publication series

Name: 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018

Conference

Conference: Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Country: Japan
City: Tokyo
Period: 18/9/16 - 18/9/20

Keywords

  • Cognitive Robotics
  • Development of Infants
  • Neural Network
  • Tool-use

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Control and Optimization
  • Behavioral Neuroscience
  • Developmental Neuroscience
  • Artificial Intelligence

Cite this

Saito, N., Kim, K., Murata, S., Ogata, T., & Sugano, S. (2018). Detecting features of tools, objects, and actions from effects in a robot using deep learning. In 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018 (pp. 91-96). [8761029] (2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/DEVLRN.2018.8761029

@inproceedings{e2dbfbe61ff74796b59df8a0a50fac09,
title = "Detecting features of tools, objects, and actions from effects in a robot using deep learning",
abstract = "We propose a tool-use model that can detect the features of tools, target objects, and actions from the provided effects of object manipulation. We construct a model that enables robots to manipulate objects with tools, using infant learning as a concept. To realize this, we train sensory-motor data recorded during a tool-use task performed by a robot with deep learning. Experiments include four factors: (1) tools, (2) objects, (3) actions, and (4) effects, which the model considers simultaneously. For evaluation, the robot generates predicted images and motions given information of the effects of using unknown tools and objects. We confirm that the robot is capable of detecting features of tools, objects, and actions by learning the effects and executing the task.",
keywords = "Cognitive Robotics, Development of Infants, Neural Network, Tool-use",
author = "Namiko Saito and Kitae Kim and Shingo Murata and Tetsuya Ogata and Shigeki Sugano",
year = "2018",
month = "9",
day = "1",
doi = "10.1109/DEVLRN.2018.8761029",
language = "English",
series = "2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "91--96",
booktitle = "2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018",

}

TY - GEN

T1 - Detecting features of tools, objects, and actions from effects in a robot using deep learning

AU - Saito, Namiko

AU - Kim, Kitae

AU - Murata, Shingo

AU - Ogata, Tetsuya

AU - Sugano, Shigeki

PY - 2018/9/1

Y1 - 2018/9/1

N2 - We propose a tool-use model that can detect the features of tools, target objects, and actions from the provided effects of object manipulation. We construct a model that enables robots to manipulate objects with tools, using infant learning as a concept. To realize this, we train sensory-motor data recorded during a tool-use task performed by a robot with deep learning. Experiments include four factors: (1) tools, (2) objects, (3) actions, and (4) effects, which the model considers simultaneously. For evaluation, the robot generates predicted images and motions given information of the effects of using unknown tools and objects. We confirm that the robot is capable of detecting features of tools, objects, and actions by learning the effects and executing the task.

AB - We propose a tool-use model that can detect the features of tools, target objects, and actions from the provided effects of object manipulation. We construct a model that enables robots to manipulate objects with tools, using infant learning as a concept. To realize this, we train sensory-motor data recorded during a tool-use task performed by a robot with deep learning. Experiments include four factors: (1) tools, (2) objects, (3) actions, and (4) effects, which the model considers simultaneously. For evaluation, the robot generates predicted images and motions given information of the effects of using unknown tools and objects. We confirm that the robot is capable of detecting features of tools, objects, and actions by learning the effects and executing the task.

KW - Cognitive Robotics

KW - Development of Infants

KW - Neural Network

KW - Tool-use

UR - http://www.scopus.com/inward/record.url?scp=85070383722&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85070383722&partnerID=8YFLogxK

U2 - 10.1109/DEVLRN.2018.8761029

DO - 10.1109/DEVLRN.2018.8761029

M3 - Conference contribution

T3 - 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018

SP - 91

EP - 96

BT - 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018

PB - Institute of Electrical and Electronics Engineers Inc.

ER -