Experience based imitation using RNNPB

Ryunosuke Yokoya, Tetsuya Ogata, Jun Tani, Kazunori Komatani, Hiroshi G. Okuno

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

Robot imitation is a useful and promising alternative to robot programming. Robot imitation involves two crucial issues. The first is how a robot can imitate a human whose physical structure and properties differ greatly from its own. The second is how the robot can generate various motions from a finite set of programmable patterns (generalization). This paper describes a novel approach to robot imitation based on the robot's own physical experiences. We consider the target task of moving an object on a table. For imitation, we focused on an active sensing process in which the robot acquires the relation between the object's motion and its own arm motion. For generalization, we applied a recurrent neural network with parametric bias (RNNPB) model to enable recognition and generation of imitation motions. The robot associates the object motion demonstrated by a human operator with the arm motion that reproduces it. Experimental results demonstrated that our method enabled the robot to imitate not only motions it had experienced but also unknown motions, confirming its capability for generalization.
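The RNNPB model referenced in the abstract can be summarized as a recurrent network whose input is augmented with a small parametric-bias (PB) vector: during learning, the network weights and one PB vector per experienced sequence are optimized together, while for a newly observed object motion only a PB vector is fitted with the weights held fixed, and that PB can then drive generation of the corresponding arm motion. The sketch below illustrates this idea in PyTorch; it is a minimal approximation, and all dimensions, hyperparameters, and the toy data are assumptions for illustration, not the setup used in the paper.

import torch
import torch.nn as nn

class RNNPB(nn.Module):
    """Recurrent net whose input is augmented with a constant parametric-bias (PB) vector."""
    def __init__(self, io_dim=4, hidden_dim=20, pb_dim=2):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.w_in = nn.Linear(io_dim + pb_dim, hidden_dim)
        self.w_rec = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.readout = nn.Linear(hidden_dim, io_dim)

    def forward(self, seq, pb):
        # seq: (T, io_dim) sensorimotor sequence; returns predictions of steps 2..T.
        h = torch.zeros(self.hidden_dim)
        preds = []
        for x_t in seq[:-1]:
            h = torch.tanh(self.w_in(torch.cat([x_t, pb])) + self.w_rec(h))
            preds.append(self.readout(h))
        return torch.stack(preds)

def train(model, sequences, pb_dim=2, epochs=300, lr=0.01):
    # Learning: weights and one PB vector per experienced sequence are
    # optimized jointly, so the PB space self-organizes over the motions.
    pbs = nn.Parameter(torch.zeros(len(sequences), pb_dim))
    opt = torch.optim.Adam(list(model.parameters()) + [pbs], lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = sum(nn.functional.mse_loss(model(s, pbs[i]), s[1:])
                   for i, s in enumerate(sequences))
        loss.backward()
        opt.step()
    return pbs.detach()

def recognize(model, observed, pb_dim=2, steps=300, lr=0.05):
    # Recognition: only a fresh PB vector is passed to the optimizer, so the
    # trained weights stay fixed while the PB is fitted to the observed motion.
    pb = nn.Parameter(torch.zeros(pb_dim))
    opt = torch.optim.Adam([pb], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.mse_loss(model(observed, pb), observed[1:]).backward()
        opt.step()
    return pb.detach()  # feeding this PB back through the model generates the imitation

# Toy usage with random stand-in data; real sequences would combine arm joint
# angles and object positions gathered through the robot's own active sensing.
model = RNNPB()
experienced = [torch.randn(30, 4) for _ in range(3)]
train(model, experienced, epochs=50)
pb_new = recognize(model, torch.randn(30, 4), steps=50)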

Original language: English
Title of host publication: IEEE International Conference on Intelligent Robots and Systems
Pages: 3669-3674
Number of pages: 6
DOI: 10.1109/IROS.2006.281724
ISBN (Print): 142440259X, 9781424402595
Publication status: Published - 2006
Externally published: Yes
Event: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2006 - Beijing
Duration: 2006 Oct 9 - 2006 Oct 15


Fingerprint

  • Recurrent neural networks
  • Robots
  • Robot programming

ASJC Scopus subject areas

  • Control and Systems Engineering

Cite this

Yokoya, R., Ogata, T., Tani, J., Komatani, K., & Okuno, H. G. (2006). Experience based imitation using RNNPB. In IEEE International Conference on Intelligent Robots and Systems (pp. 3669-3674). [4058974] https://doi.org/10.1109/IROS.2006.281724
