Self-organization of dynamic object features based on bidirectional training

Shun Nishide, Tetsuya Ogata, Jun Tani, Kazunori Komatani, Hiroshi G. Okuno

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

This paper presents a method to self-organize object features that describe object dynamics using bidirectional training. The model is composed of a dynamics learning module and a feature extraction module. A Recurrent Neural Network with Parametric Bias (RNNPB) is used as the dynamics learning module, learning and self-organizing sequences of robot and object motions. A hierarchical neural network is linked to the input of the RNNPB as the feature extraction module, self-organizing object features that describe the object motions. The two modules are trained simultaneously through bidirectional training, using image and motion sequences acquired from the robot's active sensing of objects. Experiments are performed in which the robot pushes a variety of objects, generating sliding, falling over, bouncing and rolling motions. The results show that the model is capable of self-organizing object dynamics based on the self-organized features.
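For readers who want a concrete picture of the two-module architecture described in the abstract, the sketch below shows one possible rendering of it in PyTorch: a hierarchical feature-extraction network whose output feeds a recurrent network with per-sequence parametric-bias vectors, the two trained jointly so that the prediction error also shapes the object features (the "bidirectional training"). The framework choice, layer sizes, variable names and the handling of the prediction target are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

IMG_DIM, FEAT_DIM, MOTOR_DIM, PB_DIM, HID_DIM = 64, 3, 4, 2, 20

class FeatureExtractor(nn.Module):
    # Hierarchical (multi-layer) network compressing a raw image vector into object features.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG_DIM, 16), nn.Tanh(),
            nn.Linear(16, FEAT_DIM), nn.Tanh())

    def forward(self, img):
        return self.net(img)

class RNNPB(nn.Module):
    # Recurrent net with a learnable parametric-bias (PB) vector per training sequence.
    def __init__(self, n_sequences):
        super().__init__()
        self.rnn = nn.RNNCell(MOTOR_DIM + FEAT_DIM + PB_DIM, HID_DIM)
        self.out = nn.Linear(HID_DIM, MOTOR_DIM + FEAT_DIM)
        self.pb = nn.Parameter(torch.zeros(n_sequences, PB_DIM))

    def forward(self, motor, feat, seq_idx, h):
        pb = torch.tanh(self.pb[seq_idx]).unsqueeze(0)          # (1, PB_DIM)
        h = self.rnn(torch.cat([motor, feat, pb], dim=-1), h)   # (1, HID_DIM)
        return self.out(h), h                                   # predict next motor + feature

def train_step(fe, rnnpb, optimizer, images, motors, seq_idx):
    # One "bidirectional" update: the RNNPB prediction error is backpropagated
    # through the feature extractor as well, so the features self-organize to
    # describe whatever makes the object motion predictable.
    feats = fe(images)                                           # (T, FEAT_DIM)
    h = torch.zeros(1, HID_DIM)
    loss = torch.zeros(())
    for t in range(images.shape[0] - 1):
        pred, h = rnnpb(motors[t].unsqueeze(0), feats[t].unsqueeze(0), seq_idx, h)
        target = torch.cat([motors[t + 1], feats[t + 1].detach()], dim=-1)
        loss = loss + nn.functional.mse_loss(pred.squeeze(0), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage with random data standing in for one active-sensing sequence.
fe, rnnpb = FeatureExtractor(), RNNPB(n_sequences=10)
opt = torch.optim.Adam(list(fe.parameters()) + list(rnnpb.parameters()), lr=1e-3)
images = torch.rand(15, IMG_DIM)
motors = torch.rand(15, MOTOR_DIM)
print(train_step(fe, rnnpb, opt, images, motors, seq_idx=0))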

Original language: English
Pages (from-to): 2035-2057
Number of pages: 23
Journal: Advanced Robotics
Volume: 23
Issue number: 15
DOIs
Publication status: Published - 2009 Oct 1


Keywords

  • Active sensing
  • Affordance
  • Bidirectional training
  • Humanoid robots
  • Neural networks

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications
