Incremental Development of Multiple Tool Models for Robotic Reaching Through Autonomous Exploration

Lorenzo Jamone*, Bruno Damas, Nobotsuna Endo, José Santos-Victor, Atsuo Takanishi

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)


Autonomy and flexibility are two major requirements for modern robots. In particular, humanoid robots should learn new skills incrementally through autonomous exploration and adapt to different contexts. In this paper we consider the problem of learning forward models for task-space control under dynamically varying kinematic contexts: the robot incrementally and autonomously learns its forward kinematics under different contexts, represented by the inclusion of different tools, and exploits the learned model to reach with those tools. We model the forward kinematics as a multi-valued function, in which different outputs for the same input query correspond to different tools (i.e., contexts). The model is estimated using IMLE, a recent online learning algorithm for multi-valued regression, and is then used for control. No information is given about tool changes, nor is any assumption made about the tool kinematics. Results are provided both in simulation and on a full-body humanoid robot. In the latter case we show how the robot successfully reaches with a flexible tool, a clear example of complex kinematics.
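The core idea above — that forward kinematics under hidden tool contexts is a multi-valued function — can be illustrated with a toy sketch. The snippet below is not IMLE (which fits an infinite mixture of linear experts and is far more sample-efficient); it is a naive nearest-neighbor stand-in, with an assumed 2-link planar arm and illustrative link/tool lengths, that shows how the same joint-angle query yields one output per (unobserved) tool context:

```python
import math

def fk_planar(q1, q2, tool_len=0.0):
    """Forward kinematics of a toy 2-link planar arm (link lengths 0.3 m
    and 0.25 m are illustrative assumptions), with an optional rigid tool
    extending the last link."""
    l1, l2 = 0.3, 0.25
    s = q1 + q2
    x = l1 * math.cos(q1) + (l2 + tool_len) * math.cos(s)
    y = l1 * math.sin(q1) + (l2 + tool_len) * math.sin(s)
    return (x, y)

class MultiValuedMemory:
    """Naive multi-valued regression sketch: store (input, output) samples
    online and, at query time, cluster the outputs of nearby inputs into
    separate solution branches -- one branch per hidden context."""

    def __init__(self, in_radius=0.06, out_radius=0.15):
        self.samples = []          # (q, x) pairs, accumulated incrementally
        self.in_radius = in_radius
        self.out_radius = out_radius

    def update(self, q, x):
        self.samples.append((q, x))

    def predict(self, q):
        # Outputs observed for inputs close to the query (max-norm).
        near = [x for (qi, x) in self.samples
                if max(abs(a - b) for a, b in zip(qi, q)) < self.in_radius]
        # Greedily group those outputs into branches.
        branches = []
        for x in near:
            for b in branches:
                if max(abs(a - c) for a, c in zip(x, b[0])) < self.out_radius:
                    b.append(x)
                    break
            else:
                branches.append([x])
        # One prediction per branch: the branch mean.
        return [tuple(sum(c) / len(b) for c in zip(*b)) for b in branches]

# Exploration stand-in: babble over a joint grid with an unannounced tool
# change, exactly as the learner would experience the stream online.
model = MultiValuedMemory()
for i in range(9):
    for j in range(9):
        q = (0.2 + 0.05 * i, 0.7 + 0.05 * j)
        for tool_len in (0.0, 0.3):
            model.update(q, fk_planar(*q, tool_len))

predictions = model.predict((0.4, 0.9))  # two branches: bare hand vs. tool
```

Because no context label is ever given to the learner, a single-valued regressor would average the two branches into a point reachable with neither the bare hand nor the tool; keeping the branches separate is what makes context-dependent reaching possible.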

Original language: English
Pages (from-to): 113-127
Number of pages: 15
Issue number: 3
Publication status: Published - 2012 Sep 1


Keywords

  • continuous online learning
  • developmental robotics
  • humanoid robots
  • motor learning and adaptation
  • reaching with tools

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Developmental Neuroscience
  • Cognitive Neuroscience
  • Artificial Intelligence
  • Behavioral Neuroscience

