Tool-body assimilation model considering grasping motion through deep learning

Kuniyuki Takahashi, Kitae Kim, Tetsuya Ogata, Shigeki Sugano

    Research output: Contribution to journal › Article

    9 Citations (Scopus)

    Abstract

    We propose a tool-body assimilation model that considers grasping during motor babbling for tool use. A robot with tool-use skills can be useful in human–robot symbiosis because such skills let the robot expand its range of tasks. Past studies of tool-body assimilation focused mainly on acquiring the functions of tools, and had the robot begin its motions with a tool pre-attached to its body. This implies that the robot could not decide whether, or where, to grasp the tool. In real-life environments, a robot would need to consider possible tool-grasping positions and then grasp the tool itself. To address these issues, the robot performs motor babbling both with and without grasping the tools, learning its own body model together with the tool functions. In addition, the robot grasps various parts of the tools to learn the different functions that arise from different grasping positions. These motion experiences are learned using deep learning. In the model evaluation, the robot performs an object-manipulation task both without tools and with several tools of different shapes. Shown the initial state and a target image, the robot generates motions, deciding whether and where to grasp the tool. The robot is therefore capable of producing the correct motion and grasping decision when given the initial state and a target image.
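    The generation step the abstract describes — encode the initial state and a target image into features, then roll out a motion sequence with a learned network — can be sketched as a minimal forward pass. Everything below is an illustrative assumption, not the paper's architecture: a random linear projection stands in for a trained deep image encoder, and a plain Elman recurrence stands in for the recurrent motion model; all layer sizes and function names are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative dimensions (assumptions, not taken from the paper).
    IMG_DIM, FEAT_DIM, HID_DIM, JOINT_DIM, STEPS = 64, 16, 32, 7, 20

    # Image "encoder": one random projection standing in for a trained
    # deep network that compresses camera images into feature vectors.
    W_enc = rng.standard_normal((FEAT_DIM, IMG_DIM)) * 0.1

    def encode(image: np.ndarray) -> np.ndarray:
        """Compress a flattened image into a low-dimensional feature vector."""
        return np.tanh(W_enc @ image)

    # Recurrent motion generator: an Elman cell whose input is the current
    # joint angles plus the fixed initial- and target-image features.
    W_in = rng.standard_normal((HID_DIM, JOINT_DIM + 2 * FEAT_DIM)) * 0.1
    W_h = rng.standard_normal((HID_DIM, HID_DIM)) * 0.1
    W_out = rng.standard_normal((JOINT_DIM, HID_DIM)) * 0.1

    def generate_motion(initial_img, target_img, initial_joints):
        """Roll out a joint-angle sequence conditioned on both images."""
        ctx = np.concatenate([encode(initial_img), encode(target_img)])
        h = np.zeros(HID_DIM)
        joints = initial_joints
        trajectory = []
        for _ in range(STEPS):
            x = np.concatenate([joints, ctx])
            h = np.tanh(W_in @ x + W_h @ h)   # Elman recurrence
            joints = np.tanh(W_out @ h)       # next joint angles in [-1, 1]
            trajectory.append(joints)
        return np.stack(trajectory)

    traj = generate_motion(rng.standard_normal(IMG_DIM),
                           rng.standard_normal(IMG_DIM),
                           np.zeros(JOINT_DIM))
    print(traj.shape)  # (20, 7): STEPS x JOINT_DIM
    ```

    In this sketch the grasping decision would be implicit in the generated trajectory; training the weights from motor-babbling experiences, as the abstract describes, is omitted.
    
    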

    Original language: English
    Pages (from-to): 115-127
    Number of pages: 13
    Journal: Robotics and Autonomous Systems
    Volume: 91
    DOI: 10.1016/j.robot.2017.01.002
    Publication status: Published - 1 May 2017

    Keywords

    • Deep neural network
    • Motor babbling
    • Recurrent neural network
    • Tool-body assimilation
    • Transfer learning

    ASJC Scopus subject areas

    • Control and Systems Engineering
    • Software
    • Mathematics (all)
    • Computer Science Applications

    Cite this

    Tool-body assimilation model considering grasping motion through deep learning. / Takahashi, Kuniyuki; Kim, Kitae; Ogata, Tetsuya; Sugano, Shigeki.

    In: Robotics and Autonomous Systems, Vol. 91, 01.05.2017, p. 115-127.


    @article{6ef48db06cd9440bafdf2b2903e9b450,
    title = "Tool-body assimilation model considering grasping motion through deep learning",
    keywords = "Deep neural network, Motor babbling, Recurrent neural network, Tool-body assimilation, Transfer learning",
    author = "Kuniyuki Takahashi and Kitae Kim and Tetsuya Ogata and Shigeki Sugano",
    year = "2017",
    month = "5",
    day = "1",
    doi = "10.1016/j.robot.2017.01.002",
    language = "English",
    volume = "91",
    pages = "115--127",
    journal = "Robotics and Autonomous Systems",
    issn = "0921-8890",
    publisher = "Elsevier",

    }
