Robust modeling of dynamic environment based on robot embodiment

Kuniaki Noda, Mototaka Suzuki, Naofumi Tsuchiya, Yuki Suga, Tetsuya Ogata, Shigeki Sugano

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    5 Citations (Scopus)

    Abstract

    Recent studies in embodied cognitive science have shown that more complex and nontrivial behaviors can emerge from quite simple designs if the designer properly takes the dynamics of the system-environment interaction into account. In this paper, we report tentative classification experiments on several objects using the human-like autonomous robot "WAMOEBA-2Ri". In modeling the environment, we focus not only on static aspects of the environment but also on its dynamic aspects, including those of the system itself. The visualized results of this experiment show that integrating the multimodal sensor dataset acquired through system-environment interaction ("grasping") enables robust categorization of several objects. Finally, in the discussion, we demonstrate a possible application in which extending this approach makes "invariance in motion" emerge.

    Original language: English
    Title of host publication: Proceedings - IEEE International Conference on Robotics and Automation
    Pages: 3565-3570
    Number of pages: 6
    Volume: 3
    Publication status: Published - 2003
    Event: 2003 IEEE International Conference on Robotics and Automation - Taipei, Taiwan, Province of China
    Duration: 2003 Sep 14 - 2003 Sep 19



    ASJC Scopus subject areas

    • Software
    • Control and Systems Engineering

    Cite this

    Noda, K., Suzuki, M., Tsuchiya, N., Suga, Y., Ogata, T., & Sugano, S. (2003). Robust modeling of dynamic environment based on robot embodiment. In Proceedings - IEEE International Conference on Robotics and Automation (Vol. 3, pp. 3565-3570).
