Embodied navigation for mobile robot by using direct 3D drawing in the air

Akihiro Osaki, Tetsuji Kaneko, Yoshiyuki Miwa

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    3 Citations (Scopus)

    Abstract

    This paper proposes a method for navigating a mobile robot in a human environment through interaction with virtual 3D lines handwritten in the air by a user who shares the space with the robot. The objective is to let a user flexibly and easily instruct an appropriate route for a mobile robot, adapting to various situations through embodied interaction between robots and users. The developed system integrates an aerial 3D drawing interface with the robots' control system. The mobile robots and the position of the drawing hand are measured simultaneously in real time by 6-DOF sensors. The 3D drawing interface allows a user not only to draw 3D lines in the air at the sensed position in the real world, but also to push, grasp, and throw a drawn line manually on site. Each mobile robot is distinguished by association with the color of a drawn line and individually searches for its own line in real time. With this method, the user can therefore set and select a specific route for each robot, and change that route by drawing or manipulating lines while the robots are moving. We conducted navigation experiments using two omni-wheel mobile robots and verified the navigation functions described above. Additionally, we tried advanced forms of navigation achievable only by drawing, such as dog-walking a robot by treating a virtual line as a leash. As a result, this method enables a user to instruct complex paths easily and to change a route flexibly, through interaction with virtual lines, in response to changing environments.
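
    The core loop described above (each robot picks out the drawn line matching its own color and follows it, while the user may redraw or manipulate lines at any time) can be pictured with a short sketch. The Python below is a minimal illustration under our own assumptions, not the authors' implementation: the paper specifies 6-DOF sensing and an aerial drawing interface but the abstract does not give their internals, so the stroke format, the ground-plane projection, the look-ahead ("carrot") follower, and all names and tuning values here are hypothetical.

        import math

        # Illustrative sketch only: the stroke format, function names, and
        # tuning values below are assumptions, not the authors' system.

        LOOKAHEAD = 0.30   # metres ahead along the line (assumed tuning value)
        SPEED = 0.25       # commanded ground speed in m/s (assumed tuning value)

        def project_to_ground(stroke_3d):
            """Drop the height component of an aerial stroke: [(x, y, z)] -> [(x, y)]."""
            return [(x, y) for x, y, _z in stroke_3d]

        def own_line(strokes, robot_color):
            """Pick the drawn stroke whose color tag matches this robot's color."""
            for color, points in strokes:
                if color == robot_color:
                    return project_to_ground(points)
            return None

        def carrot_point(path_2d, pose_xy, lookahead=LOOKAHEAD):
            """Return the first path point at least `lookahead` metres from the robot."""
            for px, py in path_2d:
                if math.hypot(px - pose_xy[0], py - pose_xy[1]) >= lookahead:
                    return (px, py)
            return path_2d[-1]  # near the end of the line: head for its last point

        def velocity_command(pose_xy, target_xy, speed=SPEED):
            """Omni-wheel base: command a world-frame (vx, vy) straight at the target."""
            dx, dy = target_xy[0] - pose_xy[0], target_xy[1] - pose_xy[1]
            dist = math.hypot(dx, dy)
            if dist < 1e-3:
                return (0.0, 0.0)          # reached the end of the stroke
            return (speed * dx / dist, speed * dy / dist)

        def step(strokes, robot_color, pose_xy):
            """One control cycle. The strokes may have been redrawn or manipulated
            since the last cycle, which is what lets the user change the route
            while the robot is moving."""
            path = own_line(strokes, robot_color)
            if not path:
                return (0.0, 0.0)          # no matching line drawn: stand still
            return velocity_command(pose_xy, carrot_point(path, pose_xy))

        # Example: a "red" robot at the origin following a short red stroke.
        strokes = [("red", [(0.0, 0.0, 1.2), (0.5, 0.0, 1.2), (1.0, 0.5, 1.1)])]
        print(step(strokes, "red", (0.0, 0.0)))   # -> (0.25, 0.0), toward the line

    In this reading, the push/grasp/throw manipulations reported in the paper would simply amount to replacing or editing entries of the stroke list between successive calls to step().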

    Original language: English
    Title of host publication: Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
    Pages: 671-676
    Number of pages: 6
    ISBN (Print): 9781424422135
    DOI: 10.1109/ROMAN.2008.4600744
    Publication status: Published - 2008
    Event: 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN - Munich
    Duration: 2008 Aug 1 - 2008 Aug 3

    Fingerprint

    • Mobile robots
    • Navigation
    • Robots
    • Air
    • End effectors
    • Wheels
    • Antennas
    • Color
    • Control systems
    • Sensors
    • Experiments

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Vision and Pattern Recognition
    • Human-Computer Interaction

    Cite this

    Osaki, A., Kaneko, T., & Miwa, Y. (2008). Embodied navigation for mobile robot by using direct 3D drawing in the air. In Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN (pp. 671-676). [4600744] https://doi.org/10.1109/ROMAN.2008.4600744
