Real-time vision system for autonomous mobile robot

Masataka Doi, Manabu Nakakita, Yoshimitsu Aoki, Shuji Hashimoto

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    8 Citations (Scopus)

    Abstract

    In order to realize a vision system for an autonomous mobile robot used in a human living environment, it is necessary to observe human behaviors and to react to those actions. In this paper, we propose a real-time human tracking method based on a vision system for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region, or a region specific to humans, is extracted within the detected area using color. Next, facial gestures and head gestures are recognized. We implement the vision system on a mobile robot and experimentally show that the system can detect and track a human and his face in real time.
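    The abstract outlines a two-stage pipeline: find moving areas by frame differencing, then keep only the skin-colored pixels within them as face candidates. A minimal sketch of that idea follows; the difference threshold and the RGB skin-color rule here are illustrative textbook choices, not the paper's actual parameters, and the function names are hypothetical.

    ```python
    import numpy as np

    def motion_mask(prev: np.ndarray, curr: np.ndarray, thresh: int = 30) -> np.ndarray:
        """Mark pixels whose color changed between consecutive frames.

        Frame differencing is the simplest way to detect "moving areas";
        the threshold value is illustrative, not taken from the paper.
        """
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).sum(axis=-1)
        return diff > thresh

    def skin_mask(frame: np.ndarray) -> np.ndarray:
        """Crude RGB skin-color rule (a common heuristic; the paper's
        actual color model is not specified in this record)."""
        r = frame[..., 0].astype(int)
        g = frame[..., 1].astype(int)
        b = frame[..., 2].astype(int)
        return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (np.abs(r - g) > 15)

    def face_candidates(prev: np.ndarray, curr: np.ndarray) -> np.ndarray:
        """Intersect the two stages: pixels that are both moving and skin-colored."""
        return motion_mask(prev, curr) & skin_mask(curr)

    # Tiny synthetic example: a skin-colored patch appears between two frames.
    prev = np.zeros((8, 8, 3), dtype=np.uint8)
    curr = prev.copy()
    curr[2:5, 2:5] = (200, 120, 80)   # skin-like RGB block, 3x3 pixels
    mask = face_candidates(prev, curr)
    ```

    A real system would then group the masked pixels into connected regions and track the largest one over time; gesture recognition would operate on that tracked region.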

    Original language: English
    Title of host publication: Robot and Human Communication - Proceedings of the IEEE International Workshop
    Pages: 442-449
    Number of pages: 8
    Publication status: Published - 2001
    Event: 10th IEEE International Workshop on Robot and Human Communication - Bordeaux-Paris
    Duration: 2001 Sep 18 - 2001 Sep 21



    ASJC Scopus subject areas

    • Hardware and Architecture
    • Software

    Cite this

    Doi, M., Nakakita, M., Aoki, Y., & Hashimoto, S. (2001). Real-time vision system for autonomous mobile robot. In Robot and Human Communication - Proceedings of the IEEE International Workshop (pp. 442-449).

