Real-time vision system for autonomous mobile robot

Masataka Doi, Manabu Nakakita, Yoshimitsu Aoki, Shuji Hashimoto

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    4 Citations (Scopus)

    Abstract

    In order to realize a vision system for an autonomous mobile robot used in a human living environment, it is necessary to observe human behaviors and to react to those actions. In this paper, we propose a real-time human tracking method based on a vision system for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region, or another region specific to a human, is extracted from the detected area using color information. Next, facial gestures and head gestures are recognized. We implement the vision system on a mobile robot and experimentally show that the system can detect and track a human and his face in real time.
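    The two-stage detection described in the abstract (moving areas first, then a color-based skin/face region within them) can be sketched roughly as below. This is an illustrative reconstruction, not the authors' implementation: the difference threshold and the RGB skin-color heuristic are assumptions chosen for clarity.

    ```python
    # Sketch of the abstract's pipeline: (1) find moving pixels by frame
    # differencing, (2) keep only those whose color passes a crude skin
    # test, giving a candidate face region. Frames are plain 2D lists so
    # the example stays self-contained.

    MOTION_THRESHOLD = 25  # assumed grey-level change threshold

    def motion_mask(prev_gray, curr_gray, thresh=MOTION_THRESHOLD):
        """Mark pixels whose grey level changed by more than `thresh`."""
        return [
            [abs(c - p) > thresh for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_gray, curr_gray)
        ]

    def is_skin(r, g, b):
        """Rough RGB skin heuristic (a common rule of thumb, not from the paper)."""
        return (r > 95 and g > 40 and b > 20
                and r > g and r > b and (r - min(g, b)) > 15)

    def skin_in_motion(mask, rgb_frame):
        """Intersect the motion mask with the skin-color test."""
        return [
            [m and is_skin(*px) for m, px in zip(mrow, rgbrow)]
            for mrow, rgbrow in zip(mask, rgb_frame)
        ]
    ```

    In a real system the resulting mask would be cleaned with morphological filtering and the largest connected component tracked frame to frame; gesture recognition would then operate on the tracked face region.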

    Original language: English
    Title of host publication: Proceedings - IEEE International Workshop on Robot and Human Interactive Communication
    Pages: 442-449
    Number of pages: 8
    DOIs: https://doi.org/10.1109/ROMAN.2001.981944
    Publication status: Published - 2001
    Event: 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001 - Bordeaux and Paris
    Duration: 2001 Sep 18 to 2001 Sep 21


    ASJC Scopus subject areas

    • Software
    • Artificial Intelligence
    • Human-Computer Interaction

    Cite this

    Doi, M., Nakakita, M., Aoki, Y., & Hashimoto, S. (2001). Real-time vision system for autonomous mobile robot. In Proceedings - IEEE International Workshop on Robot and Human Interactive Communication (pp. 442-449). [981944] https://doi.org/10.1109/ROMAN.2001.981944