Real-time vision system for autonomous mobile robot

Masataka Doi, Manabu Nakakita, Yoshimitsu Aoki, Shuji Hashimoto

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    8 Citations (Scopus)

    Abstract

    To realize a vision system for an autonomous mobile robot operating in a human living environment, the robot must observe human behavior and react to those actions. In this paper, we propose a real-time, vision-based human tracking method for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and then extracts the face region, or another region specific to the human body, within the detected area using color information. Next, facial and head gestures are recognized. We implemented the vision system on a mobile robot and experimentally show that it can detect and track a person and their face in real time.
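    The detection pipeline described in the abstract, motion detection followed by color-based extraction of a face candidate, can be sketched roughly as below. This is a minimal illustration assuming frame differencing for motion and a crude RGB skin-color test; the paper's actual thresholds and color model are not given here, so all numeric values are placeholders.

    ```python
    import numpy as np

    def detect_motion(prev, curr, thresh=25):
        """Frame differencing: mark pixels whose intensity changed by more
        than `thresh` between consecutive frames as moving (1) else 0.
        The threshold is illustrative, not from the paper."""
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        return (diff.max(axis=-1) > thresh).astype(np.uint8)

    def skin_mask(frame):
        """Very rough skin-color test in RGB (red channel dominant over
        green and blue). Thresholds are placeholder assumptions."""
        r = frame[..., 0].astype(int)
        g = frame[..., 1].astype(int)
        b = frame[..., 2].astype(int)
        return ((r > 95) & (r > g + 15) & (r > b + 15)).astype(np.uint8)

    def face_candidate(prev, curr):
        """Intersect moving areas with skin-colored pixels and return the
        bounding box (x0, y0, x1, y1) of the result, or None."""
        mask = detect_motion(prev, curr) & skin_mask(curr)
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None
        return (xs.min(), ys.min(), xs.max(), ys.max())

    # Usage sketch: a skin-colored patch appearing between two frames
    # is reported as the face candidate's bounding box.
    prev = np.zeros((64, 64, 3), dtype=np.uint8)
    curr = prev.copy()
    curr[10:20, 30:40] = (200, 120, 100)  # moving, skin-like patch
    print(face_candidate(prev, curr))
    ```

    Restricting the color test to moving regions is what keeps such a scheme cheap enough for real-time use on a mobile robot: the skin-color check only has to confirm candidates, not scan the whole frame for static skin-toned background.
    
    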

    Original language: English
    Title of host publication: Robot and Human Communication - Proceedings of the IEEE International Workshop
    Pages: 442-449
    Number of pages: 8
    Publication status: Published - 2001
    Event: 10th IEEE International Workshop on Robot and Human Communication, Bordeaux-Paris
    Duration: 2001 Sep 18 to 2001 Sep 21


    ASJC Scopus subject areas

    • Hardware and Architecture
    • Software


    Cite this

    Doi, M., Nakakita, M., Aoki, Y., & Hashimoto, S. (2001). Real-time vision system for autonomous mobile robot. In Robot and Human Communication - Proceedings of the IEEE International Workshop (pp. 442-449).