A conversation robot using head gesture recognition as para-linguistic information

Shinya Fujie, Yasushi Ejiri, Kei Nakajima, Yosuke Matsusaka, Tetsunori Kobayashi

Research output: Contribution to conference › Paper

43 Citations (Scopus)

Abstract

A conversation robot that recognizes the user's head gestures and uses the results as para-linguistic information is developed. In conversation, humans exchange linguistic information, which can be obtained by transcribing the utterance, and para-linguistic information, which aids the transmission of linguistic information. Para-linguistic information conveys nuances that linguistic information alone cannot, making conversation natural and effective. In this paper, we recognize the user's head gestures as para-linguistic information in the visual channel. We use the optical flow over the head region as the feature and model it with HMMs for recognition. In actual conversation, while the user performs a gesture, the robot may perform a gesture as well. In this situation, the image sequence captured by the camera mounted in the robot's eyes includes sway caused by the camera's own motion. To solve this problem, we introduce two techniques. The first concerns feature extraction: the optical flow of the body area is used to compensate for the swayed images. The second concerns the probability models: mode-dependent models are prepared with the MLLR model adaptation technique, and the models are switched according to the robot's motion mode. Experimental results show the effectiveness of these techniques.
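The sway-compensation idea described in the abstract can be sketched as follows — a minimal illustration, assuming dense optical-flow vectors have already been extracted for the head and body regions; the function name and array shapes are ours, not the paper's. Since the user's body is roughly static, the mean flow over the body region approximates the global motion induced by the robot's own camera movement, and subtracting it isolates the gesture motion:

```python
import numpy as np

def compensate_head_flow(head_flow: np.ndarray, body_flow: np.ndarray) -> np.ndarray:
    """Remove camera-induced sway from head-region optical flow.

    head_flow: (N, 2) array of (dx, dy) flow vectors in the head region
    body_flow: (M, 2) array of flow vectors in the (roughly static) body region
    """
    # Mean body-region flow serves as an estimate of global camera motion.
    sway = body_flow.mean(axis=0)
    # Subtracting it leaves the motion attributable to the head gesture itself.
    return head_flow - sway

# Example: camera pans right by ~2 px while the head nods/shakes.
head = np.array([[3.0, 1.0], [4.0, 1.0]])
body = np.array([[2.0, 0.0], [2.0, 0.0]])
compensated = compensate_head_flow(head, body)  # [[1., 1.], [2., 1.]]
```

The compensated flow vectors would then be used as HMM observation features; the paper's second technique (MLLR-adapted, mode-dependent models selected by robot motion mode) operates at the model level rather than the feature level.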

Original language: English
Pages: 159-164
Number of pages: 6
Publication status: Published - 2004 Dec 1
Event: RO-MAN 2004 - 13th IEEE International Workshop on Robot and Human Interactive Communication - Okayama, Japan
Duration: 2004 Sep 20 - 2004 Sep 22

Conference

Conference: RO-MAN 2004 - 13th IEEE International Workshop on Robot and Human Interactive Communication
Country: Japan
City: Okayama
Period: 04/9/20 - 04/9/22

ASJC Scopus subject areas

  • Engineering (all)

  • Cite this

    Fujie, S., Ejiri, Y., Nakajima, K., Matsusaka, Y., & Kobayashi, T. (2004). A conversation robot using head gesture recognition as para-linguistic information. 159-164. Paper presented at RO-MAN 2004 - 13th IEEE International Workshop on Robot and Human Interactive Communication, Okayama, Japan.