Recognition of emotional states in spoken dialogue with a robot

Kazunori Komatani, Ryosuke Ito, Tatsuya Kawahara, Hiroshi G. Okuno

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

For flexible interactions between a robot and humans, we address the issue of automatic recognition of human emotions during the interaction such as embarrassment, pleasure, and affinity. To construct classifiers of emotions, we used the dialogue data between a humanoid robot, Robovie, and children, which was collected with the WOZ (Wizard of Oz) method. Besides prosodic features extracted from a single utterance, characteristics specific to dialogues such as utterance intervals and differences with previous utterances were also used. We used the SVM (Support Vector Machine) as a classifier to recognize two temporary emotions such as embarrassment or pleasure, and the decision tree learning algorithm, C5.0, as a classifier to recognize persistent emotion, i.e. affinity. The accuracy of classification was 79% for embarrassment, 74% for pleasure, and 87% for affinity. The humanoid Robovie in which this emotion classification module was implemented demonstrated adaptive behaviors based on the emotions it recognized.
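The abstract describes feeding per-utterance prosodic features, plus dialogue-level features such as utterance intervals, into an SVM to classify a temporary emotion. A minimal sketch of that kind of setup is shown below using scikit-learn; the feature names, synthetic data, and labels are illustrative assumptions, not the paper's actual features or corpus.

```python
# Illustrative sketch of a two-class SVM over prosodic + dialogue features
# (e.g. embarrassment vs. not), in the spirit of the paper's setup.
# All feature names and data here are hypothetical.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Columns (assumed for illustration): mean F0, F0 range, energy, duration,
# utterance interval, and delta F0 against the previous utterance.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 4] > 0).astype(int)  # synthetic labels

# Standardize features, then fit an RBF-kernel SVM on a training split.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X[:150], y[:150])

# Evaluate on the held-out utterances.
acc = clf.score(X[150:], y[150:])
print(f"held-out accuracy: {acc:.2f}")
```

The paper reports separate classifiers per emotion (SVMs for the temporary emotions, C5.0 decision trees for the persistent one); this sketch shows only the SVM side.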

Original language: English
Title of host publication: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Editors: B. Orchard, C. Yang, M. Ali
Pages: 413-423
Number of pages: 11
Volume: 3029
Publication status: Published - 2004
Externally published: Yes
Event: 17th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems, IEA/AIE 2004 - Ottawa, Ont., Canada
Duration: 2004 May 17 – 2004 May 20

Other

Other: 17th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems, IEA/AIE 2004
Country: Canada
City: Ottawa, Ont.
Period: 04/5/17 – 04/5/20

Fingerprint

  • Classifiers
  • Robots
  • Decision trees
  • Learning algorithms
  • Support vector machines

ASJC Scopus subject areas

  • Hardware and Architecture

Cite this

Komatani, K., Ito, R., Kawahara, T., & Okuno, H. G. (2004). Recognition of emotional states in spoken dialogue with a robot. In B. Orchard, C. Yang, & M. Ali (Eds.), Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3029, pp. 413-423).

Recognition of emotional states in spoken dialogue with a robot. / Komatani, Kazunori; Ito, Ryosuke; Kawahara, Tatsuya; Okuno, Hiroshi G.

Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science). ed. / B. Orchard; C. Yang; M. Ali. Vol. 3029, 2004. p. 413-423.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Komatani, K, Ito, R, Kawahara, T & Okuno, HG 2004, Recognition of emotional states in spoken dialogue with a robot. in B Orchard, C Yang & M Ali (eds), Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science). vol. 3029, pp. 413-423, 17th International Conference on Industrial and Engineering Applications of Artificial Intelligence and Expert Systems, IEA/AIE 2004, Ottawa, Ont., Canada, 04/5/17.
Komatani K, Ito R, Kawahara T, Okuno HG. Recognition of emotional states in spoken dialogue with a robot. In Orchard B, Yang C, Ali M, editors, Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science). Vol. 3029. 2004. p. 413-423
Komatani, Kazunori ; Ito, Ryosuke ; Kawahara, Tatsuya ; Okuno, Hiroshi G. / Recognition of emotional states in spoken dialogue with a robot. Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science). editor / B. Orchard ; C. Yang ; M. Ali. Vol. 3029 2004. pp. 413-423
@inproceedings{ec5b08057cb34b15984899b593098f21,
title = "Recognition of emotional states in spoken dialogue with a robot",
abstract = "For flexible interactions between a robot and humans, we address the issue of automatic recognition of human emotions during the interaction such as embarrassment, pleasure, and affinity. To construct classifiers of emotions, we used the dialogue data between a humanoid robot, Robovie, and children, which was collected with the WOZ (Wizard of Oz) method. Besides prosodic features extracted from a single utterance, characteristics specific to dialogues such as utterance intervals and differences with previous utterances were also used. We used the SVM (Support Vector Machine) as a classifier to recognize two temporary emotions such as embarrassment or pleasure, and the decision tree learning algorithm, C5.0, as a classifier to recognize persistent emotion, i.e. affinity. The accuracy of classification was 79{\%} for embarrassment, 74{\%} for pleasure, and 87{\%} for affinity. The humanoid Robovie in which this emotion classification module was implemented demonstrated adaptive behaviors based on the emotions it recognized.",
author = "Kazunori Komatani and Ryosuke Ito and Tatsuya Kawahara and Okuno, {Hiroshi G.}",
year = "2004",
language = "English",
volume = "3029",
pages = "413--423",
editor = "B. Orchard and C. Yang and M. Ali",
booktitle = "Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)",

}

TY - GEN

T1 - Recognition of emotional states in spoken dialogue with a robot

AU - Komatani, Kazunori

AU - Ito, Ryosuke

AU - Kawahara, Tatsuya

AU - Okuno, Hiroshi G.

PY - 2004

Y1 - 2004

N2 - For flexible interactions between a robot and humans, we address the issue of automatic recognition of human emotions during the interaction such as embarrassment, pleasure, and affinity. To construct classifiers of emotions, we used the dialogue data between a humanoid robot, Robovie, and children, which was collected with the WOZ (Wizard of Oz) method. Besides prosodic features extracted from a single utterance, characteristics specific to dialogues such as utterance intervals and differences with previous utterances were also used. We used the SVM (Support Vector Machine) as a classifier to recognize two temporary emotions such as embarrassment or pleasure, and the decision tree learning algorithm, C5.0, as a classifier to recognize persistent emotion, i.e. affinity. The accuracy of classification was 79% for embarrassment, 74% for pleasure, and 87% for affinity. The humanoid Robovie in which this emotion classification module was implemented demonstrated adaptive behaviors based on the emotions it recognized.

AB - For flexible interactions between a robot and humans, we address the issue of automatic recognition of human emotions during the interaction such as embarrassment, pleasure, and affinity. To construct classifiers of emotions, we used the dialogue data between a humanoid robot, Robovie, and children, which was collected with the WOZ (Wizard of Oz) method. Besides prosodic features extracted from a single utterance, characteristics specific to dialogues such as utterance intervals and differences with previous utterances were also used. We used the SVM (Support Vector Machine) as a classifier to recognize two temporary emotions such as embarrassment or pleasure, and the decision tree learning algorithm, C5.0, as a classifier to recognize persistent emotion, i.e. affinity. The accuracy of classification was 79% for embarrassment, 74% for pleasure, and 87% for affinity. The humanoid Robovie in which this emotion classification module was implemented demonstrated adaptive behaviors based on the emotions it recognized.

UR - http://www.scopus.com/inward/record.url?scp=9444294052&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=9444294052&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:9444294052

VL - 3029

SP - 413

EP - 423

BT - Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)

A2 - Orchard, B.

A2 - Yang, C.

A2 - Ali, M.

ER -