Empathic embodied interfaces: Addressing users' affective state

H. Prendinger, H. Dohi, H. Wang, S. Mayer, M. Ishizuka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

34 Citations (Scopus)

Abstract

In this paper, we report on our efforts in developing affective character-based interfaces, i.e., interfaces that recognize and measure affective information of the user and address user affect by employing embodied characters. In particular, we describe the Empathic Companion, an animated interface agent that accompanies the user in the setting of a virtual job interview. This interface application takes physiological data (skin conductance and electromyography) of a user in real time, interprets them as emotions, and addresses the user's affective states in the form of empathic feedback. We present preliminary results from an exploratory study that aims to evaluate the impact of the Empathic Companion by measuring users' skin conductance and heart rate.
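The abstract does not detail how the physiological signals are mapped to emotions or to feedback. Purely as an illustration, the sketch below assumes a simple two-dimensional scheme in which skin conductance is read as arousal and electromyography as (negative) valence, and selects a canned empathic remark from the resulting label. All names, thresholds, and messages are invented for this sketch and are not taken from the paper.

```python
# Hypothetical sketch: mapping physiological readings to an emotion label
# and an empathic response. Thresholds and labels are illustrative only.

from dataclasses import dataclass


@dataclass
class Reading:
    skin_conductance: float  # relative to a resting baseline (assumed units)
    emg: float               # muscle activity relative to baseline (assumed units)


def classify_emotion(r: Reading,
                     arousal_threshold: float = 1.5,
                     valence_threshold: float = 1.2) -> str:
    """Crude classification: skin conductance as arousal, EMG as negative valence."""
    aroused = r.skin_conductance > arousal_threshold
    negative = r.emg > valence_threshold
    if aroused and negative:
        return "frustrated"
    if aroused:
        return "excited"
    if negative:
        return "displeased"
    return "relaxed"


# Invented feedback phrases standing in for the companion's empathic responses.
EMPATHIC_FEEDBACK = {
    "frustrated": "That question seemed difficult. Try to stay calm.",
    "excited": "You seem engaged. Keep it up!",
    "displeased": "Don't worry, the next question may suit you better.",
    "relaxed": "You appear composed. Well done.",
}


def respond(r: Reading) -> str:
    return EMPATHIC_FEEDBACK[classify_emotion(r)]


if __name__ == "__main__":
    # Example: elevated skin conductance and EMG after a hard interview question.
    print(respond(Reading(skin_conductance=2.3, emg=1.8)))
```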

Original language: English
Title of host publication: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Editors: E. Andre, L. Dybkjaer, W. Minker, P. Heisterkamp
Pages: 53-64
Number of pages: 12
Volume: 3068
Publication status: Published - 2004
Externally published: Yes
Event: Tutorial and Research Workshop, ADS 2004 - Kloster Irsee, Germany
Duration: 2004 Jun 14 - 2004 Jun 16

ASJC Scopus subject areas

  • Hardware and Architecture

Cite this

Prendinger, H., Dohi, H., Wang, H., Mayer, S., & Ishizuka, M. (2004). Empathic embodied interfaces: Addressing users' affective state. In E. Andre, L. Dybkjaer, W. Minker, & P. Heisterkamp (Eds.), Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3068, pp. 53-64).