Symmetric multimodality revisited: Unveiling users' physiological activity

Helmut Prendinger*, Mitsuru Ishizuka

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

In this paper, we describe our own stance on a research area called "Humatronics," which aims at establishing a (more) symmetric interaction relationship between humans and computer systems. In particular, we advocate a novel approach to understanding humans that is based on largely involuntary and unconscious physiological information and gaze behavior rather than on purposeful and conscious actions. "Understanding humans" here refers to users' states related to emotion and affect, attention and interest, and possibly even their intentions. A key feature of our approach is that it provides insight into a person's cognitive-motivational state without relying on cognitive judgements, such as answers to dedicated queries. Lifelike interface agents are endowed with synthetic bodies and faces and can be considered prime candidates for rebalancing the asymmetric relationship in current human-computer interaction. As example applications, we report on two recent studies that employed lifelike agents as presenters or as interaction partners for users. The resulting interactions can be seen as initial steps toward symmetric multimodality in user interfaces.
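The abstract's core idea, reading a user's state from involuntary signals rather than from explicit answers, is commonly operationalized by comparing live sensor readings against a per-user resting baseline. The Python sketch below is a minimal, hypothetical illustration of such a mapping; the signal names, thresholds, and emotion labels are assumptions for exposition only, not the authors' actual method.

    # Hypothetical sketch: mapping physiological readings to a coarse
    # affective label, in the spirit of the approach described above.
    # Signal names, baselines, and thresholds are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class PhysioSample:
        skin_conductance: float  # microsiemens; tends to rise with arousal
        emg_activity: float      # muscle tension; read here as negative valence

    def classify_affect(sample: PhysioSample,
                        sc_baseline: float,
                        emg_baseline: float,
                        sc_margin: float = 0.5,
                        emg_margin: float = 0.5) -> str:
        """Return a coarse emotion label from deviations against a
        per-user resting baseline (assumed to be calibrated beforehand)."""
        aroused = sample.skin_conductance > sc_baseline + sc_margin
        negative = sample.emg_activity > emg_baseline + emg_margin
        if aroused and negative:
            return "frustrated"   # high arousal, negative valence
        if aroused:
            return "engaged"      # high arousal, non-negative valence
        return "relaxed"          # low arousal

    if __name__ == "__main__":
        sample = PhysioSample(skin_conductance=3.2, emg_activity=1.8)
        print(classify_affect(sample, sc_baseline=2.4, emg_baseline=1.0))

Note that such a rule-based mapping requires no cognitive judgement from the user, which is precisely the property the abstract highlights; real systems would replace the fixed margins with statistically calibrated ones.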

Original language: English
Pages (from-to): 692-698
Number of pages: 7
Journal: IEEE Transactions on Industrial Electronics
Volume: 54
Issue number: 2
DOIs
Publication status: Published - 2007 Apr
Externally published: Yes

Keywords

  • User interface human factors

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Instrumentation
