EmoHeart: Conveying emotions in Second Life based on affect sensing from text

Alena Neviarouskaya, Helmut Prendinger, Mitsuru Ishizuka

Research output: Contribution to journal › Article

27 Citations (Scopus)

Abstract

The 3D virtual world of Second Life imitates a form of real life by providing a space for rich interactions and social events. Second Life encourages people to establish or strengthen interpersonal relations, to share ideas, to gain new experiences, and to feel genuine emotions accompanying all the adventures of virtual reality. Undoubtedly, emotions play a powerful role in communication. However, to trigger the visual display of a user's affective state in a virtual world, the user has to manually assign an appropriate facial expression or gesture to his or her avatar. Affect sensing from text, which enables the automatic expression of emotions in the virtual environment, is a method to avoid manual control by the user and to enrich remote communication effortlessly. In this paper, we describe a lexical rule-based approach to the recognition of emotions from text and an application of the developed Affect Analysis Model in Second Life. Based on the results of the Affect Analysis Model, the developed EmoHeart (an object in Second Life) triggers animations of avatar facial expressions and visualizes emotion through heart-shaped textures.
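The full Affect Analysis Model is described in the paper itself; purely as an illustration of the general idea of lexical rule-based affect sensing, the sketch below classifies a chat sentence by looking up emotion-bearing words in a lexicon and applying simple negation and intensifier rules. The lexicon entries, rule set, and function names here are invented for this example and are not the authors' actual model or emotion inventory.

```python
# Toy sketch of a lexical rule-based affect classifier (illustrative only).
# The real Affect Analysis Model is far richer: it handles symbolic cues,
# abbreviations, syntactic structure, and more elaborate composition rules.

import re

# Hypothetical mini-lexicon: word -> (emotion label, base intensity in [0, 1]).
LEXICON = {
    "happy":     ("joy", 0.8),
    "glad":      ("joy", 0.6),
    "sad":       ("sadness", 0.7),
    "afraid":    ("fear", 0.8),
    "angry":     ("anger", 0.9),
    "surprised": ("surprise", 0.7),
}

INTENSIFIERS = {"very": 1.5, "really": 1.4, "slightly": 0.5}
NEGATIONS = {"not", "never", "no"}

def classify(sentence: str):
    """Return (dominant_emotion, intensity), or ('neutral', 0.0) if no cue fires."""
    tokens = re.findall(r"[a-z']+", sentence.lower())
    scores = {}
    for i, tok in enumerate(tokens):
        if tok not in LEXICON:
            continue
        label, strength = LEXICON[tok]
        # Simple context rule: inspect the immediately preceding word.
        prev = tokens[i - 1] if i > 0 else ""
        if prev in NEGATIONS:
            continue                          # "not happy" -> cue suppressed
        strength *= INTENSIFIERS.get(prev, 1.0)
        scores[label] = max(scores.get(label, 0.0), min(strength, 1.0))
    if not scores:
        return ("neutral", 0.0)
    return max(scores.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    print(classify("I am really happy today!"))   # ('joy', 1.0)
    print(classify("I am not happy about this"))  # ('neutral', 0.0)
```

In the EmoHeart setting, the output of such a classifier (an emotion label plus an intensity) is what drives the in-world visualization: the label selects the avatar facial-expression animation and heart-shaped texture, and the intensity can scale the strength of the display.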

Original language: English
Article number: 209801
Journal: Advances in Human-Computer Interaction
Volume: 2010
Publication status: Published - 2010
Externally published: Yes

ASJC Scopus subject areas

  • Human-Computer Interaction

