EmoHeart: Conveying emotions in Second Life based on affect sensing from text

Alena Neviarouskaya, Helmut Prendinger, Mitsuru Ishizuka

Research output: Contribution to journal › Article

26 Citations (Scopus)

Abstract

The 3D virtual world of Second Life imitates a form of real life by providing a space for rich interactions and social events. Second Life encourages people to establish or strengthen interpersonal relations, to share ideas, to gain new experiences, and to feel genuine emotions accompanying all adventures of virtual reality. Undoubtedly, emotions play a powerful role in communication. However, to trigger a visual display of an affective state in a virtual world, the user has to manually assign an appropriate facial expression or gesture to his or her own avatar. Affect sensing from text, which enables automatic expression of emotions in the virtual environment, avoids this manual control and enriches remote communication effortlessly. In this paper, we describe a lexical rule-based approach to the recognition of emotions from text and an application of the developed Affect Analysis Model in Second Life. Based on the result of the Affect Analysis Model, the developed EmoHeart (an object in Second Life) triggers animations of avatar facial expressions and visualizes the sensed emotion with heart-shaped textures.
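To make the pipeline described in the abstract concrete, the sketch below illustrates, in a minimal and hypothetical way, how lexical rule-based affect sensing could map a chat message to an avatar animation and a heart-shaped texture. It is not the authors' Affect Analysis Model or the EmoHeart implementation: the lexicon entries, intensifier and negation rules, and the animation/texture names are invented placeholders for illustration only.

```python
# Minimal, illustrative sketch of lexical rule-based affect sensing,
# loosely in the spirit of the paper's pipeline. All lexicon entries,
# weights, and animation/texture names below are hypothetical.

from dataclasses import dataclass

# Tiny hand-made emotion lexicon (hypothetical entries: word -> (emotion, weight)).
EMOTION_LEXICON = {
    "happy": ("joy", 0.8), "glad": ("joy", 0.6), "love": ("joy", 0.9),
    "sad": ("sadness", 0.7), "miss": ("sadness", 0.5),
    "angry": ("anger", 0.8), "hate": ("anger", 0.9),
    "scared": ("fear", 0.8), "wow": ("surprise", 0.6),
}

INTENSIFIERS = {"very": 1.5, "so": 1.3, "really": 1.4}
NEGATIONS = {"not", "never", "no"}

# Hypothetical mapping from a recognized emotion to a facial-expression
# animation and a heart-shaped texture shown above the avatar.
EMOTION_TO_VISUALS = {
    "joy": ("smile_animation", "heart_red"),
    "sadness": ("frown_animation", "heart_blue"),
    "anger": ("scowl_animation", "heart_black"),
    "fear": ("startle_animation", "heart_violet"),
    "surprise": ("gasp_animation", "heart_yellow"),
}


@dataclass
class AffectResult:
    emotion: str      # dominant emotion label, or "neutral"
    intensity: float  # clipped to the range 0.0 .. 1.0


def analyze_text(message: str) -> AffectResult:
    """Very small rule-based pass over a single chat message."""
    tokens = message.lower().split()
    scores: dict[str, float] = {}
    for i, token in enumerate(tokens):
        word = token.strip(".,!?")
        if word not in EMOTION_LEXICON:
            continue
        emotion, weight = EMOTION_LEXICON[word]
        # Simple intensifier rule based on the preceding token.
        if i > 0 and tokens[i - 1] in INTENSIFIERS:
            weight *= INTENSIFIERS[tokens[i - 1]]
        # Crude negation rule: a nearby negation suppresses the cue.
        if any(t in NEGATIONS for t in tokens[max(0, i - 2):i]):
            continue
        scores[emotion] = scores.get(emotion, 0.0) + weight
    if not scores:
        return AffectResult("neutral", 0.0)
    dominant = max(scores, key=scores.get)
    return AffectResult(dominant, min(scores[dominant], 1.0))


def emoheart_trigger(message: str) -> tuple[str, str] | None:
    """Return (animation, texture) for a non-neutral message, else None."""
    result = analyze_text(message)
    if result.emotion == "neutral":
        return None
    return EMOTION_TO_VISUALS[result.emotion]


if __name__ == "__main__":
    print(emoheart_trigger("I am so happy to see you!"))  # ('smile_animation', 'heart_red')
    print(emoheart_trigger("I am not angry."))            # None
```

In the system described by the paper, the recognized emotion category (and its intensity) is what drives the choice of facial-expression animation and heart-shaped texture; the sketch above only mimics that interface under the stated assumptions.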

Original language: English
Article number: 209801
Journal: Advances in Human-Computer Interaction
Volume: 2010
DOI: 10.1155/2010/209801
Publication status: Published - 2010
Externally published: Yes

ASJC Scopus subject areas

  • Human-Computer Interaction

Cite this

EmoHeart: Conveying emotions in Second Life based on affect sensing from text. / Neviarouskaya, Alena; Prendinger, Helmut; Ishizuka, Mitsuru.

In: Advances in Human-Computer Interaction, Vol. 2010, 209801, 2010.

Neviarouskaya, Alena; Prendinger, Helmut; Ishizuka, Mitsuru. / EmoHeart: Conveying emotions in Second Life based on affect sensing from text. In: Advances in Human-Computer Interaction. 2010; Vol. 2010.
@article{debc54ffd53140ce97e3f78166752449,
title = "EmoHeart: Conveying emotions in second life based on affect sensing from text",
abstract = "The 3D virtual world of Second Life imitates a form of real life by providing a space for rich interactions and social events. Second Life encourages people to establish or strengthen interpersonal relations, to share ideas, to gain new experiences, and to feel genuine emotions accompanying all adventures of virtual reality. Undoubtedly, emotions play a powerful role in communication. However, to trigger visual display of user's affective state in a virtual world, user has to manually assign appropriate facial expression or gesture to own avatar. Affect sensing from text, which enables automatic expression of emotions in the virtual environment, is a method to avoid manual control by the user and to enrich remote communications effortlessly. In this paper, we describe a lexical rule-based approach to recognition of emotions from text and an application of the developed Affect Analysis Model in Second Life. Based on the result of the Affect Analysis Model, the developed EmoHeart (object in Second Life) triggers animations of avatar facial expressions and visualizes emotion by heart-shaped textures.",
author = "Alena Neviarouskaya and Helmut Prendinger and Mitsuru Ishizuka",
year = "2010",
doi = "10.1155/2010/209801",
language = "English",
volume = "2010",
journal = "Advances in Human-Computer Interaction",
issn = "1687-5893",
publisher = "Hindawi Publishing Corporation",

}

TY - JOUR

T1 - EmoHeart

T2 - Conveying emotions in Second Life based on affect sensing from text

AU - Neviarouskaya, Alena

AU - Prendinger, Helmut

AU - Ishizuka, Mitsuru

PY - 2010

Y1 - 2010

UR - http://www.scopus.com/inward/record.url?scp=78349258931&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=78349258931&partnerID=8YFLogxK

U2 - 10.1155/2010/209801

DO - 10.1155/2010/209801

M3 - Article

AN - SCOPUS:78349258931

VL - 2010

JO - Advances in Human-Computer Interaction

JF - Advances in Human-Computer Interaction

SN - 1687-5893

M1 - 209801

ER -