Internet communication using real-time facial expression analysis and synthesis

Naiwala P. Chandrasiri, Takeshi Naemura, Mitsuru Ishizuka, Hiroshi Harashima, István Barakonyi

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

A system that animates 3D facial agents based on real-time facial expression analysis techniques and research on synthesizing facial expressions and text-to-speech capabilities is now available. The system consists of three main modules: a real-time facial expression analysis component that calculates MPEG-4 facial animation parameters (FAPs), an effective 3D agent with facial expression synthesis and text-to-speech capabilities, and a communication module. Subjective evaluations involving graduate and undergraduate students confirm the communication system's effectiveness. Potential applications include virtual teleconferencing, entertainment, computer games, human-to-human communication training, and distance learning.
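As a rough illustration of the pipeline the abstract describes (real-time analysis producing per-frame MPEG-4 facial animation parameters, a communication module carrying them over the network, and agent-side synthesis), the following Python sketch shows one possible decomposition. All class and function names, the UDP transport, and the JSON encoding are assumptions made for illustration; the paper does not specify these implementation details.

# Hypothetical sketch of the three-module pipeline described in the abstract.
# None of these names come from the paper; they only illustrate how per-frame
# MPEG-4 facial animation parameters (FAPs) might be extracted, transmitted,
# and applied to a remote 3D agent.

import json
import socket
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class FAPFrame:
    """One frame of MPEG-4 facial animation parameters (FAP id -> value)."""
    timestamp_ms: int
    faps: Dict[int, float] = field(default_factory=dict)

    def encode(self) -> bytes:
        # Compact JSON payload; the actual system would more likely use a
        # binary MPEG-4 FAP stream rather than JSON.
        return json.dumps({"t": self.timestamp_ms, "faps": self.faps}).encode()

    @staticmethod
    def decode(payload: bytes) -> "FAPFrame":
        obj = json.loads(payload.decode())
        return FAPFrame(obj["t"], {int(k): v for k, v in obj["faps"].items()})


class ExpressionAnalyzer:
    """Stand-in for the real-time facial expression analysis module."""

    def analyze(self, frame_index: int) -> FAPFrame:
        # A real implementation would track facial features in a camera frame
        # and convert their displacements into FAP values.
        return FAPFrame(timestamp_ms=frame_index * 33, faps={3: 0.4, 5: -0.2})


class CommunicationModule:
    """Sends encoded FAP frames to a peer over UDP (illustrative only)."""

    def __init__(self, host: str, port: int):
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send(self, frame: FAPFrame) -> None:
        self.sock.sendto(frame.encode(), self.addr)


class AgentSynthesizer:
    """Stand-in for the 3D agent that renders expressions and speaks text."""

    def apply(self, frame: FAPFrame) -> None:
        print(f"animate agent at t={frame.timestamp_ms}ms with {frame.faps}")


if __name__ == "__main__":
    analyzer = ExpressionAnalyzer()
    sender = CommunicationModule("127.0.0.1", 9000)
    agent = AgentSynthesizer()

    for i in range(3):                      # a few camera frames
        fap_frame = analyzer.analyze(i)     # analysis module
        sender.send(fap_frame)              # communication module
        agent.apply(fap_frame)              # synthesis module (local echo)

In the actual system, the receiving side would decode the FAP stream and drive a rendered 3D agent, optionally combined with text-to-speech; this sketch only mirrors the module boundaries suggested by the abstract.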

Original language: English
Pages (from-to): 20-29
Number of pages: 10
Journal: IEEE Multimedia
Volume: 11
Issue number: 3
DOIs: https://doi.org/10.1109/MMUL.2004.10
Publication status: Published - 2004 Jul
Externally published: Yes

Fingerprint

  • Facial Expression
  • Internet
  • Synthesis
  • Teleconferencing
  • Real-time
  • Text-to-speech
  • Computer games
  • Distance learning
  • Communication
  • Communication systems
  • Students
  • Module
  • MPEG-4
  • Subjective evaluation
  • Calculate
  • Human

ASJC Scopus subject areas

  • Hardware and Architecture
  • Information Systems
  • Computer Graphics and Computer-Aided Design
  • Software
  • Theoretical Computer Science
  • Computational Theory and Mathematics

Cite this

Chandrasiri, N. P., Naemura, T., Ishizuka, M., Harashima, H., & Barakonyi, I. (2004). Internet communication using real-time facial expression analysis and synthesis. IEEE Multimedia, 11(3), 20-29. https://doi.org/10.1109/MMUL.2004.10
