Extracting facial motion parameters by tracking feature points

Takahiro Otsuka, Jun Ohya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

A method for extracting facial motion parameters is proposed. The method consists of three steps. First, the feature points of the face, selected automatically in the first frame, are tracked in successive frames. Then, the feature points are connected with Delaunay triangulation so that the motion of each point relative to the surrounding points can be computed. Finally, muscle motions are estimated based on motions of the feature points placed near each muscle. The experiments showed that the proposed method can extract facial motion parameters accurately. In addition, the facial motion parameters are used to render a facial animation sequence.
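
The abstract describes a three-step pipeline: automatic selection of facial feature points in the first frame, tracking of those points through successive frames, and computation of each point's motion relative to its Delaunay neighbours as input to muscle-motion estimation. The sketch below is a minimal illustration of that pipeline under assumptions of our own, not the authors' implementation: OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade tracker stand in for the paper's feature selection and tracking, SciPy's Delaunay triangulation provides the neighbourhood structure, and the final muscle-mapping step is only indicated in a comment.

# Illustrative sketch only; detector, tracker, and parameters are assumptions,
# not the method of Otsuka & Ohya (1999).
import cv2
import numpy as np
from scipy.spatial import Delaunay


def track_relative_motion(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        raise IOError("cannot read first frame")
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

    # Step 1: select feature points automatically in the first frame
    # (Shi-Tomasi corners used here as a stand-in).
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                  qualityLevel=0.01, minDistance=7)
    pts = pts.reshape(-1, 2).astype(np.float32)

    # Step 2: connect the points with a Delaunay triangulation so that
    # each point's surrounding neighbours are known.
    tri = Delaunay(pts)
    indptr, neighbours = tri.vertex_neighbor_vertices

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Track the points into the next frame (pyramidal Lucas-Kanade;
        # lost-point handling via the status flags is omitted in this sketch).
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts.reshape(-1, 1, 2), None)
        new_pts = new_pts.reshape(-1, 2)
        flow = new_pts - pts

        # Motion of each point relative to the mean motion of its
        # Delaunay neighbours.
        rel_motion = np.zeros_like(flow)
        for i in range(len(pts)):
            nbrs = neighbours[indptr[i]:indptr[i + 1]]
            if len(nbrs):
                rel_motion[i] = flow[i] - flow[nbrs].mean(axis=0)

        # Step 3 (not shown): map rel_motion of the points placed near each
        # facial muscle to that muscle's motion parameter.
        yield rel_motion

        prev_gray, pts = gray, new_pts

Iterating over track_relative_motion("face.avi") would yield one array of relative displacements per frame; the filename is hypothetical.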

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer Verlag
Pages: 433-444
Number of pages: 12
Volume: 1554
ISBN (Print): 3540657622, 9783540657620
DOI: https://doi.org/10.1007/3-540-48962-2_30
Publication status: Published - 1999
Externally published: Yes
Event: 1st International Conference on Advanced Multimedia Content Processing, AMCP 1998 - Osaka, Japan
Duration: 1998 Nov 9 - 1998 Nov 11

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1554
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 1st International Conference on Advanced Multimedia Content Processing, AMCP 1998
Country: Japan
City: Osaka
Period: 98/11/9 - 98/11/11

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Otsuka, T., & Ohya, J. (1999). Extracting facial motion parameters by tracking feature points. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1554, pp. 433-444). Springer Verlag. https://doi.org/10.1007/3-540-48962-2_30

@inproceedings{ad4bc181fca041bf82a2febe457558dc,
title = "Extracting facial motion parameters by tracking feature points",
abstract = "A method for extracting facial motion parameters is pro- posed. The method consists of three steps. First, the feature points of the face, selected automatically in the first frame, are tracked in succes- sive frames. Then, the feature points are connected with Delaunay tri- angulation so that the motion of each point relative to the surrounding points can be computed. Finally, muscle motions are estimated based on motions of the feature points placed near each muscle. The experiments showed that the proposed method can extract facial motion parameters accurately. In addition, the facial motion parameters are used to render a facial animation sequence.",
author = "Takahiro Otsuka and Jun Ohya",
year = "1999",
doi = "10.1007/3-540-48962-2_30",
language = "English",
isbn = "3540657622",
volume = "1554",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "433--444",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",

}
