Real-time estimation of head motion using weak perspective epipolar geometry

T. Otsuka, Jun Ohya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

For face and facial expression recognition, it is necessary to estimate head motion in order to track a head continuously. This paper proposes a new method for estimating head motion using the epipolar geometry of a weak perspective projection model. In this method, first, the head region is segmented from the luminance gradient by approximating the contour of the head as a circle. Then, feature points such as local extrema or saddle points of the luminance distribution are tracked over successive frames. Finally, the angles of rotation of the head between two successive frames are estimated from the coordinates of the feature points successfully tracked in both frames. Experiments were performed on a workstation in real time, and the results showed that the method performs well in estimating head motion.
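Two of the steps described in the abstract lend themselves to a short illustration: selecting feature points at local extrema and saddle points of the smoothed luminance, and fitting the affine (weak perspective) epipolar constraint to the tracked correspondences. The sketch below is not the authors' implementation; the function names, the Gaussian smoothing scale, and the gradient threshold are illustrative assumptions, and the paper's recovery of rotation angles from the constraint coefficients is not reproduced here.

import numpy as np
from scipy.ndimage import gaussian_filter

def detect_luminance_features(image, sigma=2.0, grad_frac=1e-2):
    # Smooth the luminance so critical points are stable across frames.
    L = gaussian_filter(image.astype(np.float64), sigma)
    Ly, Lx = np.gradient(L)            # axis 0 = rows ("y"), axis 1 = cols ("x")
    Lyy, Lyx = np.gradient(Ly)
    Lxy, Lxx = np.gradient(Lx)
    grad_mag = np.hypot(Lx, Ly)
    det_hessian = Lxx * Lyy - Lxy * Lyx
    near_critical = grad_mag < grad_frac * grad_mag.max()
    extrema = np.argwhere(near_critical & (det_hessian > 0))   # maxima / minima
    saddles = np.argwhere(near_critical & (det_hessian < 0))   # saddle points
    return extrema, saddles

def affine_epipolar_coefficients(pts_a, pts_b):
    # Fit the affine epipolar constraint a*x' + b*y' + c*x + d*y + e = 0 to
    # corresponding points pts_a = (x, y) and pts_b = (x', y'), each of shape
    # (N, 2) with N >= 4, as the right singular vector of the design matrix
    # with the smallest singular value.
    x, y = pts_a[:, 0], pts_a[:, 1]
    xp, yp = pts_b[:, 0], pts_b[:, 1]
    A = np.column_stack([xp, yp, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(A)
    return vt[-1]

Under a weak perspective camera, the inter-frame rotation angles can be derived from these five constraint coefficients, which is the step the paper carries out; that derivation is omitted in this sketch.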

Original language: English
Title of host publication: Proceedings - 4th IEEE Workshop on Applications of Computer Vision, WACV 1998
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 220-225
Number of pages: 6
Volume: 1998-October
ISBN (Electronic): 0818686065, 9780818686061
DOIs: 10.1109/ACV.1998.732883
Publication status: Published - 1998 Jan 1
Externally published: Yes
Event: 4th IEEE Workshop on Applications of Computer Vision, WACV 1998 - Princeton, United States
Duration: 1998 Oct 19 - 1998 Oct 21

Other

Other: 4th IEEE Workshop on Applications of Computer Vision, WACV 1998
Country: United States
City: Princeton
Period: 98/10/19 - 98/10/21

Fingerprint

Luminance
Geometry
Experiments

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Signal Processing

Cite this

Otsuka, T., & Ohya, J. (1998). Real-time estimation of head motion using weak perspective epipolar geometry. In Proceedings - 4th IEEE Workshop on Applications of Computer Vision, WACV 1998 (Vol. 1998-October, pp. 220-225). [732883] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ACV.1998.732883
