Real-time moving human face synthesis using a parallel computer network

Osamu Hasegawa, Wiwat Wongwarawipat, Chil Woo Lee, Mitsuru Ishizuka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The authors describe a real-time image-synthesis method for a moving natural human face. The prototype system uses a parallel computer with a special high-speed visual data bus named VIT; 16 VITs are employed for image synthesis. A texture-mapping method is adopted to synthesize natural images, and the system can synthesize moving natural human face images and emotional expressions at up to 25 frames/s. Some samples of synthesized images are shown, and the hardware and software are described.
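The abstract names texture mapping as the core synthesis technique but gives no implementation details. As illustration only, the sketch below shows the standard barycentric formulation of triangle texture mapping (a textbook approach, not the paper's VIT-based parallel implementation; all function names and the toy texture are assumptions for this example):

```python
# Minimal sketch of triangle texture mapping via barycentric coordinates.
# This is an assumed textbook formulation, not the method from the paper:
# each output pixel inside a triangle is expressed as a weighted sum of
# the triangle's vertices, and the same weights index into the texture.

def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def map_triangle(texture, tex_tri, out_w, out_h, out_tri):
    """Warp the texture region under tex_tri onto out_tri in a new image."""
    out = [[0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            u, v, w = barycentric((x, y), *out_tri)
            if min(u, v, w) < 0:  # pixel lies outside the triangle
                continue
            # The same weights interpolate the source texture coordinates.
            tx = u * tex_tri[0][0] + v * tex_tri[1][0] + w * tex_tri[2][0]
            ty = u * tex_tri[0][1] + v * tex_tri[1][1] + w * tex_tri[2][1]
            out[y][x] = texture[int(ty + 0.5)][int(tx + 0.5)]
    return out

# Tiny example: an 8x8 gradient texture mapped onto a matching triangle.
texture = [[x + y for x in range(8)] for y in range(8)]
tex_tri = [(0, 0), (7, 0), (0, 7)]
out = map_triangle(texture, tex_tri, 8, 8, [(0, 0), (7, 0), (0, 7)])
```

In a face-animation setting, the output triangles would come from a deforming face mesh while the texture triangles stay fixed on a photographed face image; distributing disjoint triangles (or scanline ranges) across processors is one natural way such a loop could be parallelized.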

Original language: English
Title of host publication: IECON Proceedings (Industrial Electronics Conference)
Place of publication: Los Alamitos, CA, United States
Publisher: IEEE
Pages: 1380-1385
Number of pages: 6
Volume: 2
Publication status: Published - 1991
Externally published: Yes
Event: Proceedings of the 1991 International Conference on Industrial Electronics, Control and Instrumentation - IECON '91 - Kobe, Japan
Duration: 1991 Oct 28 - 1991 Nov 1


ASJC Scopus subject areas

  • Electrical and Electronic Engineering

Cite this

Hasegawa, O., Wongwarawipat, W., Lee, C. W., & Ishizuka, M. (1991). Real-time moving human face synthesis using a parallel computer network. In IECON Proceedings (Industrial Electronics Conference) (Vol. 2, pp. 1380-1385). IEEE.