Automatic head-movement control for emotional speech

Shin Ichi Kawamoto, Tatsuo Yotsukura, Shigeo Morishima, Satoshi Nakamura

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Head movements could be automatically generated from speech data. The expression of head movement could also be controlled by user-defined emotional factors, as shown in the video demonstration.
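    The poster's idea, driving head motion from speech and modulating it by a user-defined emotional factor, can be sketched roughly as follows. This is an illustrative assumption only: the function name, the energy-to-pitch-angle mapping, and the gain values are hypothetical and not the authors' method.

    ```python
    def head_pitch_angles(frame_energies, emotion_gain=1.0):
        """Map per-frame speech energy to head pitch angles (degrees).

        Hypothetical mapping: louder frames tilt the head further, and
        emotion_gain scales the overall amplitude (e.g. > 1.0 for more
        expressive, "excited" speech; < 1.0 for subdued speech).
        """
        peak = max(frame_energies) or 1.0  # avoid division by zero on silence
        return [emotion_gain * 15.0 * (e / peak) for e in frame_energies]

    # Example: four speech frames with rising and falling energy.
    angles = head_pitch_angles([0.2, 0.8, 0.5, 1.0], emotion_gain=1.2)
    ```

    A real system of this kind would typically derive richer prosodic features (pitch, energy contours) and drive all three rotation axes, but the scaling-by-emotion idea is the same.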

    Original language: English
    Title of host publication: ACM SIGGRAPH 2005 Posters, SIGGRAPH 2005
    Publisher: Association for Computing Machinery, Inc
    Pages: 28
    Number of pages: 1
    DOI: 10.1145/1186954.1186986
    Publication status: Published - 2005 Jul 31
    Event: International Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 2005 - Los Angeles, United States
    Duration: 2005 Jul 31 - 2005 Aug 4

    Other

    Event: International Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 2005
    Country: United States
    City: Los Angeles
    Period: 05/7/31 - 05/8/4

    ASJC Scopus subject areas

    • Software
    • Computer Graphics and Computer-Aided Design


    Cite this

    Kawamoto, S. I., Yotsukura, T., Morishima, S., & Nakamura, S. (2005). Automatic head-movement control for emotional speech. In ACM SIGGRAPH 2005 Posters, SIGGRAPH 2005 (p. 28). Association for Computing Machinery, Inc. https://doi.org/10.1145/1186954.1186986