Interrelating physical feature of facial expression and its impression

K. Suzuki, H. Yamada, S. Hashimoto

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    2 Citations (Scopus)

    Abstract

    The purpose of this work is to analyze human perception using a model of the interrelationship between the physical features of a face and the emotional impression it conveys, built with a neural network capable of learning nonlinear mappings. Human impressions of faces can be visualized in a two- or three-dimensional semantic space that is nonlinearly mapped from the facial physical parameters. Moreover, by obtaining the inverse nonlinear mapping, facial expressions can be reproduced from the semantic parameters. This approach not only performs well compared with conventional statistical methods, but can also map new data unseen during training, which allows the appropriateness of the learned mapping function to be examined.
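    The record does not specify the network architecture the authors used, so the pipeline the abstract describes can only be sketched generically. The following is a minimal sketch, assuming a one-hidden-layer MLP trained by gradient descent: one network maps "physical feature" vectors to 2-D "semantic impression" coordinates, and a second network approximates the inverse mapping. All feature and semantic data here are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np

    # Hedged sketch: a generic one-hidden-layer tanh MLP standing in for the
    # paper's unspecified network. Forward net: physical features -> 2-D
    # semantic space. Inverse net: semantic coordinates -> features.
    rng = np.random.default_rng(0)

    def init(n_in, n_hid, n_out):
        return {
            "W1": rng.normal(0, 0.5, (n_in, n_hid)),
            "b1": np.zeros(n_hid),
            "W2": rng.normal(0, 0.5, (n_hid, n_out)),
            "b2": np.zeros(n_out),
        }

    def forward(p, X):
        h = np.tanh(X @ p["W1"] + p["b1"])     # hidden activations
        return h, h @ p["W2"] + p["b2"]        # linear output layer

    def train(p, X, Y, lr=0.05, epochs=3000):
        # Full-batch gradient descent on mean squared error.
        for _ in range(epochs):
            h, out = forward(p, X)
            err = out - Y
            gW2 = h.T @ err / len(X)
            gb2 = err.mean(axis=0)
            dh = (err @ p["W2"].T) * (1 - h ** 2)   # backprop through tanh
            gW1 = X.T @ dh / len(X)
            gb1 = dh.mean(axis=0)
            p["W1"] -= lr * gW1; p["b1"] -= lr * gb1
            p["W2"] -= lr * gW2; p["b2"] -= lr * gb2
        return p

    # Synthetic stand-ins: 6 "facial physical parameters" per sample, and a
    # nonlinear 2-D "semantic impression" (e.g. two rating axes).
    X = rng.uniform(-1, 1, (200, 6))
    Y = np.c_[np.sin(X[:, 0] + X[:, 1]), X[:, 2] * X[:, 3]]

    fwd = train(init(6, 16, 2), X, Y)   # features -> semantic space
    inv = train(init(2, 16, 6), Y, X)   # semantic space -> features (inverse)

    _, Y_hat = forward(fwd, X)
    print("forward-mapping fit MSE:", np.mean((Y_hat - Y) ** 2))
    ```

    As in the abstract, the learned mapping could then be checked on data held out from training; the inverse net gives only an approximate reconstruction, since a 2-D semantic point generally underdetermines the full feature vector.
    
    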

    Original language: English
    Title of host publication: Proceedings of the International Joint Conference on Neural Networks
    Pages: 1864-1869
    Number of pages: 6
    Volume: 3
    Publication status: Published - 2001
    Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC
    Duration: 2001 Jul 15 - 2001 Jul 19



    ASJC Scopus subject areas

    • Software

    Cite this

    Suzuki, K., Yamada, H., & Hashimoto, S. (2001). Interrelating physical feature of facial expression and its impression. In Proceedings of the International Joint Conference on Neural Networks (Vol. 3, pp. 1864-1869).


    Scopus record: SCOPUS:0034862850
    http://www.scopus.com/inward/record.url?scp=0034862850&partnerID=8YFLogxK

    VL - 3

    SP - 1864

    EP - 1869

    BT - Proceedings of the International Joint Conference on Neural Networks

    ER -