Facial analysis aided human gesture recognition for Human Computer Interaction

Dan Luo, Hua Gao, Hazım Kemal Ekenel, Jun Ohya

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Human gesture recognition systems are natural to use for achieving intelligent Human Computer Interaction (HCI). These systems should memorize the specific user and enable the user to gesture naturally, without wearing any special devices. Extracting the different components of visual actions from human gestures, such as the shape and motion of the hands, facial expression, and torso, is a key task in gesture recognition. So far, in the field of gesture recognition, most previous work has focused only on hand motion features and has required the user to wear special devices. In this paper, we present an appearance-based multimodal gesture recognition framework that combines different modalities of features, such as face identity, facial expression, and hand motion, extracted from image frames captured directly by a web camera. We use 12 classes of human gestures with facial expressions, taken from American Sign Language, that carry neutral, negative (e.g. "angry"), and positive (e.g. "excited") meanings. A condensation-based algorithm is adopted for classification. We collected a data set with three recording sessions and conducted experiments with different combination techniques. Experimental results showed that the performance of hand gesture recognition is improved by adding facial analysis.
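
    The abstract mentions a condensation-based algorithm for classifying gestures from combined face-identity, facial-expression, and hand-motion features. The paper itself gives the actual formulation; the Python sketch below is only a minimal illustration of how a CONDENSATION-style (particle filter) temporal classifier of this kind can work. Every name, parameter, and the template-matching likelihood in it is an assumption made for illustration, not the authors' implementation.

    import numpy as np

    def condensation_classify(frames, templates, n_particles=500, sigma=0.5, seed=None):
        """CONDENSATION-style (particle filter) gesture classification sketch.

        frames:    (T, D) array of per-frame multimodal feature vectors
                   (e.g. hand-motion features concatenated with facial-expression scores).
        templates: dict mapping a gesture label to an (L, D) template trajectory.
        Returns the label whose particles survive resampling most often.
        """
        rng = np.random.default_rng(seed)
        labels = list(templates)
        # Each particle hypothesises a gesture class and a phase within its template.
        cls = rng.integers(len(labels), size=n_particles)
        phase = np.zeros(n_particles, dtype=int)

        for obs in frames:
            # Predict: advance every particle's phase with a little random jitter.
            phase = phase + 1 + rng.integers(-1, 2, size=n_particles)
            # Measure: Gaussian likelihood of the observation against the
            # template point each particle currently points at.
            weights = np.empty(n_particles)
            for i in range(n_particles):
                tmpl = templates[labels[cls[i]]]
                t = int(np.clip(phase[i], 0, len(tmpl) - 1))
                diff = obs - tmpl[t]
                weights[i] = np.exp(-0.5 * float(diff @ diff) / sigma ** 2)
            weights += 1e-300          # guard against total numerical underflow
            weights /= weights.sum()
            # Resample: particles consistent with the observed sequence survive.
            idx = rng.choice(n_particles, size=n_particles, p=weights)
            cls, phase = cls[idx], phase[idx]

        counts = np.bincount(cls, minlength=len(labels))
        return labels[int(np.argmax(counts))]

    In a scheme like this, the multimodal combination the paper experiments with would show up in how the per-frame feature vector and the class templates are constructed (for example, whether expression scores are concatenated with hand-motion features or fused at the score level); those combination techniques are the subject of the reported experiments.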

    Original language: English
    Title of host publication: Proceedings of the 12th IAPR Conference on Machine Vision Applications, MVA 2011
    Pages: 446-449
    Number of pages: 4
    ISBN (Print): 9784901122115
    Publication status: Published - 2011
    Event: 12th IAPR Conference on Machine Vision Applications, MVA 2011 - Nara
    Duration: 2011 Jun 13 - 2011 Jun 15

    Other

    Other: 12th IAPR Conference on Machine Vision Applications, MVA 2011
    City: Nara
    Period: 11/6/13 - 11/6/15

    ASJC Scopus subject areas

    • Computer Vision and Pattern Recognition

    Cite this

    Luo, D., Gao, H., Ekenel, H. K., & Ohya, J. (2011). Facial analysis aided human gesture recognition for Human Computer Interaction. In Proceedings of the 12th IAPR Conference on Machine Vision Applications, MVA 2011 (pp. 446-449)
