Human gesture analysis using multimodal features

Luo Dan, Hazim Kemal Ekenel, Ohya Jun

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Human gesture, as a natural interface, plays a vital role in achieving intelligent Human-Computer Interaction (HCI). Human gestures combine several visual components, such as hand motion, facial expression, and torso movement, to convey meaning. To date, most work on gesture recognition has focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework that combines different groups of features, such as facial-expression features and hand-motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative, and positive meanings, drawn from American Sign Language (ASL). We combine the features at two levels using two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting the different feature groups, and Partial Least Squares (PLS) is used to select the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy operates at the decision level: weighted decisions from the single modalities are fused at a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with both combination techniques. Experimental results show that facial analysis improves hand gesture recognition and that decision-level fusion performs better than feature-level fusion.
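The two fusion strategies described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the feature dimensions, modality weights, and class scores below are invented placeholders, and the paper's PLS projection and condensation-based classifier are omitted for brevity.

```python
import numpy as np

# Hypothetical per-frame feature vectors (dimensions are illustrative only).
face_feat = np.array([0.2, 0.8, 0.5])   # facial-expression features
hand_feat = np.array([0.9, 0.1])        # hand-motion features

# Assumed modality weights (the paper learns/weights these; values here are arbitrary).
w_face, w_hand = 0.4, 0.6

# --- Feature-level (early) fusion: weight each feature group, then concatenate.
# In the paper, the concatenated vector would then be projected by PLS onto a
# discriminative expression space before classification.
early_fused = np.concatenate([w_face * face_feat, w_hand * hand_feat])

# --- Decision-level (late) fusion: each single-modality classifier produces
# per-class scores; the weighted scores are combined and the arg-max class wins.
face_scores = np.array([0.1, 0.7, 0.2])  # scores from the face-only classifier
hand_scores = np.array([0.3, 0.4, 0.3])  # scores from the hand-only classifier
fused_scores = w_face * face_scores + w_hand * hand_scores
predicted_class = int(np.argmax(fused_scores))  # → class 1 for these scores
```

Note the structural difference: early fusion yields a single longer feature vector for one classifier, while late fusion keeps the modalities separate until their class scores are combined.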

Original language: English
Title of host publication: Proceedings of the 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012
Pages: 471-476
Number of pages: 6
DOI: 10.1109/ICMEW.2012.88
Publication status: Published - 2012 Oct 4
Event: 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012 - Melbourne, VIC, Australia
Duration: 2012 Jul 9 - 2012 Jul 13

Publication series

Name: Proceedings of the 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012

Conference

Conference: 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012
Country: Australia
City: Melbourne, VIC
Period: 2012 Jul 9 - 2012 Jul 13

Keywords

  • Condensation Algorithm
  • Facial Expression
  • Gesture Recognition

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction


  • Cite this

    Dan, L., Ekenel, H. K., & Jun, O. (2012). Human gesture analysis using multimodal features. In Proceedings of the 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012 (pp. 471-476). [6266429] (Proceedings of the 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012). https://doi.org/10.1109/ICMEW.2012.88