TY - GEN
T1 - Automatic discrimination of laughter using distributed sEMG
AU - Cosentino, S.
AU - Sessa, S.
AU - Kong, W.
AU - Zhang, D.
AU - Takanishi, A.
AU - Bianchi-Berthouze, N.
N1 - Funding Information:
This research was supported by JSPS Grants-in-Aid for Young Scientists (Wakate B) [25750259] and [15K21437], and by MEXT, Japan.
Publisher Copyright:
© 2015 IEEE.
PY - 2015/12/2
Y1 - 2015/12/2
AB - Laughter is a particularly interesting non-verbal human vocalization. It is classified as a semi-voluntary behavior despite being a direct form of social interaction, and it can be elicited by a wide variety of stimuli, both cognitive and physical. Automatic laughter detection, analysis, and classification would boost progress in affective computing, leading to more natural human-machine communication interfaces. Surface electromyography (sEMG) on the abdominal muscles or invasive EMG on the larynx shows potential in this direction, but such EMG-based sensing systems cannot be used in ecological settings because of their size, lack of reusability, and uncomfortable setup. For this reason, they cannot easily support the natural detection and measurement of a volatile social behavior like laughter across different situations. We propose the use of miniaturized, wireless, dry-electrode sEMG sensors on the neck for the detection and analysis of laughter. Although this solution cannot precisely measure the activation of specific larynx muscles, it can detect distinct EMG patterns related to larynx function. In addition, integrating sEMG analysis into a compact multisensory system positioned on the neck would improve the robustness of the overall sensing system, enabling synchronized measurement of different characteristics of laughter, such as vocal production, head movement, and facial expression, while remaining less intrusive, as the neck is normally more accessible than the abdominal muscles. In this paper, we report the laughter discrimination rates obtained with our system under different conditions.
KW - EMG
KW - affective computing
KW - electromyography
KW - laughter
KW - laughter computing
UR - http://www.scopus.com/inward/record.url?scp=84964047130&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84964047130&partnerID=8YFLogxK
DO - 10.1109/ACII.2015.7344644
M3 - Conference contribution
AN - SCOPUS:84964047130
T3 - 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015
SP - 691
EP - 697
BT - 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015
Y2 - 21 September 2015 through 24 September 2015
ER -