Brain response to a humanoid robot in areas implicated in the perception of human emotional gestures

Thierry Chaminade, Massimiliano Zecca, Sarah Jayne Blakemore, Atsuo Takanishi, Chris D. Frith, Silvestro Micera, Paolo Dario, Giacomo Rizzolatti, Vittorio Gallese, Maria Alessandra Umiltà

    Research output: Contribution to journal › Article

    60 Citations (Scopus)

    Abstract

    Background: The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might use different neural processes to do so than those used for reading emotions in human agents. Methodology: Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expressions of anger, joy, and disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted. Principal Findings: Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, such as the left Broca's area for the perception of speech, and in areas involved in the processing of emotions, such as the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, was reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased the response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance. Conclusions: Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions. Significance: Artificial agents can be used to assess how factors like anthropomorphism affect the neural response to the perception of human actions.

    Original language: English
    Article number: e11577
    Journal: PLoS One
    Volume: 5
    Issue number: 7
    DOI: 10.1371/journal.pone.0011577
    ISSN: 1932-6203
    Publisher: Public Library of Science
    PubMed ID: 20657777
    Publication status: Published - 2010

    Fingerprint

    Gestures, Robots, Emotions, Brain, Facial Expression, Anger, Human-robot interaction, Prefrontal Cortex, Processing, Mirrors, Display devices, Temporal Lobe, Reading, Cortex, Magnetic Resonance Imaging

    ASJC Scopus subject areas

    • Agricultural and Biological Sciences (all)
    • Biochemistry, Genetics and Molecular Biology (all)
    • Medicine (all)

    Cite this

    Chaminade, T., Zecca, M., Blakemore, S. J., Takanishi, A., Frith, C. D., Micera, S., ... Umiltà, M. A. (2010). Brain response to a humanoid robot in areas implicated in the perception of human emotional gestures. PLoS One, 5(7), [e11577]. https://doi.org/10.1371/journal.pone.0011577
