Allocentric emotional affordances in HRI: The multimodal binding

Jordi Vallverdú, Gabriele Trovato, Lorenzo Jamone

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

The concept of affordance perception is one of the distinctive traits of human cognition, and its application to robots can dramatically improve the quality of human-robot interaction (HRI). In this paper we explore and discuss the idea of “emotional affordances” by proposing a viable model for implementation into HRI, one that considers allocentric and multimodal perception. We consider “two-way” affordances: a perceived object triggering an emotion, and a perceived human emotional expression triggering an action. To keep the implementation generic, the proposed model includes a library that can be customised depending on the specific robot and application scenario. We present the AAA (Affordance-Appraisal-Arousal) model, which incorporates Plutchik’s Wheel of Emotions, and we outline some numerical examples of how it can be used in different scenarios.
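To make the abstract’s notion of a customisable affordance library and its two-way mappings more concrete, the following is a minimal illustrative sketch (not the authors’ implementation). All names, example entries, and the structure of the library are hypothetical; only the two affordance directions and the use of Plutchik’s eight basic emotions as labels come from the abstract.

# Minimal sketch of a customisable "emotional affordance" library,
# covering the two directions described in the abstract:
#   1) perceived object -> triggered emotion
#   2) perceived human emotion -> triggered robot action
# All identifiers and example entries are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, Optional

# Plutchik's eight basic emotions, used here only as labels.
PLUTCHIK_BASIC_EMOTIONS = (
    "joy", "trust", "fear", "surprise",
    "sadness", "disgust", "anger", "anticipation",
)


@dataclass
class EmotionalAffordanceLibrary:
    """Scenario-specific mappings that can be customised per robot."""
    object_to_emotion: Dict[str, str] = field(default_factory=dict)
    emotion_to_action: Dict[str, str] = field(default_factory=dict)

    def register_object(self, obj: str, emotion: str) -> None:
        # Direction 1: associate a perceivable object with an emotion.
        if emotion not in PLUTCHIK_BASIC_EMOTIONS:
            raise ValueError(f"unknown emotion label: {emotion}")
        self.object_to_emotion[obj] = emotion

    def register_reaction(self, emotion: str, action: str) -> None:
        # Direction 2: associate a perceived human emotion with a robot action.
        if emotion not in PLUTCHIK_BASIC_EMOTIONS:
            raise ValueError(f"unknown emotion label: {emotion}")
        self.emotion_to_action[emotion] = action

    def emotion_for_object(self, obj: str) -> Optional[str]:
        return self.object_to_emotion.get(obj)

    def action_for_emotion(self, emotion: str) -> Optional[str]:
        return self.emotion_to_action.get(emotion)


if __name__ == "__main__":
    lib = EmotionalAffordanceLibrary()
    # Hypothetical customisation for one application scenario.
    lib.register_object("sharp_knife", "fear")
    lib.register_reaction("sadness", "approach_and_comfort")

    print(lib.emotion_for_object("sharp_knife"))   # -> fear
    print(lib.action_for_emotion("sadness"))       # -> approach_and_comfort

In this sketch the library is just two lookup tables; the appraisal and arousal stages of the AAA model described in the paper would sit on top of such mappings and are not represented here.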

Original language: English
Article number: 78
Journal: Multimodal Technologies and Interaction
Volume: 2
Issue number: 4
DOIs
Publication status: Published - Dec 2018

Keywords

  • Affordance
  • Allocentric
  • Emotion
  • Empathy
  • HRI
  • Libraries
  • Multimodal

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Human-Computer Interaction
  • Neuroscience (miscellaneous)
