The need to study facial expression recognition in a more ecological manner calls for multimodal, interactive experimental stimuli, while reproducibility of results and controlled conditions remain mandatory. An embodied conversational agent (ECA) is a pertinent framework that meets all these requirements. The VIB (Virtual Interactive Behavior) platform is a SAIBA-compliant system that supports the real-time generation of multimodal behavior for interacting with socio-emotional virtual agents. We developed a new feature for this platform, VIB-Ex, which presents facial expressions in real time, records the user's reaction times and interactions, and exports the data for statistical analysis. In this paper, we present a proof-of-concept study in which a 3D male virtual character conveyed joyful or sad facial expressions while pronouncing joyful or sad words, either congruent or incongruent with its facial expression, in order to trigger an emotional Stroop effect. Only 12 adults were needed to obtain an emotional Stroop effect with our virtual agent. The results confirm that VIB-Ex can replicate a robust psychological effect concerning the recognition of facial expressions, and that it is a suitable and pertinent tool for experiments on the automatic processing of facial expression recognition in humans. Finally, we discuss possible future research with VIB-Ex to carry out other types of experiments in the field of social cognition.
ASJC Scopus subject areas
- Experimental and Cognitive Psychology
- Cognitive Neuroscience
- Artificial Intelligence