Successful social robot services depend on how well robots interact with users. Effective service is achieved through smooth, engaged, human-like interactions in which the robot reacts appropriately to a user's affective state. This paper proposes a novel Automatic Cognitive Empathy Model (ACEM) for humanoid robots that fosters longer and more engaged human-robot interaction (HRI) by recognizing a user's emotions and responding to them appropriately. The proposed model continuously detects the user's affective state from facial expressions and generates the desired empathic behavior, either parallel or reactive, adapted to the user's personality. Affective states are detected with a stacked autoencoder network trained and tested on the RAVDESS dataset.
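The paper does not give the network's architecture details, but the general technique it names, a stacked autoencoder with greedy layer-wise pretraining feeding a classifier, can be sketched as follows. All sizes, learning rates, and the random input standing in for facial-expression features are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class AutoencoderLayer:
    """One layer of the stack: learns to reconstruct its own input,
    with tied encoder/decoder weights (W and W.T)."""

    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b = np.zeros(n_hidden)      # encoder bias
        self.b_out = np.zeros(n_in)      # decoder bias

    def encode(self, x):
        return sigmoid(x @ self.W + self.b)

    def decode(self, h):
        return sigmoid(h @ self.W.T + self.b_out)

    def train_step(self, x, lr=0.1):
        # One gradient step on mean squared reconstruction error.
        h = self.encode(x)
        r = self.decode(h)
        err = r - x
        grad_out = err * r * (1 - r)                 # decoder pre-activation grad
        grad_h = (grad_out @ self.W) * h * (1 - h)   # encoder pre-activation grad
        # W is tied, so it receives gradients from both paths.
        self.W -= lr * (x.T @ grad_h + grad_out.T @ h)
        self.b -= lr * grad_h.sum(axis=0)
        self.b_out -= lr * grad_out.sum(axis=0)
        return float((err ** 2).mean())

# Greedy layer-wise pretraining: each layer learns to reconstruct the
# codes produced by the layer below it.
X = rng.random((64, 48))   # 64 hypothetical facial-feature vectors, 48-dim
layers = [AutoencoderLayer(48, 24), AutoencoderLayer(24, 12)]
inp = X
for layer in layers:
    for _ in range(50):
        layer.train_step(inp)
    inp = layer.encode(inp)

codes = inp  # 12-dim codes that would feed an emotion-classifier head
print(codes.shape)
```

In the full model these low-dimensional codes would be passed to a supervised output layer trained on RAVDESS emotion labels; the sketch stops at the unsupervised pretraining stage.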
The overall empathic model is verified in an experiment in which different emotions are triggered in participants and empathic behaviors are then applied according to the proposed hypotheses. The results confirm the effectiveness of the model in terms of the social and friendship concepts that participants perceived while interacting with the robot.
Original language: English
Journal: ACM Transactions on Interactive Intelligent Systems
Publication status: Accepted/In press - 8 Mar 2020

Research areas

  • Empathy, Non-verbal Behavior, Adaptive Interaction, Facial Emotion Detection, Social Robots, Human-Robot Interaction (HRI)
