Gesturing is an important modality in human–robot interaction. To date, gestures are often implemented for a specific robot configuration and are therefore not easily transferable to other robots. To address this issue, we previously presented a generic method for calculating gestures for social robots. The method was designed to work in two modes, allowing the calculation of different types of gestures. In this paper, we present new developments of the method. We discuss how the two working modes can be combined to generate blended emotional expressions and deictic gestures. In certain situations, it is desirable to express an emotional state through an ongoing functional behavior. We therefore implemented the possibility of modulating a pointing or reaching gesture into an affective gesture by influencing the motion speed and the amplitude of the posture. The new implementations were validated on virtual models with different configurations, including those of the robots NAO and Justin.
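The abstract does not specify the modulation law; a minimal sketch of the general idea, assuming a simple linear scaling of joint-angle amplitude about a neutral posture and a uniform time scaling for speed (the function name, parameters, and scaling rule are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def modulate_gesture(trajectory, times, amplitude_scale, speed_scale, neutral=None):
    """Turn a functional gesture into an affective variant by scaling
    its amplitude and speed (hypothetical linear-modulation sketch).

    trajectory      : (T, J) array of joint angles over T keyframes
    times           : (T,) array of keyframe timestamps in seconds
    amplitude_scale : >1 exaggerates the posture, <1 attenuates it
    speed_scale     : >1 plays the gesture faster, <1 slower
    neutral         : (J,) reference posture; defaults to the first keyframe
    """
    traj = np.asarray(trajectory, dtype=float)
    t = np.asarray(times, dtype=float)
    if neutral is None:
        neutral = traj[0]
    # Scale deviations from the neutral posture to change expressiveness.
    modulated = neutral + amplitude_scale * (traj - neutral)
    # Compress or stretch the timing to change motion speed.
    new_times = t / speed_scale
    return modulated, new_times
```

For example, an "excited" pointing gesture might use `amplitude_scale=1.3, speed_scale=1.5`, while a "sad" one might use `amplitude_scale=0.7, speed_scale=0.5`; the endpoint of a deictic gesture would additionally need to be preserved, which this sketch does not enforce.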

Original language: English
Pages (from-to): 569-580
Number of pages: 12
Journal: Autonomous Robots
Volume: 42
Issue number: 3
DOIs
Publication status: Published - 1 Mar 2018

Research areas

  • Affective gesture, Generic gesture system, Gestures, Pointing, Upper body postures
