One of the main aims of current social robotics research is to improve robots' abilities to interact with humans. In order to achieve an interaction similar to that among humans, robots should be able to communicate in an intuitive and natural way and to appropriately interpret human affect during social interactions. Just as humans recognize emotions in one another, machines can extract information from the various channels through which humans convey emotions, including facial expression, speech, gesture and text, and use this information to improve human-computer interaction. This is the domain of Affective Computing, an interdisciplinary field spanning areas such as psychology and cognitive science, concerned with the research and development of systems that can recognize and interpret human affect. Embedding these emotional capabilities in humanoid robots is the foundation of the concept of Affective Robots: robots able to sense the user's current mood and personality traits and to adapt their behavior in the most appropriate manner. In this paper, the emotion recognition capabilities of the humanoid robot Pepper are experimentally explored based on facial expressions of the so-called basic emotions, and its performance is compared with other state-of-the-art approaches on both expression databases compiled in academic environments and real subjects showing posed expressions as well as spontaneous emotional reactions. The results show that detection accuracy differs substantially among the evaluated approaches. The introduced experiments offer a general structure and approach for conducting such experimental evaluations, and the paper further suggests that the most meaningful results are obtained with real subjects expressing the emotions as spontaneous reactions.
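
The paper's own evaluation code is not reproduced in this record. As a minimal illustrative sketch only (not the authors' implementation), the snippet below shows how overall and per-emotion recognition accuracy could be tallied when comparing an approach's predictions against ground-truth labels for the basic emotions; all names, such as per_emotion_accuracy and pepper_preds, are hypothetical.

    from collections import Counter

    # The six "basic" emotions commonly used in facial-expression studies.
    BASIC_EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

    def per_emotion_accuracy(ground_truth, predictions):
        """Return overall accuracy and a per-emotion accuracy breakdown.

        ground_truth, predictions: equally long lists of emotion labels,
        e.g. one entry per test image or per recorded subject reaction.
        """
        assert len(ground_truth) == len(predictions), "label lists must align"
        correct = Counter()
        total = Counter()
        for truth, pred in zip(ground_truth, predictions):
            total[truth] += 1
            if truth == pred:
                correct[truth] += 1
        overall = sum(correct.values()) / len(ground_truth)
        # Emotions absent from the test set are reported as None.
        per_emotion = {e: (correct[e] / total[e] if total[e] else None)
                       for e in BASIC_EMOTIONS}
        return overall, per_emotion

    # Hypothetical example: one recognizer's predictions on a small labelled set.
    truth = ["happiness", "anger", "fear", "happiness", "surprise"]
    pepper_preds = ["happiness", "anger", "surprise", "happiness", "surprise"]
    overall, breakdown = per_emotion_accuracy(truth, pepper_preds)
    print(f"overall accuracy: {overall:.2f}")  # 0.80
    print(breakdown)

Running the same tally for each evaluated approach, separately on posed and spontaneous material, is one simple way to make the reported accuracy differences comparable.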





    Title :

    Affective Robots: Evaluation of Automatic Emotion Recognition Approaches on a Humanoid Robot towards Emotionally Intelligent Machines


    Contributors:

    Publication date :

    2018-04-04


    Remarks:

    oai:zenodo.org:1316752
    International Journal of Mechanical, Industrial and Aerospace Sciences 11(6)



    Type of media :

    Article (Journal)


    Type of material :

    Electronic Resource


    Language :

    English



    Classification :

    DDC:    629





    Context-Specific Affective and Cognitive Responses to Humanoid Robots

    Jung, Yoonhyuk / Cho, Eunae | BASE | 2018


    Adaptive facial point detection and emotion recognition for a humanoid robot

    Zhang, Li / Mistry, Kamlesh / Jiang, Ming et al. | British Library Online Contents | 2015




    Perception of emotion and emotional intensity in humanoid robots gait

    Destephe, Matthieu / Henning, Andreas / Zecca, Massimiliano et al. | IEEE | 2013