Previous studies of robots used in learning environments suggest that interaction between learner and robot can enhance the learning process by improving the learner's engagement. Moreover, intelligent robots can adapt their behavior during a learning session according to certain criteria, resulting in increased cognitive learning gains. Motivated by these results, we propose a novel human-robot interaction framework in which the robot adjusts its behavior to the affect state of the learner. Our framework uses the theory of flow to label different affect states (i.e., engagement, boredom, and frustration) and to adapt the robot's actions. Based on the automatic recognition of these states through visual cues, our method adapts the learning actions being performed by the robot at that moment, keeping the learner engaged in the learning process most of the time. To recognize the affect state of the user, a two-step approach is followed: we first recognize the facial expressions of the learner and then map them to an affect state. Our algorithm performs well even in noisy environments, such as when more than one person is present and/or the face is partially occluded.
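
The abstract outlines a two-step pipeline: recognize the learner's facial expression from visual cues, map it to a flow-theory affect state, and adapt the robot's next action accordingly. The following is a minimal Python sketch of that adaptation loop; the expression labels, the expression-to-affect mapping, and the single scalar difficulty knob are all illustrative assumptions, not details taken from the paper.

```python
from enum import Enum

class Affect(Enum):
    ENGAGEMENT = "engagement"
    BOREDOM = "boredom"
    FRUSTRATION = "frustration"

# Hypothetical mapping from recognized facial expressions to affect states.
# The paper's actual mapping is not specified here; these pairs are illustrative.
EXPRESSION_TO_AFFECT = {
    "concentration": Affect.ENGAGEMENT,
    "smile": Affect.ENGAGEMENT,
    "yawn": Affect.BOREDOM,
    "gaze_away": Affect.BOREDOM,
    "frown": Affect.FRUSTRATION,
    "grimace": Affect.FRUSTRATION,
}

def adapt_difficulty(affect: Affect, difficulty: int) -> int:
    """Flow-theory heuristic: raise the challenge when the learner is bored,
    lower it when frustrated, keep it unchanged when engaged."""
    if affect is Affect.BOREDOM:
        return min(difficulty + 1, 10)
    if affect is Affect.FRUSTRATION:
        return max(difficulty - 1, 1)
    return difficulty

# Usage: one step of the interaction loop, given an expression label
# that a (hypothetical) facial-expression recognizer has produced.
difficulty = 5
expression = "yawn"
affect = EXPRESSION_TO_AFFECT.get(expression, Affect.ENGAGEMENT)
difficulty = adapt_difficulty(affect, difficulty)
print(affect.value, difficulty)  # boredom 6
```
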


    Title:

    Affect state recognition for adaptive human robot interaction in learning environments


    Contributors:

    Publication date:

    2017-08-31



    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English



    Classification:

    DDC: 629