Human-Robot Interaction (HRI) is the study of interactions between humans and robots; it involves several disciplines, including computer science, engineering, the social sciences, and psychology. The perceptual challenges in HRI are particularly complex because of the need to perceive, understand, and react to human activity in real time. Two key aspects of perception are multimodality and attention. Multimodality allows humans to move seamlessly between different modes of interaction, from vision to voice to touch, according to changes in context or user preference, while attention is the cognitive process of selectively concentrating on one aspect of the environment while ignoring others. Both play a fundamental role in HRI as well: multimodality allows a robot to interpret and react to a variety of human stimuli (e.g., gestures, speech, eye gaze), while implementing attentional models in the robot's behavior control allows it to save computational resources and react in real time by selectively processing the salient perceived stimuli.

This thesis presents novel methods for human gesture recognition, including pointing gestures, which are fundamental when interacting with mobile robots, and a speech-driven mechanism for regulating robot attention. In the context of continuous gesture recognition, the aim is to provide a system that can be trained online with few samples and can cope with intra-user variability in gesture execution. The proposed approach relies on the generation of an ad hoc Hidden Markov Model (HMM) for each gesture, exploiting a direct estimation of its parameters; each model represents the best prototype candidate from the associated gesture training set. The generated models are then employed within a continuous recognition process that provides the probability of each gesture at each step. A computational method for pointing gesture recognition is also presented, based on the combination of a geometrical solution and a machine learning solution. Once the gesture recognition models are described, a human-robot interaction system that exploits emotion and attention to regulate and adapt the robot's interactive behavior is proposed. In particular, the system focuses on the relation between arousal, predictability, and attentional allocation, considering as a case study a robotic manipulator interacting with a human operator.
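The continuous recognition process described above can be illustrated with a minimal sketch: one discrete-observation HMM per gesture, advanced step by step with the forward recursion, with the per-gesture likelihoods normalised into a probability for each gesture at every step. All names, parameter values, and the discrete observation alphabet below are hypothetical, and the thesis's direct estimation of the HMM parameters from a few training samples is not reproduced.

    import numpy as np

    class GestureHMM:
        """Minimal discrete-observation HMM for one gesture prototype
        (parameters assumed already estimated from the training set)."""

        def __init__(self, trans, emit, init):
            self.trans = np.asarray(trans, float)  # state-transition matrix A
            self.emit = np.asarray(emit, float)    # emission matrix B (state x symbol)
            self.init = np.asarray(init, float)    # initial state distribution pi
            self.alpha = None                      # running forward probabilities

        def reset(self):
            self.alpha = None

        def step(self, symbol):
            # One step of the forward recursion; returns the likelihood of
            # the observation sequence seen so far under this model.
            if self.alpha is None:
                self.alpha = self.init * self.emit[:, symbol]
            else:
                self.alpha = (self.alpha @ self.trans) * self.emit[:, symbol]
            return self.alpha.sum()

    def continuous_recognition(models, stream):
        """Yield, at every time step, the probability of each gesture
        given the observations so far (normalised forward likelihoods)."""
        for m in models.values():
            m.reset()
        for symbol in stream:
            scores = {name: m.step(symbol) for name, m in models.items()}
            total = sum(scores.values()) or 1.0
            yield {name: s / total for name, s in scores.items()}

    # Two toy two-state models over a three-symbol alphabet (made-up values).
    wave = GestureHMM([[0.7, 0.3], [0.0, 1.0]],
                      [[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]], [1.0, 0.0])
    point = GestureHMM([[0.7, 0.3], [0.0, 1.0]],
                       [[0.1, 0.8, 0.1], [0.1, 0.1, 0.8]], [1.0, 0.0])
    for probs in continuous_recognition({"wave": wave, "point": point}, [0, 0, 2]):
        print(probs)

The geometrical half of the pointing-gesture method can likewise be sketched as a ray-plane intersection, a common baseline in which the line from shoulder to hand is extended to the floor. The thesis's actual geometric formulation and its machine learning component are not detailed in the abstract, so this is only an assumed illustration.

    def pointing_target_on_floor(shoulder, hand, floor_z=0.0):
        """Intersect the shoulder-to-hand ray with the plane z = floor_z;
        returns None when the arm does not point downward."""
        shoulder = np.asarray(shoulder, float)
        hand = np.asarray(hand, float)
        direction = hand - shoulder
        if direction[2] >= 0.0:
            return None
        t = (floor_z - shoulder[2]) / direction[2]
        return shoulder + t * direction

    # Shoulder at 1.4 m, hand at 1.0 m height and 0.4 m forward (made-up pose):
    print(pointing_target_on_floor([0.0, 0.0, 1.4], [0.4, 0.0, 1.0]))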


    Title:

    Human Gesture Recognition and Robot Attentional Regulation for Human-Robot Interaction


    Contributors:

    Iengo, Salvatore
    Date of publication:

    2014-03-31


    Notes:

    Iengo, Salvatore (2014) Human Gesture Recognition and Robot Attentional Regulation for Human-Robot Interaction. [Doctoral thesis]


    Media type:

    Thesis


    Format:

    Electronic resource


    Language:

    Italian, English


    Classification:

    DDC: 629



    Similar titles:

    Human-Robot Interaction: An Arm Gesture-Based Approach

    Venturi, Sai Siri Sree / Bojja, Poorna Teja | BASE | 2023

    Free access

    A Gesture Based Interface for Human-Robot Interaction

    Waldherr, S. / Romero, R. / Thrun, S. | British Library Online Contents | 2001


    Human-Robot Interaction Through Gesture-Free Spoken Dialogue

    Kulyukin, V. | British Library Online Contents | 2004


    Arm Gesture Generation of Humanoid Robot Mybot-KSR for Human Robot Interaction

    Lee, Seung-Jae / Jung, Chang-Young / Yoo, Bum-Soo et al. | Springer Verlag | 2013


    Arm Gesture Generation of Humanoid Robot Mybot-KSR for Human Robot Interaction

    Lee, S.-J. / Jung, C.-Y. / Yoo, B.-S. et al. | British Library Conference Proceedings | 2013