Abstract: Human gaze reveals where attention is directed while a task is being completed. Eye-tracking glasses provide the human gaze in real time and allow humans to communicate with robots through their eyes. This gives robots the possibility to interpret human intention from gaze information and to react to it. Gaze-based intention can further help robots understand human actions and tasks, so that they can assist humans in completing them. This thesis focuses on gaze-based Human-Robot Interaction (HRI). First, we address the problem of predicting human visual intention and recognizing gaze gestures for application in HRI scenarios. Additionally, we use gaze as an additional modality to recognize human actions and develop an approach to predict which task or activity a human is performing. Furthermore, we consider one of the safety issues in HRI: we introduce a method based on artificial intelligence for avoiding human hands when a human is working closely around a robot. It allows the robot to move away from human hands when they come too close, while maintaining the efficiency of the robotic task.
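As a rough illustration of the gaze-based intention idea described in the abstract (not the method developed in the thesis), the following Python sketch accumulates gaze dwell time over known object bounding boxes and reports an object as the intended target once a dwell threshold is exceeded. All names, the dwell threshold, and the 60 Hz sampling rate are hypothetical assumptions for the example.

    # Minimal illustrative sketch (hypothetical, not the thesis's approach):
    # infer which object a user intends by accumulating gaze dwell time
    # inside known object bounding boxes from eye-tracking glasses.
    from dataclasses import dataclass

    @dataclass
    class Box:
        name: str
        x_min: float
        y_min: float
        x_max: float
        y_max: float

        def contains(self, x: float, y: float) -> bool:
            # True if the gaze point (x, y) falls inside this bounding box.
            return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    def detect_intention(gaze_points, boxes, sample_dt=1 / 60, dwell_threshold=0.8):
        """Return the first object whose accumulated dwell time (seconds) exceeds the threshold."""
        dwell = {box.name: 0.0 for box in boxes}
        for x, y in gaze_points:
            for box in boxes:
                if box.contains(x, y):
                    dwell[box.name] += sample_dt
                    if dwell[box.name] >= dwell_threshold:
                        return box.name
        return None

    if __name__ == "__main__":
        objects = [Box("cup", 100, 100, 200, 200), Box("screwdriver", 300, 120, 380, 220)]
        # Synthetic 60 Hz gaze samples fixating on the cup.
        samples = [(150 + i % 3, 150 + i % 2) for i in range(60)]
        print(detect_intention(samples, objects))  # -> "cup"

Dwell time is only one simple heuristic; the thesis itself pursues deep-learning-based visual intention classification and the GazeEMD approach listed among the related items below.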


    Title: Gaze-based human-robot interaction and task learning

    Publication date: 2022-01-01

    Media type: Thesis

    Format: Electronic resource

    Language: English

    Classification: DDC 629



    Gaze detection in human-robot interaction
    Alanenpää, Madelene | BASE | 2020
    Free access

    Visual intention classification by deep learning for gaze-based human-robot interaction
    Shi, Lei / Copot, Cosmin / Vanlanduit, Steve | BASE | 2020
    Free access

    GazeEMD: detecting visual intention in gaze-based human-robot interaction
    Shi, Lei / Copot, Cosmin / Vanlanduit, Steve | BASE | 2021
    Free access

    Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation
    Yu, Mingxin / Lin, Yingzi / Schmidt, David et al. | BASE | 2014
    Free access