Fast and reliable communication between human workers and robotic assistants (RAs) is essential for successful collaboration between these agents. This is especially true in typically noisy manufacturing environments, which render verbal communication less effective. This thesis investigates the nonverbal communication capabilities of robotic manipulators that have poseable, three-fingered end-effectors (hands). The work explores the extent to which different poses of a typical robotic gripper can effectively communicate instructional messages during human-robot collaboration. Within the context of a collaborative car door assembly task, a series of three studies was conducted. Study 1 empirically explored the types of hand configurations that humans use to nonverbally instruct another person (N=17). Based on the findings from Study 1, Study 2 examined how well human gestures with frequently used hand configurations were understood by recipients of the message (N=140). Finally, Study 3 implemented the most readily recognized human hand configurations on a 7-degree-of-freedom (DOF) robotic manipulator to investigate the efficacy of human-inspired hand poses on a robotic hand compared to an unposed hand (N=100). Contributions of this work include a set of hand configurations humans commonly use to instruct another person in a collaborative assembly scenario, as well as Recognition Rate and Recognition Confidence measures for the gestures that humans and robots expressed using different hand configurations. These experimental results indicate that most gestures are better recognized, and with a higher level of confidence, when displayed with a posed robot hand. Based on these results, guidelines and principles are provided for the mechanical design of robotic hands. ; Applied Science, Faculty of; Mechanical Engineering, Department of; Graduate


    Title:

    Data-driven design of expressive robot hands and hand gestures : applications for collaborative human-robot interaction


    Contributors:

    Publication date:

    01.01.2017


    Media type:

    Thesis


    Format:

    Electronic resource


    Language:

    English


    Classification:

    DDC: 629




    Control of Robot with Hand Gestures

    Sinkevicius, V. / Fiodorova, O. / Kauno Technologijos universitetas | British Library Conference Proceedings | 2011


    Recognizing touch gestures for social human-robot interaction

    Altuğlu, Tuğçe Ballı / Altun, Kerem | BASE | 2015

    Open access

    Visual recognition of pointing gestures for human-robot interaction

    Nickel, K. / Stiefelhagen, R. | British Library Online Contents | 2007


    On the role of gestures in human-robot interaction

    Carfì, Alessandro | BASE | 2020

    Open access

    On the Design of Human-Robot Collaboration Gestures

    Shrinah, Anas / Bahraini, Masoud S. / Khan, Fahad et al. | ArXiv | 2024

    Open access