To help enable robot-augmented production, in which robots assist production staff to relieve them of repetitive and strenuous tasks, the purpose of this research is to enable real-time human-robot trust assessment by inferring decreases in human trust from signs of physical apprehension. To ensure safe and productive human-robot collaboration, we have to maintain an appropriate level of trust in the robot, as too much trust can lead to dangerous situations, whereas too little trust can lead to a loss in productivity. The main hypothesis is that if users experience a decrease in trust, they will increase their distance from the robot by stepping or leaning away from it. A series of experiments was performed using a Rethink Robotics Sawyer robot and an augmented-reality-enabled human-robot collaboration cell, in which projection was used to display task-critical information within the shared workspace. Participants performed repeated tasks with the robot, midway through which the robot would disrupt the participants' expectations in order to decrease their trust towards it. The participants' movements were recorded using an infrared camera for body tracking in order to correlate them with decreases in trust.
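
As an illustration of how such stepping or leaning away might be quantified from body-tracking output, the following minimal Python sketch compares a participant's distance from the robot and torso lean against a pre-disruption baseline. The joint names, the robot base position, and the thresholds are illustrative assumptions and are not taken from the thesis.

    # Illustrative sketch (not from the thesis): estimating physical apprehension
    # signals -- stepping or leaning away -- from tracked skeleton joints.
    # Joint names, the robot base position, and thresholds are assumptions.
    import numpy as np

    ROBOT_BASE = np.array([0.0, 0.0])          # robot base on the floor plane (metres), assumed
    STEP_AWAY_THRESHOLD = 0.10                 # distance increase (m) treated as stepping away
    LEAN_AWAY_THRESHOLD = np.deg2rad(5.0)      # torso tilt increase (rad) treated as leaning away

    def horizontal_distance(pelvis_xyz):
        """Distance from the robot base in the floor (x, y) plane."""
        return np.linalg.norm(np.asarray(pelvis_xyz[:2]) - ROBOT_BASE)

    def lean_angle(pelvis_xyz, neck_xyz):
        """Torso tilt from vertical, signed positive when leaning away from the robot."""
        torso = np.asarray(neck_xyz) - np.asarray(pelvis_xyz)
        tilt = np.arctan2(np.linalg.norm(torso[:2]), torso[2])   # angle from the vertical axis
        away = np.asarray(pelvis_xyz[:2]) - ROBOT_BASE           # direction pointing away from the robot
        sign = np.sign(np.dot(torso[:2], away)) or 1.0
        return sign * tilt

    def apprehension_signal(baseline_frames, current_frame):
        """Compare the current pose against a pre-disruption baseline.

        Each frame is a dict with 'pelvis' and 'neck' 3-D joint positions
        (e.g. from an infrared depth-camera body-tracking stream).
        Returns True if the participant stepped or leaned away beyond threshold.
        """
        base_dist = np.mean([horizontal_distance(f["pelvis"]) for f in baseline_frames])
        base_lean = np.mean([lean_angle(f["pelvis"], f["neck"]) for f in baseline_frames])

        dist_change = horizontal_distance(current_frame["pelvis"]) - base_dist
        lean_change = lean_angle(current_frame["pelvis"], current_frame["neck"]) - base_lean

        return dist_change > STEP_AWAY_THRESHOLD or lean_change > LEAN_AWAY_THRESHOLD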





    Title:

    Human-Robot Trust Assessment From Physical Apprehension Signals


    Contributors:
    Hald, Kasper (Author)

    Publication date:

    01.01.2021


    Notes:

    Hald, K 2021, Human-Robot Trust Assessment From Physical Apprehension Signals. Ph.d.-serien for Det Tekniske Fakultet for IT og Design, Aalborg Universitet, Aalborg Universitetsforlag.


    Media type:

    Book


    Format:

    Electronic resource


    Language:

    English



    Classification:

    DDC: 629



    Apprehension and Panic

    A. J. Bachrach / G. H. Egstrom | NTIS | 1977




    Object apprehension using vision and touch

    Bajcsy, R. / Stansfield, S. A. | NTRS | 1987


    Trust in Industrial Human–Robot Collaboration

    Charalambous, George / Fletcher, Sarah R. | Springer Verlag | 2021