When navigating in a shared environment, how effectively a robot uses signals to coordinate with human behavior can reduce dissatisfaction and increase acceptance. In this paper, we present an online video study investigating whether familiar acoustic signals can improve the legibility of a robot's navigation behavior. We collected responses from 120 participants to evaluate their perceptions of a robot that communicates with one of three non-verbal navigational cues: an acoustic signal, the acoustic signal paired with a visual signal, or an acoustic signal at a different frequency. Our results showed a significant improvement in legibility when the robot used both light and acoustic signals to communicate its intentions, compared to using the acoustic signal alone. Additionally, our findings highlighted that people perceived the robot's intentions differently when they were expressed through the two frequencies of the sound alone. The results of this work suggest a paradigm that can support the development of mobile service robots in public spaces.



    Title: Familiar Acoustic Cues for Legible Service Robots

    Contributors:

    Publication date: 2022-01-01

    Media type: Conference paper

    Format: Electronic resource

    Language: English

    Classification: DDC 629


    Policy regularization for legible behavior

    Persiani, Michele / Hellström, Thomas | BASE | 2023


    Legible Action Selection in Human-Robot Collaboration

    Zhu, Huaijiang / Gabler, Volker / Wollherr, Dirk | BASE | 2017



    A Highly Legible 6″ Multicolor Information LCD for Diagnostics and Navigation Uses

    Araki, Hiroshi / Tsubota, Hiroyoshi / Aoyama, Masao et al. | SAE Technical Papers | 1989