2015 ACM International Conference on Multimodal Interaction (ACM SIGCHI) -- NOV 09-13, 2015 -- Seattle, WA
ORCID : Altun, Kerem / 0000-0002-5493-8921
Web of Science : WOS:000380609500071

Abstract :
In this study, we performed touch gesture recognition on the two datasets provided by the "Recognition of Social Touch Gestures Challenge 2015". For the first dataset, dubbed Corpus of Social Touch (CoST), touch is performed on a mannequin arm, whereas for the second dataset (Human-Animal Affective Robot Touch, HAART) touch is performed in a human-pet interaction setting. CoST includes 14 gestures and HAART includes 7 gestures. We used the pressure data, image features, the Hurst exponent, Hjorth parameters, and autoregressive model coefficients as features, and performed feature selection using sequential forward floating search. We obtained classification results of around 60%-70% for the HAART dataset. For the CoST dataset, the results range from 26% to 95%, depending on the choice of training/test sets.
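The abstract names Hjorth parameters and autoregressive (AR) model coefficients among the features computed from the pressure data; the Hurst exponent, image features, and the SFFS-based feature selection are not sketched here. As a minimal, illustrative sketch only (assuming a single frame-averaged 1-D pressure trace; the function names, AR order, and synthetic signal are assumptions, not the paper's implementation), these two feature families could be computed as follows:

```python
import numpy as np

def hjorth_parameters(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)        # first difference (discrete derivative)
    ddx = np.diff(dx)      # second difference
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

def ar_coefficients(x, order=4):
    """Least-squares AR(order) coefficients: predict x[t] from x[t-1..t-order]."""
    x = np.asarray(x, dtype=float)
    # Lagged design matrix: column k holds the signal shifted by k samples.
    X = np.column_stack([x[order - k : len(x) - k] for k in range(1, order + 1)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Example on a synthetic stand-in for a frame-averaged pressure trace.
rng = np.random.default_rng(0)
pressure = np.cumsum(rng.normal(size=200))
print(hjorth_parameters(pressure))
print(ar_coefficients(pressure, order=4))
```

In the paper's setting, such per-sequence features would be concatenated with the other feature groups before sequential forward floating search selects a subset for classification.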


    Title :

    Recognizing touch gestures for social human-robot interaction



    Publication date :

    2015-01-01



    Type of media :

    Conference paper


    Type of material :

    Electronic Resource


    Language :

    English



    Classification :

    DDC : 004 / 629



    Similar titles :

    Social Touch in Human–Robot Interaction : Symbiotic touch interaction between human and robot

    Shiomi, Masahiro ; Sumioka, Hidenobu | GWLB - Gottfried Wilhelm Leibniz Bibliothek | 2024

    Free access



    Social Touch in Human-Robot Interaction : Symbiotic touch interaction between human and robot

    Shiomi, Masahiro ; Sumioka, Hidenobu | TIBKAT | 2024

    Free access

    Recognizing affect in human touch of a robot

    Altun, Kerem / MacLean, Karon E. | BASE | 2015

    Free access