The complexity and dexterity of the human hand make the development of natural and robust control of hand prostheses challenging. Although a large number of control approaches have been developed and investigated over the last decades, limited robustness in real-life conditions has often prevented their application in clinical settings and commercial products. In this paper, we investigate a multimodal approach that exploits eye-hand coordination to improve the control of myoelectric hand prostheses. The analyzed data are from the publicly available MeganePro Dataset 1, which includes multimodal data from transradial amputees and able-bodied subjects grasping numerous household objects with ten grasp types. A continuous grasp-type classifier based on surface electromyography served as both intent detector and classifier, while eye-hand coordination parameters, gaze data, and object recognition in first-person videos made it possible to identify the object a person aims to grasp. The results show that the inclusion of visual information significantly increases the average offline classification accuracy, by up to 15.61 ± 4.22% for the transradial amputees and by up to 7.37 ± 3.52% for the able-bodied subjects. This allows transradial amputees to reach an average classification accuracy comparable to that of intact subjects and suggests that the robustness of hand prosthesis control based on grasp-type recognition can be significantly improved by including visual information, extracted by leveraging natural eye-hand coordination behavior, without placing additional cognitive burden on the user.
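How the visual information could be combined with the sEMG classifier is not detailed in this record; below is a minimal sketch of one plausible fusion scheme, assuming the sEMG classifier outputs a probability per grasp type and the gaze-identified object supplies a prior over grasp types. All grasp names, object names, and probability values are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Hypothetical grasp types; the dataset uses ten, three shown here for brevity.
GRASP_TYPES = ["power sphere", "lateral", "precision disk"]

# Illustrative object-conditioned priors p(grasp | object): which grasps are
# plausible for the object the gaze data says the user is fixating.
# These values are assumptions, not taken from the paper.
OBJECT_GRASP_PRIOR = {
    "ball":  np.array([0.80, 0.05, 0.15]),
    "key":   np.array([0.05, 0.90, 0.05]),
    "plate": np.array([0.10, 0.10, 0.80]),
}

def fuse_grasp_probabilities(emg_probs: np.ndarray, gazed_object: str) -> np.ndarray:
    """Combine sEMG classifier output with a gaze-derived object prior.

    Simple Bayesian-style fusion: multiply the per-grasp probabilities
    elementwise and renormalize. If no object was identified, fall back
    to the sEMG probabilities alone.
    """
    prior = OBJECT_GRASP_PRIOR.get(gazed_object)
    if prior is None:
        return emg_probs
    fused = emg_probs * prior
    return fused / fused.sum()

# Usage: an ambiguous sEMG reading is disambiguated by the gazed object.
emg_probs = np.array([0.40, 0.35, 0.25])   # uncertain sEMG classification
fused = fuse_grasp_probabilities(emg_probs, "key")
print(GRASP_TYPES[int(np.argmax(fused))])  # -> "lateral"
```

In this sketch an ambiguous sEMG reading is resolved by the object the user is looking at, mirroring the paper's premise that gaze-derived object identity constrains the set of likely grasp types.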


    Title:

    Improving Robotic Hand Prosthesis Control With Eye Tracking and Computer Vision: A Multimodal Approach Based on the Visuomotor Behavior of Grasping


    Contributors:

    Publication date:

    2022-01-01


    Remarks:

    unige:165757
    ISSN: 2624-8212; Frontiers in Artificial Intelligence, vol. 4 (2022), 744476


    Type of media:

    Article (Journal)


    Type of material:

    Electronic Resource


    Language:

    English



    Classification:

    DDC: 629





    Similar items:

    Multimodal Perception for Robotic Grasping and Pouring

    Liang, Hongzhuo / Universität Hamburg, Fakultät für Mathematik, Informatik und Naturwissenschaften et al. | TIBKAT | 2022


    Soft-grasping with an anthropomorphic robotic hand using spiking neurons

    Tieck, J. Camilo Vasquez / Secker, Katharina / Kaiser, Jacques et al. | BASE | 2020


    Grasping and manipulation by robotic arm/multifingered-hand mechanisms

    Nagai, K. / Yoshikawa, T. | British Library Online Contents | 1995