In this paper, an upper limb rehabilitation robot based on virtual reality is developed to improve therapeutic efficacy. With the help of the robot, patients can use their impaired limbs to reach virtual items that appear randomly in a virtual environment. The kinematics of the robot is analyzed, and appearance rules are formulated to avoid mechanical interference between the arms. Both a passive mode and an active mode are introduced into the rehabilitation system so that it can be used in all clinical stages. Simulation results show that the upper limb rehabilitation robot can generate appropriate trajectories, according to motion planning methods based on the kinematics analysis, to improve therapeutic efficacy.
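
The abstract does not include the kinematic model or the appearance rules themselves; as a rough illustration of how randomly appearing virtual items might be restricted to a robot's reachable workspace, the sketch below uses a hypothetical two-link planar arm with assumed link lengths and a simple rejection-sampling rule. It is not the authors' method, only a minimal stand-in for the kind of kinematics-based check the abstract describes.

import math
import random

# Illustrative sketch only: a two-link planar arm with assumed link lengths;
# the paper's actual robot kinematics and appearance rules are not given here.
UPPER_ARM = 0.30   # assumed shoulder-to-elbow link length (m)
FOREARM = 0.25     # assumed elbow-to-hand link length (m)

def reachable(x, y, margin=0.02):
    """True if the target lies inside the arm's annular workspace,
    keeping a safety margin from the boundary (a stand-in for
    appearance rules that avoid mechanical interference)."""
    r = math.hypot(x, y)
    return abs(UPPER_ARM - FOREARM) + margin <= r <= UPPER_ARM + FOREARM - margin

def sample_target():
    """Randomly place a virtual item, rejecting unreachable points."""
    limit = UPPER_ARM + FOREARM
    while True:
        x = random.uniform(-limit, limit)
        y = random.uniform(0.0, limit)   # keep items in front of the patient
        if reachable(x, y):
            return x, y

def inverse_kinematics(x, y):
    """Elbow-down IK for the two-link model: shoulder and elbow angles (rad)."""
    c2 = (x * x + y * y - UPPER_ARM ** 2 - FOREARM ** 2) / (2 * UPPER_ARM * FOREARM)
    c2 = max(-1.0, min(1.0, c2))         # clamp numerical noise
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(FOREARM * math.sin(q2),
                                       UPPER_ARM + FOREARM * math.cos(q2))
    return q1, q2

if __name__ == "__main__":
    tx, ty = sample_target()
    q1, q2 = inverse_kinematics(tx, ty)
    print(f"virtual item at ({tx:.3f}, {ty:.3f}) m -> "
          f"shoulder {math.degrees(q1):.1f} deg, elbow {math.degrees(q2):.1f} deg")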

    Title:

    Rehabilitative motion planning for upper limb rehabilitation robot based on virtual reality


    Contributors:
    Zhang, Xiaojun (author) / Zhang, Jianhua (author) / Sun, Lingyu (author) / Li, Manhong (author)


    Publication date:

    2013-12-01


    Size:

    675703 bytes




    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English


    Similar titles:

    Research on motion capture of upper limb rehabilitation robot

    Chen, Lei / Lin, Mingxing / Deng, Quan et al. | IEEE | 2019


    Task-Based Trajectory Planning for an Exoskeleton Upper Limb Rehabilitation Robot

    Meng, Qiaoling / Shao, Haicun / Wang, Lulu et al. | Springer Verlag | 2018


    Kinematics Analysis and Trajectory Planning of Upper Limb Rehabilitation Robot

    Wang, Zhiming / Chang, Yanyan / Sui, Xueting | IEEE | 2017


    Design and Analysis of Lower Limb Rehabilitation Robot Based on Virtual Reality

    Shi, Xiaohua / Wang, Yajing / Liu, Ruifa et al. | Springer Verlag | 2023


    Upper limb Exoskeletons for Motor Rehabilitation using Virtual Reality: A Technological Review

    Huamanchahua, Deyby / Loayza-Bautista, Sebastian / Sanchez-Vilchez, Daniel et al. | IEEE | 2023