Within the work presented here, the demonstrator AMiCUS (Adaptive Head Motion Control for User-friendly Support) has been developed. AMiCUS senses the head motion of a tetraplegic user and uses it to control a robot arm in real time. First, a MEMS-based Attitude and Heading Reference System (AHRS) suitable for head control was selected. Next, a method was proposed for attaching the AHRS to the head of a potential user in order to measure head motion. A control structure was developed that enables intuitive real-time control of a robot arm in Cartesian space using the three input signals provided by head motion. Since the Range of Motion (ROM) of a potential user may be restricted, the control paradigm can be adapted to the individual user so that the full available ROM is exploited. Moreover, ways of generating control commands that integrate consistently into the existing control paradigm were investigated, so that the system can be switched on and off and switching operations can be performed using head motion alone. On this basis, the first version of the demonstrator, AMiCUS alpha v1.0, was realized. To allow safe and efficient operation, AMiCUS alpha v1.0 provides acoustic and visual feedback to the user. In addition, special attention was paid to keeping the complexity of the implemented algorithms low in order to save resources such as computing and battery power.

The strengths and weaknesses of the system were assessed in a user study with 25 subjects without motion limitations and 6 tetraplegics with severe motion limitations of the head, using both subjective and objective target quantities. The study showed that AMiCUS alpha v1.0 enables smooth, precise and efficient control of a robot arm for simple manipulation tasks. AMiCUS reliably distinguished between direct control signals and switching commands. Furthermore, no situation was observed that put operational safety at risk. Overall, user satisfaction was high. The remaining points of criticism were taken into account during the development of the follow-up version of the demonstrator, AMiCUS alpha v2.0.

One tetraplegic user evaluated AMiCUS alpha v2.0 in an exemplary case study and compared it with the previous version, AMiCUS alpha v1.0. The direct comparison showed that AMiCUS alpha v2.0 was perceived as an improvement over AMiCUS alpha v1.0. Most remaining problems, such as difficulties in mentally picturing gripper rotations, are likely to be resolved through learning effects. In a semi-realistic scenario, the final task of the tetraplegic user was to pour water from a bottle into a glass. As with the other tasks, she was able to complete this task independently. Overall, the results obtained with the demonstrator AMiCUS are promising. Further development into a stand-alone assistive system or into a complement to a semi-autonomous system is conceivable.
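
The abstract does not give implementation details of the control paradigm. As an illustration only, the sketch below shows one plausible way to map a single AHRS head angle (roll, pitch or yaw) to a Cartesian velocity component, with a dead zone around the neutral pose and scaling to the individually calibrated ROM. All names and parameters (head_to_velocity, dead_zone, v_max) are hypothetical and are not taken from the thesis.

    # Minimal sketch, not the thesis implementation: one head angle is
    # normalized to the user's calibrated ROM and mapped to a Cartesian
    # velocity component with a dead zone around the neutral pose.
    def head_to_velocity(angle_deg, neutral_deg, rom_deg,
                         dead_zone=0.1, v_max=0.05):
        """Map one head angle (degrees) to one velocity component (m/s)."""
        # Normalize deflection from the neutral pose to [-1, 1], using
        # the individually calibrated range of motion (ROM).
        x = (angle_deg - neutral_deg) / rom_deg
        x = max(-1.0, min(1.0, x))
        # A dead zone around neutral suppresses involuntary drift.
        if abs(x) < dead_zone:
            return 0.0
        # Ramp the command smoothly from 0 at the dead-zone edge to
        # v_max at full deflection.
        sign = 1.0 if x > 0.0 else -1.0
        return sign * v_max * (abs(x) - dead_zone) / (1.0 - dead_zone)

    # Example: a user with a restricted pitch ROM of +/-20 degrees still
    # reaches the full command range at maximum deflection.
    vz = head_to_velocity(angle_deg=15.0, neutral_deg=0.0, rom_deg=20.0)

Because the deflection is normalized per user, a restricted ROM shrinks the required head movement rather than the achievable command range, which matches the per-user adaptation described in the abstract.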


    Title:

    AMiCUS - Bewegungssensor-basiertes Human-Robot Interface zur intuitiven Echtzeit-Steuerung eines Roboterarmes mit Kopfbewegungen ; AMiCUS - Motion Sensor-based Human-Robot Interface for Intuitive Realtime Control of a Robot Arm Using Head Motion


    Publication date:

    2017-09-14


    Media type:

    University thesis


    Format:

    Electronic resource


    Language:

    German


    Classification:

    DDC: 620 / 629




    Similar items:

    Human motion imitation robot

    Thaker, Swapnil Amit | BASE | 2019

    Open access

    The human/robot interface

    Wiker, S. F. | British Library Online Contents | 1993

    A Distributed Tactile Sensor for Intuitive Human-Robot Interfacing

    Cirillo, Andrea / Cirillo, Pasquale / De Maria, Giuseppe et al. | BASE | 2017

    Open access

    Human–Robot Interaction Based on Motion and Force Control

    Karlsson, Martin | BASE | 2019

    Open access