To realize a higher level of autonomy in surgical knot tying for minimally invasive surgery (MIS), automated suture grasping, which bridges the suture stitching and looping procedures, is an important yet challenging task that needs to be achieved. This paper presents a holistic framework with image-guided and automation techniques to robotize this operation even in complex environments. The whole task is initialized by suture segmentation, for which we propose a novel semi-supervised learning architecture featuring a suture-aware loss to pertinently learn the suture's slender structure from both annotated and unannotated data. With successful segmentation in the stereo camera views, we develop a Sampling-based Sliding Pairing (SSP) algorithm to optimize the suture's 3D shape online. By jointly studying the robotic configuration and the suture's spatial characteristics, a target function is introduced to find the optimal grasping pose of the surgical tool under Remote Center of Motion (RCM) constraints. To compensate for inherent errors and practical uncertainties, a unified grasping strategy with a novel vision-based mechanism is introduced to autonomously accomplish the grasping task. Our framework is extensively evaluated, covering learning-based segmentation, 3D reconstruction, and image-guided grasping, on the da Vinci Research Kit (dVRK) platform, where it achieves high performance and success rates in both perception and robotic manipulation. These results prove the feasibility of our approach in automating the suture grasping task, and this work fills the gap between automated surgical stitching and looping, stepping towards a higher level of task autonomy in surgical knot tying.
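
For orientation, the Python sketch below illustrates the kind of geometry the abstract describes: triangulating a segmented suture centreline from a rectified stereo pair and reading a grasp point off the reconstructed curve. It is only a minimal illustration under assumed camera parameters and an assumed left/right point pairing; it is not the paper's SSP pairing algorithm, its suture-aware segmentation loss, or its RCM-constrained grasping objective.

# Illustrative sketch only (not the paper's method): triangulate a segmented
# suture centreline from a rectified stereo pair and pick a naive grasp point.
# Camera parameters, the point pairing, and the grasp heuristic are assumptions.
import numpy as np

def triangulate_rectified(left_px, right_px, fx, baseline, cx, cy):
    """Recover 3D points (camera frame, metres) from matched pixels in a
    rectified stereo pair via the disparity relation Z = fx * b / d."""
    left_px = np.asarray(left_px, dtype=float)
    right_px = np.asarray(right_px, dtype=float)
    disparity = np.clip(left_px[:, 0] - right_px[:, 0], 1e-6, None)
    z = fx * baseline / disparity
    x = (left_px[:, 0] - cx) * z / fx
    y = (left_px[:, 1] - cy) * z / fx
    return np.stack([x, y, z], axis=1)

def pick_grasp(curve_3d):
    """Take the mid-point of the ordered suture curve as the grasp position
    and its local tangent as a suggested approach direction."""
    mid = len(curve_3d) // 2
    tangent = curve_3d[min(mid + 1, len(curve_3d) - 1)] - curve_3d[max(mid - 1, 0)]
    tangent = tangent / (np.linalg.norm(tangent) + 1e-9)
    return curve_3d[mid], tangent

if __name__ == "__main__":
    # Synthetic, already-ordered centreline pixels standing in for the
    # skeletonised segmentation masks of the left and right views.
    u = np.linspace(300.0, 340.0, 20)
    v = np.linspace(200.0, 260.0, 20)
    left = np.stack([u, v], axis=1)
    right = np.stack([u - 25.0, v], axis=1)      # constant 25-pixel disparity
    curve = triangulate_rectified(left, right, fx=1000.0, baseline=0.005,
                                  cx=320.0, cy=240.0)
    pos, approach = pick_grasp(curve)
    print("grasp position (m):", pos, "approach direction:", approach)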


    Title:

    Toward Image-Guided Automated Suture Grasping Under Complex Environments: A Learning-Enabled and Optimization-Based Holistic Framework


    Contributors:
    Lu, Bo (author) / Li, Bin (author) / Chen, Wei (author) / Jin, Yueming (author) / Zhao, Zixu (author) / Dou, Qi (author) / Heng, Pheng-Ann (author) / Liu, Yunhui (author)

    Publication date:

    2021-12-29


    Notes:

    IEEE Transactions on Automation Science and Engineering (2021)


    Media type:

    Article (journal)


    Format:

    Electronic resource


    Language:

    English



    Classification:

    DDC:    629



    Similar titles:

    Stereo-Vision-Guided Object Grasping

    Nguyen, M.-C. / Graefe, V. / International Symposium on Automotive Technology and Automation | British Library Conference Proceedings | 1999


    Stereo-vision-guided object grasping

    Nguyen, M.C. / Graefe, V. / Univ. d. Bundeswehr Muenchen, DE | Kraftfahrwesen | 1999


    Autonomous vision-guided bi-manual grasping and manipulation

    Rastegarpanah, Alireza | BASE | 2017

    Open access


    Eye in hand robot arm based automated object grasping system

    Ishak, Asnor Juraiza / Mahmood, Sarmad Nozad | BASE | 2019

    Open access