Percutaneous nephrolithotomy (PCNL) is the current standard of care for patients with a total renal stone burden greater than 20 mm. Gaining access to the kidney is a crucial step, as the position of the percutaneous tract can affect the ability to manipulate a nephroscope during the procedure. However, gaining percutaneous access under fluoroscopic guidance has a steep learning curve, and only a minority of urologists can successfully establish access themselves. In addition to difficult access, PCNL carries a risk of bleeding and the need for blood transfusion. Robotic assistance may be key to accurate and reliable access. Beyond assisting with renal access, a robotic platform can record clinically relevant data on the user's activities via sensor-equipped instruments. Analyzing these activities is crucial for understanding what constitutes a successful and safe procedure. In this paper, we harness machine learning to automatically analyze physicians' activities during robotic-assisted renal access using the Monarch® Platform, Urology. A machine learning framework combining a 1-dimensional U-Net and random forests was developed to find consistent patterns in the sensor data characteristic of needle insertions. This framework was used to retrospectively analyze data previously obtained from 248 percutaneous renal access procedures, performed on 18 human cadaveric models by 17 practicing urologists and one urologist proxy. The framework automatically recognized 94% of all first needle insertions in each procedure and labeled them with an accuracy of 0.81 in terms of the Dice coefficient. The recognition accuracy for secondary insertions was 66%. The automatically detected needle insertions were used to calculate clinical metrics such as tract length and the anterior-posterior and cranial-caudal angles of the insertion site, as well as user skill metrics such as trajectory deviation and targeting accuracy.
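
The abstract names the main computational ingredients: a 1-dimensional U-Net that labels needle-insertion intervals in the recorded sensor streams, and the Dice coefficient used to score those labels. As an illustration only, the Python/PyTorch sketch below shows what such a 1-D segmentation network and Dice scorer could look like; the channel count, layer sizes, and synthetic data are assumptions made for this example, and the random-forest post-processing mentioned in the abstract is not reproduced here. This is not the authors' published implementation.

# Minimal sketch: 1-D U-Net labeling needle-insertion samples in a
# multichannel sensor time series, plus a Dice-coefficient scorer.
# Layer sizes and the 8-channel input are assumptions for illustration.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Two 1-D convolutions with batch norm and ReLU."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm1d(out_ch), nn.ReLU(inplace=True),
            nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm1d(out_ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class UNet1D(nn.Module):
    """Encoder-decoder with skip connections over the time axis."""
    def __init__(self, n_channels=8, base=16):
        super().__init__()
        self.enc1 = ConvBlock(n_channels, base)
        self.enc2 = ConvBlock(base, base * 2)
        self.pool = nn.MaxPool1d(2)
        self.bottleneck = ConvBlock(base * 2, base * 4)
        self.up2 = nn.ConvTranspose1d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = ConvBlock(base * 4, base * 2)
        self.up1 = nn.ConvTranspose1d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = ConvBlock(base * 2, base)
        self.head = nn.Conv1d(base, 1, kernel_size=1)  # per-sample insertion logit

    def forward(self, x):                          # x: (batch, channels, time)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                       # (batch, 1, time) logits


def dice_coefficient(pred_mask, true_mask, eps=1e-8):
    """Overlap between predicted and annotated insertion intervals."""
    intersection = (pred_mask * true_mask).sum()
    return (2.0 * intersection + eps) / (pred_mask.sum() + true_mask.sum() + eps)


if __name__ == "__main__":
    model = UNet1D(n_channels=8)
    signal = torch.randn(1, 8, 1024)               # synthetic 8-channel sensor trace
    pred = (torch.sigmoid(model(signal)) > 0.5).float()
    truth = torch.zeros_like(pred)
    truth[..., 400:650] = 1.0                      # synthetic ground-truth insertion
    print(f"Dice: {dice_coefficient(pred, truth).item():.3f}")

In such a design, the network produces a per-sample probability of "needle insertion in progress", which can then be thresholded into intervals and, as the abstract suggests, refined by a classifier such as a random forest before computing metrics like tract length or insertion angles.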





    Title:

    Deep learning for detection of clinical operations in robot-assisted percutaneous renal access


    Contributors:
    Ibragimov, Bulat (author) / Zhen, Janet (author) / Ayvali, Elif (author)

    Publication date:

    2023-01-01


    Notes:

    Ibragimov, B., Zhen, J. & Ayvali, E. 2023, 'Deep learning for detection of clinical operations in robot-assisted percutaneous renal access', IEEE Access, vol. 11, pp. 90358-90366. https://doi.org/10.1109/ACCESS.2023.3305246



    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English



    Classification:

    DDC: 629



    Percutaneous inner-ear access via an image-guided industrial robot system

    Baron, S. / Eilers, H. / Munske, B. et al. | BASE | 2010

    Free access


    Posture optimization in robot-assisted machining operations

    Zargarbashi, S.H.H. | Online Contents | 2012


    Tree Trunks Cross-Platform Detection Using Deep Learning Strategies for Forestry Operations

    da Silva, Daniel Queirós / dos Santos, Filipe Neves / Filipe, Vítor et al. | Springer Verlag | 2022