In this paper, we propose a novel, camera-based, real-time framework for detecting and tracking vulnerable road users such as pedestrians and cyclists. Our framework also provides a measure of the degree of vulnerability based on the direction of movement and the distance from the vulnerable region. Pedestrians and cyclists are the most vulnerable road users, and it is necessary to develop automated systems that can detect them and ensure their safety by alerting the driver. In our framework, we apply a deep learning-based 2D pose detection method to detect pedestrians and cyclists in the view of an outward-looking camera mounted on the dashboard of a vehicle. As the vehicle moves, the pedestrians and cyclists are detected and tracked across frames, their degree of vulnerability is measured, and the driver is alerted in case of a high vulnerability score. Experimental results show that our framework is able to accurately detect vulnerable road users and measure their degree of vulnerability.
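To illustrate how a degree-of-vulnerability measure could combine direction of movement with distance to the vehicle path, the following Python sketch gives a minimal, hypothetical example. The paper's actual scoring formula is not reproduced here; all names (VRUTrack, vulnerability_score, ALERT_THRESHOLD, path_half_width) and the specific weighting are assumptions made for illustration only.

    # Illustrative sketch only: the paper's exact scoring formula is not published here.
    # All names and constants below are hypothetical.
    import math
    from dataclasses import dataclass

    ALERT_THRESHOLD = 0.3  # assumed alert cut-off, not taken from the paper

    @dataclass
    class VRUTrack:
        """A tracked vulnerable road user (pedestrian or cyclist)."""
        x: float   # lateral offset from the vehicle path, metres
        z: float   # longitudinal distance ahead of the vehicle, metres
        vx: float  # lateral velocity, m/s
        vz: float  # longitudinal velocity, m/s

    def vulnerability_score(track: VRUTrack, path_half_width: float = 1.5) -> float:
        """Combine distance to the vehicle path and direction of motion into a score in [0, 1]."""
        # Distance term: the closer the user is to the vehicle path, the higher the risk.
        lateral_margin = max(abs(track.x) - path_half_width, 0.0)
        distance = math.hypot(lateral_margin, track.z)
        distance_term = 1.0 / (1.0 + distance)

        # Direction term: moving toward the path (x and vx of opposite sign) raises the risk.
        approaching = -track.x * track.vx
        direction_term = 1.0 / (1.0 + math.exp(-approaching))  # squash to (0, 1)

        return distance_term * direction_term

    if __name__ == "__main__":
        # Pedestrian 1.5 m ahead, 1 m to the side, stepping toward the vehicle path.
        ped = VRUTrack(x=1.0, z=1.5, vx=-1.2, vz=0.0)
        score = vulnerability_score(ped)
        if score > ALERT_THRESHOLD:
            print(f"ALERT: vulnerability score {score:.2f}")
        else:
            print(f"score {score:.2f}")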


    Title:

    Deep Learning based Vulnerable Road User Detection and Collision Avoidance


    Contributors:


    Publication date:

    2018-09-01


    Format / Extent:

    2953215 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    VULNERABLE ROAD USER (VRU) COLLISION AVOIDANCE SYSTEM

    ELIMALEH YANIV | Europäisches Patentamt | 2023

    Open access

    VULNERABLE ROAD USER (VRU) COLLISION AVOIDANCE SYSTEM

    ELIMALEH YANIV | Europäisches Patentamt | 2024

    Open access

    Communication-based collision avoidance between vulnerable road users and cars

    Segata, Michele / Vijeikis, Romas / Cigno, Renato Lo | IEEE | 2017



    Research on Vulnerable Road User Detection Algorithm based on Improved Deep Learning

    Sun, Bowen / Lv, Fengyao / Zhou, Guilin et al. | SAE Technical Papers | 2023