Perception in autonomous driving is essential for vehicle localization and route planning. However, relying solely on on-board sensors makes it difficult to achieve reliable, all-around perception. In this paper, we propose a method for constructing a scene situation map based on vehicle-road coordination. First, we propose a roadside fusion detection method that combines a panoramic camera and LiDAR to extract dynamic targets in the road environment. Then, on the vehicle side, we detect and track dynamic targets and associate them with the roadside targets through trajectory matching. Finally, we build a graph optimization model linking the vehicle, the roadside, and the HD map, and by projecting the optimized dynamic targets onto the high-precision map, we generate a scene situation map. The effectiveness of the proposed method is validated through benchmark tests and a real-world field test.
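
As a rough illustration of the vehicle-roadside association step described in the abstract, the sketch below matches vehicle-side and roadside dynamic-target trajectories with a minimum-cost (Hungarian) assignment. This is a minimal Python sketch under assumed conventions: the function names, the 2 m gating threshold, and the use of a shared map coordinate frame are illustrative and not taken from the paper.

# Illustrative sketch (not the paper's implementation): associate vehicle-side
# and roadside dynamic-target trajectories by minimum mean distance.
import numpy as np
from scipy.optimize import linear_sum_assignment


def trajectory_distance(traj_a: np.ndarray, traj_b: np.ndarray) -> float:
    """Mean Euclidean distance between two time-aligned (T, 2) trajectories."""
    t = min(len(traj_a), len(traj_b))
    return float(np.linalg.norm(traj_a[:t] - traj_b[:t], axis=1).mean())


def match_trajectories(vehicle_trajs, roadside_trajs, max_dist_m=2.0):
    """Hungarian assignment between trajectory sets; pairs whose mean
    distance exceeds max_dist_m (an assumed gating threshold) are dropped."""
    cost = np.array([[trajectory_distance(v, r) for r in roadside_trajs]
                     for v in vehicle_trajs])
    rows, cols = linear_sum_assignment(cost)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_dist_m]


if __name__ == "__main__":
    # Two vehicle-side tracks and two roadside tracks, already expressed
    # in a common map frame (x, y per time step).
    v = [np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.2]]),
         np.array([[5.0, 5.0], [5.5, 6.0], [6.0, 7.0]])]
    r = [np.array([[5.1, 5.1], [5.6, 6.1], [6.1, 7.1]]),
         np.array([[0.1, 0.0], [1.1, 0.1], [2.1, 0.2]])]
    print(match_trajectories(v, r))  # -> [(0, 1), (1, 0)]

A bipartite assignment keeps each roadside track matched to at most one vehicle-side track, which is the usual reason to prefer it over greedy nearest-neighbour matching when targets move close together.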


    Title:

    Enhancing Autonomous Driving Through Collaborative Perception and Scene Situation Map Construction


    Contributors:
    Zhang, Zufeng (author) / Yin, Jialun (author) / Tao, Qianwen (author) / Lu, Weike (author) / Zhang, Xuefeng (author)


    Publication date:

    24 September 2024


    Format / Extent:

    8119383 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English




    Collaborative Perception Datasets in Autonomous Driving: A Survey

    Yazgan, Melih / Akkanapragada, Mythra Varun / Marius Zollner, J. | IEEE | 2024



    Enhancing Scene Simulation for Autonomous Driving with Neural Point Rendering

    Yang, Junqing / Yan, Yuxi / Chen, Shitao et al. | IEEE | 2023