Spatial-temporal scene graphs (STSGs) are emerging as a representation for motion prediction in autonomous driving. Existing work focuses on the graph structure and the corresponding graph neural network models, ignoring the challenge of constructing STSGs on real-world autonomous vehicles. In this paper, we propose a method for robustly constructing STSGs in the presence of perception failures that may occur on vehicles operating in the real world. We first propose an object-oriented lifecycle management module that identifies abnormal nodes by scoring, handling possible missed and false detections in perception. We then employ a Kalman filter to predict the states of missing nodes and fill in the lost information, and develop a novel bipartite-graph matching strategy based on the Kuhn-Munkres algorithm to re-match the abnormal nodes. Experimental results on public datasets show that the proposed method effectively corrects errors in raw perception results, thereby improving the stability and reliability of the constructed STSG.
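The abstract names two standard recovery components: Kalman-filter prediction of a missing node's state and Kuhn-Munkres bipartite matching to re-associate abnormal nodes with detections. The following is a minimal sketch of both steps, not the authors' implementation: the constant-velocity motion model, frame interval, process noise, distance gate, and all function names are illustrative assumptions, and SciPy's linear_sum_assignment is used as a stand-in solver for the Kuhn-Munkres assignment problem.

```python
# Minimal sketch, assuming a constant-velocity model and 2-D positions.
import numpy as np
from scipy.optimize import linear_sum_assignment

DT = 0.1  # assumed frame interval (s)

# State [x, y, vx, vy]; F advances it one frame under constant velocity.
F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
Q = 0.01 * np.eye(4)  # assumed process-noise covariance

def kalman_predict(x, P):
    """Predict the state of a node whose detection was missed, so the
    lost information can be filled in for the current frame."""
    return F @ x, F @ P @ F.T + Q

def rematch(pred_xy, det_xy, gate=5.0):
    """Re-match predicted node positions to fresh detections via the
    Kuhn-Munkres algorithm (here SciPy's linear_sum_assignment).
    Pairs whose distance exceeds the gate are left unmatched."""
    cost = np.linalg.norm(pred_xy[:, None, :] - det_xy[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]

# Example: one node missed by the detector for a single frame.
x, P = np.array([10.0, 4.0, 1.0, 0.0]), np.eye(4)
x_pred, P = kalman_predict(x, P)
detections = np.array([[10.2, 4.05], [30.0, 8.0]])
print(rematch(x_pred[None, :2], detections))  # -> [(0, 0)]
```

The distance gate plays the role of the abstract's abnormal-node screening in miniature: a predicted node is never forced onto a distant (likely false) detection, so unmatched nodes can be handed back to the lifecycle management module.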


    Title:

    Robust Construction of Spatial-Temporal Scene Graph Considering Perception Failures for Autonomous Driving


    Contributors:
    Li, Yongwei (author) / Song, Tao (author) / Wu, Xinkai (author)


    Publication date:

    2023-09-24


    Format / extent:

    747141 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Scene-Graph Embedding for Robust Autonomous Vehicle Perception

    Yu, Shih-Yuan / Malawade, Arnav Vaibhav / Faruque, Mohammad Abdullah Al | Springer Verlag | 2023



    Trajectory prediction for autonomous driving based on multiscale spatial‐temporal graph

    Tang, Luqi / Yan, Fuwu / Zou, Bin et al. | Wiley | 2023

    Open access

    Trajectory prediction for autonomous driving based on multiscale spatial‐temporal graph

    Luqi Tang / Fuwu Yan / Bin Zou et al. | DOAJ | 2023

    Open access