Autonomous driving technologies have made great strides in recent years, with several companies and research groups coming close to producing fully autonomous vehicles. Self-driving cars offer many advantages, including increased traffic safety and improved ride-sharing capabilities that reduce environmental impact. To achieve these benefits, many modules must work together on an autonomous platform to solve the many tasks required. One of these tasks is the prediction of the future positions and maneuvers of surrounding human drivers. Autonomous driving platforms must be able to reason about, and predict, the future trajectories of other agents in traffic scenarios so that they can ensure their planned maneuvers remain safe and feasible throughout their execution. Due to the stochastic nature of many traffic scenarios, these predictions should also account for the inherent uncertainty caused by both the road structure and the driving styles of human drivers. Since many traffic scenarios involve vehicles changing their behavior based on the actions of others, for example by yielding or changing lanes, these interactions should be taken into account to produce more robust predictions. Lastly, the prediction methods should also provide a level of transparency and traceability. On a self-driving platform with many safety-critical tasks, it is important to be able to identify where an error occurred in a failure case, and what caused it. This helps prevent the problem from recurring and can also aid in finding new and relevant test cases for simulation. In this thesis, we present a deep-learning-based framework for vehicle trajectory prediction that fulfills these criteria. We first show that, by operating on a generic representation of the traffic scene, our model can implicitly learn interactions between vehicles by capturing the spatio-temporal features in the data using recurrent and convolutional operations, and produce predictions for all vehicles ...
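The abstract only describes the approach at a high level, and the thesis's exact architecture is not reproduced on this record page. Purely as an illustration of the kind of model the abstract mentions (convolutional encoding of a rasterized scene representation, a recurrent layer over time, and a per-step Gaussian uncertainty output), here is a minimal, hypothetical PyTorch sketch; the class name, tensor shapes, and hyperparameters are assumptions, not details taken from the thesis:

```python
import torch
import torch.nn as nn

class ConvRecurrentTrajectoryPredictor(nn.Module):
    """Illustrative sketch (not the thesis's model): encode each time step of a
    rasterized traffic scene with a small CNN, aggregate the sequence with an
    LSTM, and output a Gaussian (mean + log-variance) for every future position."""

    def __init__(self, in_channels=3, hidden=128, horizon=12):
        super().__init__()
        self.horizon = horizon
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),        # -> (B*T, 64)
        )
        self.rnn = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon * 4)        # mu_x, mu_y, logvar_x, logvar_y

    def forward(self, scenes):                            # scenes: (B, T, C, H, W)
        b, t = scenes.shape[:2]
        feats = self.cnn(scenes.flatten(0, 1)).view(b, t, -1)   # per-frame features
        _, (h, _) = self.rnn(feats)                             # last hidden state
        out = self.head(h[-1]).view(b, self.horizon, 4)
        mean, logvar = out[..., :2], out[..., 2:]
        return mean, logvar                               # per-step Gaussian over (x, y)

# Example usage with random data (shapes are assumptions):
model = ConvRecurrentTrajectoryPredictor()
scenes = torch.randn(8, 5, 3, 64, 64)      # 8 scenes, 5 past frames, 3-channel raster
mean, logvar = model(scenes)               # each of shape (8, 12, 2)
```

In a sketch like this, training would typically minimize a Gaussian negative log-likelihood over the predicted means and variances (e.g. torch.nn.GaussianNLLLoss with var = logvar.exp()), which is one common way to expose predictive uncertainty; whether the thesis uses this exact formulation is not stated on this page.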





    Title:

    Interpretable, Interaction-Aware Vehicle Trajectory Prediction with Uncertainty


    Contributors:

    Publication date:

    2021-01-01


    Media type:

    Thesis (university publication)


    Format:

    Electronic resource


    Language:

    English


    Classification:

    DDC: 629



    Variational Autoencoder-Based Vehicle Trajectory Prediction with an Interpretable Latent Space

    Neumeier, Marion / Betsch, Michael / Tollkuhn, Andreas et al. | IEEE | 2021


    Interpretable Goal-Based model for Vehicle Trajectory Prediction in Interactive Scenarios

    Ghoul, Amina / Yahiaoui, Itheri / Verroust-Blondet, Anne et al. | IEEE | 2023


    Interaction-Aware Trajectory Prediction with Point Transformer

    Liu, Yahui / Dai, Xingyuan / Fang, Jianwu et al. | IEEE | 2023


    Interpretable Long Term Waypoint-Based Trajectory Prediction Model

    Ghoul, Amina / Yahiaoui, Itheri / Nashashibi, Fawzi | IEEE | 2023