The intricate interactions with other road users and the diversity of traffic environments make decision-making a challenging task for autonomous driving systems. Offline learning solutions are valued for their high execution efficiency and their ability to approximate the optimal policy across the entire state space, but they are often unsafe and fragile when encountering untrained states. Conversely, online planning methods can thoroughly assess at run time how current decisions influence future outcomes and therefore generalize better; however, they suffer from low execution efficiency and are prone to becoming stuck in locally optimal solutions. In this context, this article proposes an integrated planning and learning (IPL) algorithm based on the reinforcement learning (RL) framework for speed and lane-change decision-making on highways. Specifically, at each decision time step, the method uses the offline-learned model to guide an online Monte Carlo Tree Search (MCTS) for heuristic search, producing a forward-looking policy. The experimental results show that the IPL algorithm generalizes better when faced with unknown scenarios and achieves higher asymptotic performance than the benchmark algorithms. In addition, compared with the MCTS-based online planning method, the IPL algorithm improves execution efficiency and comes closer to the global optimum.
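Since this record carries only the abstract, the following Python snippet is a minimal, hypothetical sketch of how an offline-learned model can guide MCTS in the way the abstract describes: the model supplies action priors and value estimates that steer a PUCT-style tree search over discrete speed and lane-change actions. The action set, the model(state) and simulate(state, action) interfaces, and the exploration constant are illustrative assumptions, not the authors' implementation.

# Sketch of model-guided MCTS for discrete highway driving decisions.
# The network interface, simulator, and constants are assumptions for
# illustration only; terminal/collision handling is omitted for brevity.
import math

ACTIONS = ["keep_lane", "change_left", "change_right", "accelerate", "decelerate"]

class Node:
    def __init__(self, prior):
        self.prior = prior          # P(s, a) from the offline-learned policy
        self.visit_count = 0        # N(s, a)
        self.value_sum = 0.0        # W(s, a)
        self.children = {}          # action -> Node

    def value(self):
        return self.value_sum / self.visit_count if self.visit_count else 0.0

def select_child(node, c_puct=1.5):
    """PUCT rule: exploit the mean value, explore in proportion to the prior."""
    total = sum(child.visit_count for child in node.children.values())
    def score(item):
        _, child = item
        u = c_puct * child.prior * math.sqrt(total + 1) / (1 + child.visit_count)
        return child.value() + u
    return max(node.children.items(), key=score)

def mcts_search(root_state, model, simulate, num_simulations=100):
    """model(state) -> (priors over ACTIONS, value); simulate(state, action) -> next state.
    Both are assumed interfaces standing in for the learned network and traffic model."""
    priors, _ = model(root_state)
    root = Node(prior=1.0)
    root.children = {a: Node(priors[a]) for a in ACTIONS}

    for _ in range(num_simulations):
        node, state, path = root, root_state, []
        # Selection: descend until an unexpanded leaf is reached.
        while node.children:
            action, node = select_child(node)
            state = simulate(state, action)
            path.append(node)
        # Expansion + evaluation: the offline model supplies priors and a value.
        priors, value = model(state)
        node.children = {a: Node(priors[a]) for a in ACTIONS}
        # Backpropagation along the visited path.
        for n in path:
            n.visit_count += 1
            n.value_sum += value

    # Act greedily with respect to visit counts, a common choice at execution time.
    return max(root.children.items(), key=lambda kv: kv[1].visit_count)[0]

Here model could be any policy/value network trained offline with RL, and simulate any forward traffic model; guiding the search with learned priors is what lets a small simulation budget remain forward-looking.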


    Title:

    Integration of Planning and Deep Reinforcement Learning in Speed and Lane Change Decision-Making for Highway Autonomous Driving


    Contributors:
    Zhang, Sunan (author) / Zhuang, Weichao (author) / Li, Bingbing (author) / Li, Ke (author) / Xia, Tianyu (author) / Hu, Bo (author)


    Publication date:

    1 February 2025


    Format / Extent:

    3069512 bytes




    Media type:

    Article (journal)


    Format:

    Electronic resource


    Language:

    English