In this paper, we investigate the challenge of using Pre-trained Language Models (PLMs) for continual task planning. It is difficult for a PLM-based planner to incorporate incremental experience without risking catastrophic forgetting or overwhelming the model parameters. Inspired by human cognition, we propose the Experience Adapter, a novel method that avoids the need for model re-training or fine-tuning. The adapter continually collects experience externally, including observation memory and human feedback, represented as a memory graph and a set of rules. Using these, the adapter directs task planning and corrects behavior that does not align with human expectations. Because our method does not rely on the planner's internal structure, it pairs easily with various foundational planning methods. In experiments on everyday tasks in the VirtualHome environment, we show that our approach significantly improves the task success rate from 47% to 64%. This non-invasive method fits seamlessly into existing model-serving pipelines without altering model training.
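
The abstract describes the method only at a high level. As a rough illustration of the core idea (an external, continually growing store of experience, kept as a memory graph of observed facts plus human-feedback rules, wrapped around a frozen PLM planner), the following is a minimal Python sketch. Every name in it (ExperienceAdapter, plan_fn, add_observation, and so on) is hypothetical; the paper's actual memory-graph retrieval and behavior-correction mechanisms are not reproduced here.

```python
# Minimal sketch of the "Experience Adapter" idea from the abstract above:
# a frozen PLM planner is wrapped by a component that accumulates experience
# outside the model (a memory graph of observations plus human-feedback rules)
# and injects it into planning. All names are illustrative, not the paper's API.
from collections import defaultdict
from typing import Callable, List


class ExperienceAdapter:
    def __init__(self, plan_fn: Callable[[str, str], List[str]]):
        # plan_fn: a frozen PLM planner taking (task, experience_context) and
        # returning a list of action strings; it is never re-trained here.
        self.plan_fn = plan_fn
        self.memory_graph = defaultdict(set)  # subject -> {(relation, object)}
        self.rules: List[str] = []            # natural-language constraints

    def add_observation(self, subject: str, relation: str, obj: str) -> None:
        # Record an observed environment fact as an edge in the memory graph,
        # e.g. ("milk", "inside", "fridge").
        self.memory_graph[subject].add((relation, obj))

    def add_feedback(self, rule: str) -> None:
        # Record a human correction as a reusable rule,
        # e.g. "never open microwave".
        self.rules.append(rule)

    def _context(self) -> str:
        # Serialize graph edges and rules into a prompt prefix; a real system
        # would retrieve only the facts relevant to the current task.
        facts = [f"{s} {r} {o}"
                 for s, edges in self.memory_graph.items() for r, o in edges]
        return ("Known facts: " + "; ".join(facts) +
                ". Rules: " + "; ".join(self.rules))

    def _violates_rule(self, step: str) -> bool:
        # Toy correction check: a rule phrased as "never <action>" blocks any
        # planned step that contains that action.
        return any(r.lower().startswith("never ") and r[6:].lower() in step.lower()
                   for r in self.rules)

    def plan(self, task: str) -> List[str]:
        # Plan with accumulated experience injected, then drop forbidden steps.
        steps = self.plan_fn(task, self._context())
        return [s for s in steps if not self._violates_rule(s)]


if __name__ == "__main__":
    # Stub planner standing in for the frozen PLM.
    stub = lambda task, ctx: ["walk to fridge", "open fridge",
                              "grab milk", "open microwave"]
    adapter = ExperienceAdapter(stub)
    adapter.add_observation("milk", "inside", "fridge")
    adapter.add_feedback("never open microwave")
    print(adapter.plan("get the milk"))  # "open microwave" is filtered out
```

Because the adapter only reads and writes external memory and post-filters the planner's output, it can sit in front of any planner with a compatible interface, which is the "non-invasive" property the abstract claims.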


    Title:

    Experience Adapter: Adapting Pre-trained Language Models for Continual Task Planning


    Additional title:

    Lecture Notes in Computer Science


    Contributors:
    Yang, Huayong (editor) / Liu, Honghai (editor) / Zou, Jun (editor) / Yin, Zhouping (editor) / Liu, Lianqing (editor) / Yang, Geng (editor) / Ouyang, Xiaoping (editor) / Wang, Zhiyong (editor) / Zhang, Jiatao (author) / Liao, Jianfeng (author)

    Conference:

    International Conference on Intelligent Robotics and Applications; 2023; Hangzhou, China; July 05-07, 2023



    Publication date :

    2023-10-16


    Size:

    12 pages




    Type of media :

    Article/Chapter (Book)


    Type of material :

    Electronic Resource


    Language :

    English




    Similar titles:

    Experience Adapter: Adapting Pre-trained Language Models for Continual Task Planning

    Zhang, Jiatao / Liao, Jianfeng / Hu, Tuocheng et al. | TIBKAT | 2023



    Aero-generator mounting adapter and adapting mounting system

    SU HUI / LI GAIQI / JIANG PING et al. | European Patent Office | 2015


    LLMSTP: Empowering Swarm Task Planning with Large Language Models

    Yu, Hongbo / Wang, Chang / Wu, Lizhen et al. | Springer Verlag | 2025