In this paper, we investigate the challenge of using Pre-trained Language Models (PLMs) for continual task planning. A PLM-based planner cannot easily incorporate incremental experience without risking catastrophic forgetting or overwhelming the model parameters. Inspired by human cognition, we propose the Experience Adapter, a novel method that avoids the need for model re-training or fine-tuning. The adapter continually collects experience externally, including observation memory and human feedback, represented as a memory graph and rules. Using these, the adapter directs task planning and corrects behavior that does not align with human expectations. Because our method does not rely on the planner's internal structure, it pairs easily with a variety of underlying planning methods. In experiments on everyday tasks in the VirtualHome environment, we show that our approach significantly improves the task success rate from 47% to 64%. This non-invasive method fits seamlessly into existing model-serving pipelines without altering model training.
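The abstract describes the mechanism only at a high level. As a rough, minimal sketch of the idea (assuming nothing beyond the abstract), the Python below keeps experience outside a frozen PLM planner as an observation memory plus human-feedback rules, injects remembered facts into the planning prompt, and rewrites actions that violate a rule. All names here (ExperienceAdapter, plan_fn, observe, add_rule) are hypothetical illustrations, not the paper's actual API.

from typing import Callable

class ExperienceAdapter:
    """Minimal sketch: an external experience store wrapped around a frozen planner.

    Hypothetical illustration of the abstract's idea; not the paper's API.
    """

    def __init__(self, plan_fn: Callable[[str], list[str]]):
        self.plan_fn = plan_fn  # frozen PLM-based planner: prompt -> list of actions
        self.memory = {}        # observation memory: entity -> list of observed facts
        self.rules = []         # human feedback as (offending action, correction) pairs

    def observe(self, entity: str, fact: str) -> None:
        # Store experience externally; the planner's parameters are never touched.
        self.memory.setdefault(entity, []).append(fact)

    def add_rule(self, offending: str, correction: str) -> None:
        # Human feedback becomes a persistent correction rule.
        self.rules.append((offending, correction))

    def plan(self, task: str) -> list[str]:
        # Direct planning by prepending remembered facts to the prompt.
        facts = [f for fs in self.memory.values() for f in fs]
        prompt = "Known facts:\n" + "\n".join(facts) + "\nTask: " + task
        actions = self.plan_fn(prompt)
        # Correct any proposed action that matches a human-feedback rule.
        return [next((fix for bad, fix in self.rules if bad in act), act)
                for act in actions]

# Example with a trivial stand-in planner:
# adapter = ExperienceAdapter(lambda p: ["walk to kitchen", "open fridge"])
# adapter.observe("fridge", "the milk is in the fridge")
# adapter.add_rule("open fridge", "open the fridge door gently")
# adapter.plan("get milk")  # -> ["walk to kitchen", "open the fridge door gently"]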


    Title:

    Experience Adapter: Adapting Pre-trained Language Models for Continual Task Planning


    Additional title information:

    Lecture Notes in Computer Science


    Contributors:
    Yang, Huayong (editor) / Liu, Honghai (editor) / Zou, Jun (editor) / Yin, Zhouping (editor) / Liu, Lianqing (editor) / Yang, Geng (editor) / Ouyang, Xiaoping (editor) / Wang, Zhiyong (editor) / Zhang, Jiatao (author) / Liao, Jianfeng (author)

    Conference:

    International Conference on Intelligent Robotics and Applications; 2023; Hangzhou, China; July 05-07, 2023



    Publication date:

    October 16, 2023


    Format / Extent:

    12 pages




    Media type:

    Article/Chapter (Book)


    Format:

    Electronic resource


    Language:

    English




    Similar documents:

    Experience Adapter: Adapting Pre-trained Language Models for Continual Task Planning

    Zhang, Jiatao / Liao, Jianfeng / Hu, Tuocheng et al. | TIBKAT | 2023


    Adapter part for adapting a coupling head, coupling head having an adapter part and method for adapting a coupling head

    JURSS DOMINIK / KONTETZKI ARTHUR | Europäisches Patentamt | 2020

    Free access

    Aero-generator mounting adapter and adapting mounting system

    SU HUI / LI GAIQI / JIANG PING et al. | Europäisches Patentamt | 2015

    Free access


    LLMSTP: Empowering Swarm Task Planning with Large Language Models

    Yu, Hongbo / Wang, Chang / Wu, Lizhen et al. | Springer Verlag | 2025