In this paper, we investigate the challenge of using Pre-trained Language Models (PLMs) for continual task planning. A PLM-based planner struggles to incorporate incremental experience without risking catastrophic forgetting or overwhelming the model's parameters. Inspired by human cognition, we propose the Experience Adapter, a novel method that avoids model re-training or fine-tuning. The adapter continually collects experience externally, including observation memory and human feedback, represented as a memory graph and rules. Using these, the adapter directs task planning and corrects behavior that does not align with human expectations. Because our method does not rely on the planner's internal structure, it pairs easily with a variety of underlying planning methods. In experiments on everyday tasks in the VirtualHome environment, we show that our approach significantly improves the task success rate from 47% to 64%. This non-invasive method fits seamlessly into existing model-serving pipelines without altering model training.
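The adapter loop the abstract describes can be illustrated with a minimal sketch: a frozen PLM planner is wrapped by an external store holding a memory graph of observations and a set of human-feedback rules that filter proposed plan steps. All names below (ExperienceAdapter, observe, add_rule, the toy planner) are hypothetical illustrations under these assumptions, not the paper's actual interface.

```python
# Minimal sketch of an external experience adapter around a frozen planner.
# Hypothetical interface; the paper's implementation may differ.
from typing import Callable, List, Set, Tuple


class ExperienceAdapter:
    """Wraps a frozen PLM planner: no re-training or fine-tuning.

    Experience lives outside the model as (1) a memory graph of observed
    relations and (2) human-feedback rules that veto plan steps.
    """

    def __init__(self, plan_fn: Callable[[str, str], List[str]]):
        self.plan_fn = plan_fn                                 # frozen PLM planner
        self.memory_graph: Set[Tuple[str, str, str]] = set()   # (subj, rel, obj) edges
        self.rules: List[Callable[[str], bool]] = []           # step -> allowed?

    def observe(self, subj: str, rel: str, obj: str) -> None:
        """Record an environment observation as a memory-graph edge."""
        self.memory_graph.add((subj, rel, obj))

    def add_rule(self, rule: Callable[[str], bool]) -> None:
        """Record human feedback as a rule over candidate plan steps."""
        self.rules.append(rule)

    def plan(self, task: str) -> List[str]:
        """Plan with the frozen PLM, conditioned on external experience."""
        # Serialize the memory graph into the planner's prompt context.
        context = "; ".join(f"{s} {r} {o}" for s, r, o in sorted(self.memory_graph))
        steps = self.plan_fn(task, context)
        # Drop any step that violates a human-feedback rule.
        return [step for step in steps if all(rule(step) for rule in self.rules)]


if __name__ == "__main__":
    # Stand-in for a PLM planner; a real system would query a language model.
    def toy_planner(task: str, context: str) -> List[str]:
        return ["walk to kitchen", "open fridge", "grab milk", "microwave milk"]

    adapter = ExperienceAdapter(toy_planner)
    adapter.observe("milk", "inside", "fridge")
    adapter.add_rule(lambda step: "microwave" not in step)  # human: never microwave milk
    print(adapter.plan("heat some milk"))  # the microwave step is filtered out
```

Because all experience lives in the adapter rather than in model weights, this kind of wrapper can sit in front of any planner exposed behind a plan-like function, which is what makes the approach non-invasive to serving pipelines.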
Experience Adapter: Adapting Pre-trained Language Models for Continual Task Planning
Lecture Notes in Computer Science
International Conference on Intelligent Robotics and Applications, Hangzhou, China, July 5–7, 2023
October 16, 2023
12 pages
Article/Chapter (Book)
Electronic resource
English
Task Planning, Pre-trained Language Models, Adapter, Computer Science, Artificial Intelligence, Software Engineering/Programming and Operating Systems, Computer Applications, User Interfaces and Human Computer Interaction, Computer Communication Networks, Special Purpose and Application-Based Systems
Europäisches Patentamt | 2020 | Aero-generator mounting adapter and adapting mounting system
Springer Verlag | 2025 | LLMSTP: Empowering Swarm Task Planning with Large Language Models