This study investigates economical scheduling of charging for an electric vehicle (EV) in a typical household equipped with an intelligent charging management system. The problem formulation considers rooftop solar power generation, time-varying domestic energy consumption, real-time electricity pricing, and user preferences. This task traditionally takes the form of a mixed-integer linear programming (MILP) problem, but we demonstrate its equivalence to linear programming (LP) to reduce computational complexity. The LP problem can be solved to global optimality if all future information is known, which is unrealistic in practice and must instead be approximated by forecasting. Learning-based methods such as deep reinforcement learning (DRL) eliminate the need for a forecaster and make online decisions rapidly using a learned policy. We propose an approach based on imitation learning that leverages the knowledge of an LP expert by learning from its optimal demonstrations rather than learning from scratch as in DRL. Our approach trains a deep neural network (DNN) based policy efficiently in a supervised manner and incorporates a safety post-processing mechanism that enforces strict constraint satisfaction. Numerical studies on real-world data show that the proposed approach achieves a $23\sim220\times$ speedup over DRL in DNN training, and its total electricity cost is far lower than that of DRL, coming strikingly close to the theoretical lower bound. Our implementation code can be found at https://github.com/ZhenhaoH/IL_EVCS.
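The pipeline summarized in the abstract can be illustrated with a minimal sketch: an LP expert computes cost-optimal schedules assuming full future knowledge, a DNN policy is trained by supervised imitation of those demonstrations, and a safety step projects raw outputs onto the feasible set. The horizon, power and energy limits, the use of scipy's linprog as the expert, the small PyTorch network, and the clip-and-rescale safety heuristic below are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: LP expert demonstrations -> supervised (imitation)
# training of a DNN policy -> simple safety post-processing at execution time.
# All constants, dimensions, and the toy data are assumptions for illustration.
import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import linprog

T = 24            # scheduling horizon (hours), assumed
P_MAX = 7.0       # charger power limit (kW), assumed
E_REQ = 30.0      # energy the EV must receive by departure (kWh), assumed

def lp_expert(prices):
    """Cost-optimal schedule given (hypothetically known) future prices:
    minimize prices^T p  s.t.  0 <= p_t <= P_MAX,  sum_t p_t = E_REQ."""
    res = linprog(c=prices,
                  A_eq=np.ones((1, T)), b_eq=[E_REQ],
                  bounds=[(0.0, P_MAX)] * T, method="highs")
    return res.x

# Collect expert demonstrations on random price scenarios.
rng = np.random.default_rng(0)
prices_all = rng.uniform(0.1, 0.5, size=(512, T))       # $/kWh, toy data
demos = np.stack([lp_expert(p) for p in prices_all])    # expert schedules

# Behavior cloning: supervised regression onto the expert actions.
policy = nn.Sequential(nn.Linear(T, 128), nn.ReLU(),
                       nn.Linear(128, 128), nn.ReLU(),
                       nn.Linear(128, T))
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
X = torch.tensor(prices_all, dtype=torch.float32)
Y = torch.tensor(demos, dtype=torch.float32)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(policy(X), Y)
    loss.backward()
    opt.step()

def safe_schedule(raw):
    """Crude stand-in for the paper's safety post-processing: clip to the
    power limits and rescale toward the required energy."""
    p = np.clip(raw, 0.0, P_MAX)
    p *= E_REQ / max(p.sum(), 1e-6)
    return np.clip(p, 0.0, P_MAX)

test_prices = rng.uniform(0.1, 0.5, size=T)
with torch.no_grad():
    raw = policy(torch.tensor(test_prices, dtype=torch.float32)).numpy()
print("policy cost :", test_prices @ safe_schedule(raw))
print("expert cost :", test_prices @ lp_expert(test_prices))
```

In this toy setup the state is only the price vector; the paper's formulation additionally conditions on solar generation, household load, and user preferences, and uses a stricter constraint-enforcement mechanism than the rescaling heuristic shown here.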
Economical Electric Vehicle Charging Scheduling via Deep Imitation Learning
IEEE Transactions on Intelligent Transportation Systems; 25, 11; 18196-18210
01.11.2024
2825292 bytes
Article (Journal)
Electronic resource
English
Economical charging and discharging method for storage battery of electric vehicle
Europäisches Patentamt | 2021