Although Q-learning-based supervisory control is a form of adaptive optimal control, its adaptability for hybrid electric vehicle (HEV) energy management has rarely been studied. In real-world driving, conditions such as vehicle load, road conditions, and traffic conditions may vary. If the vehicle supervisory control does not adapt to these changes, the resulting fuel economy may be suboptimal. To the best of our knowledge, this study is the first to investigate the adaptability of Q-learning-based supervisory control for HEVs. A comprehensive analysis is presented to interpret this adaptability under three varying factors: driving cycle, vehicle load condition, and road grade. A parallel HEV architecture is considered, and Q-learning is used as the reinforcement learning algorithm to control the torque split between the engine and the electric motor. Model predictive control, the equivalent consumption minimization strategy, and a thermostatic control strategy are implemented for comparison. The Q-learning-based supervisory control shows strong adaptability under different conditions and achieves the best fuel economy of the four supervisory controls under all three varying conditions.
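For illustration only (this code is not from the paper), the torque-split decision described in the abstract can be sketched as tabular Q-learning: the state might combine battery state of charge (SOC) and driver torque demand, the action a discrete engine share of that demand, and the reward a fuel cost with a low-SOC penalty. The following is a minimal sketch under assumed, hypothetical models and discretizations; all names and parameters are illustrative.

```python
import numpy as np

# Illustrative tabular Q-learning for HEV torque split (hypothetical models).
# State: (SOC bin, torque-demand bin); action: engine share of demanded torque.
N_SOC, N_DEM, N_ACT = 10, 10, 5          # assumed discretization
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1       # learning rate, discount, exploration
ENGINE_SHARE = np.linspace(0.0, 1.0, N_ACT)

Q = np.zeros((N_SOC, N_DEM, N_ACT))
rng = np.random.default_rng(0)

def step(soc, demand, share):
    """Toy plant: the engine burns fuel for its share; the motor drains the battery."""
    fuel = share * demand * 0.01                   # assumed fuel-rate model
    soc = np.clip(soc - (1.0 - share) * demand * 0.001, 0.0, 1.0)
    reward = -fuel - 0.5 * max(0.3 - soc, 0.0)     # fuel cost plus low-SOC penalty
    return soc, reward

def bins(soc, demand):
    """Map continuous SOC and demand to discrete state indices."""
    return (min(int(soc * N_SOC), N_SOC - 1),
            min(int(demand / 100.0 * N_DEM), N_DEM - 1))

soc, demand = 0.6, 50.0
for t in range(20000):                             # random demand trace for training
    s = bins(soc, demand)
    # Epsilon-greedy choice over discrete engine-share levels.
    a = rng.integers(N_ACT) if rng.random() < EPS else int(np.argmax(Q[s]))
    soc, r = step(soc, demand, ENGINE_SHARE[a])
    demand = rng.uniform(0.0, 100.0)               # next demanded torque (arbitrary units)
    s_next = bins(soc, demand)
    # One-step Q-learning update.
    Q[s][a] += ALPHA * (r + GAMMA * np.max(Q[s_next]) - Q[s][a])

print("Greedy engine share at SOC 0.6, mid demand:",
      ENGINE_SHARE[int(np.argmax(Q[bins(0.6, 50.0)]))])
```

The tabular form is chosen here only for brevity; the adaptability question the paper studies concerns how such a learned policy responds when driving cycle, load, or grade shifts the state distribution.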
Q-Learning-Based Supervisory Control Adaptability Investigation for Hybrid Electric Vehicles
IEEE Transactions on Intelligent Transportation Systems; Vol. 23, No. 7; pp. 6797-6806
2022-07-01
3285749 bytes
Article (Journal)
Electronic Resource
English
Similar items:
Supervisory robust control of hybrid electric vehicles
Tema Archive | 2003
Model Based Optimization of Supervisory Control Parameters for Hybrid Electric Vehicles
SAE Technical Papers | 2008
Model Based Optimization of Supervisory Control Parameters for Hybrid Electric Vehicles
British Library Conference Proceedings | 2008
A Novel Supervisory Control and Analysis Approach for Hybrid Electric Vehicles
British Library Conference Proceedings | 2020