Volume 03: Proceedings of the 11th International Conference on Applied Energy, Part 2, Sweden, 2019

Vision-Aided Deep Reinforcement Learning for Energy Management of Hybrid Electric Vehicles

Yong Wang, Yuankai Wu, Jiankun Peng, Huachun Tan, Dechong Zeng, Hongwen He


This paper introduces an energy management strategy that combines visual perception with a deep reinforcement learning (DRL) algorithm to minimize fuel consumption. The proposed method autonomously learns the optimal control policy without any prediction effort. A monocular camera mounted behind the windshield captures visual information as input, and a state-of-the-art convolutional neural network (CNN)-based object detection method detects and classifies traffic lights. The detected traffic light information is then used as a state input for a model-free DRL-based energy management system with continuous control actions, thereby incorporating traffic light information directly into the energy management system. Experimental results on a real-world driving cycle indicate that the fuel economy of the proposed vision-aided strategy reaches 94.5% of that of the dynamic programming-based benchmark and is 6.8% better than that of the original DRL algorithm without traffic light information.
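The abstract describes augmenting the DRL agent's state with detected traffic light information and letting an actor network output a continuous control action. The sketch below illustrates this idea under stated assumptions: the actual state variables, normalization constants, and network architecture are not given in the abstract, so all names here (`build_state`, the feature bounds, the linear `Actor` stand-in) are hypothetical.

```python
import numpy as np

def build_state(velocity, acceleration, soc, tl_phase, tl_distance):
    """Augment the vehicle state with detected traffic-light information.

    tl_phase:    0 = red, 1 = green (assumed output of the CNN detector)
    tl_distance: metres to the next light (normalized and clipped below)
    All normalization constants are illustrative assumptions.
    """
    return np.array([
        velocity / 30.0,                 # assumed max speed of 30 m/s
        acceleration / 3.0,              # assumed max |accel| of 3 m/s^2
        soc,                             # battery state of charge in [0, 1]
        float(tl_phase),                 # traffic-light phase flag
        min(tl_distance / 200.0, 1.0),   # clip beyond a 200 m horizon
    ], dtype=np.float32)

class Actor:
    """Tiny linear policy standing in for the DRL actor network;
    it outputs a continuous engine-power split in (0, 1)."""
    def __init__(self, state_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.1, size=state_dim)

    def act(self, state):
        # Sigmoid squashes the linear score into (0, 1):
        # the fraction of demanded power supplied by the engine.
        return 1.0 / (1.0 + np.exp(-self.w @ state))

# Example: vehicle approaching a red light 80 m ahead.
state = build_state(velocity=12.0, acceleration=0.5, soc=0.6,
                    tl_phase=0, tl_distance=80.0)
actor = Actor(state_dim=state.shape[0])
engine_split = actor.act(state)
```

In the paper's setting the linear stand-in would be replaced by a deep actor network trained with a continuous-action DRL algorithm, but the state-augmentation pattern, appending the detector's traffic light outputs to the conventional vehicle state, is the same.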

Keywords: hybrid electric vehicle, energy management strategy, visual perception, deep reinforcement learning, traffic light

Copyright © Energy Proceedings