Abstract
As more regions turn to renewable energy sources to meet the growing energy demand of buildings, the intermittent generation of these sources presents a significant challenge: they often fail to provide sufficient energy during peak consumption periods. Vehicle-to-Building (V2B) systems, serving as flexible energy storage within buildings, can mitigate these intermittency issues. This work applies deep reinforcement learning (DRL) to control this complex building energy management system. Our algorithm leverages historical photovoltaic data, building energy consumption profiles, and the State of Charge (SOC) as well as the entry and exit times of electric vehicles (EVs). It sets the charging and discharging power of each EV in real time to optimize energy usage and manage the unpredictability of renewable generation. We integrated the reinforcement learning framework with a city-level energy simulation platform and conducted case studies on various urban forms in Shenzhen, China. A series of experiments demonstrates the effectiveness and practicality of our approach in peak shaving and load leveling compared with Model Predictive Control (MPC) methods.
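The abstract describes the agent's inputs (photovoltaic output, building load, EV SOC and entry/exit times) and its output (per-EV charging/discharging power). The following is a minimal illustrative sketch, not the authors' implementation, of how such an observation and a physically feasible action clipping could be structured; all names (EVState, BuildingV2BState, clip_action) and the numeric limits (7 kW charger, 60 kWh battery) are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EVState:
    soc: float           # state of charge, fraction in [0, 1]
    arrival_hour: int    # hour the EV plugged in
    departure_hour: int  # hour the EV is expected to leave

@dataclass
class BuildingV2BState:
    pv_generation_kw: float   # current photovoltaic output
    building_load_kw: float   # current building consumption
    hour_of_day: int
    evs: List[EVState]        # connected electric vehicles

def clip_action(power_kw: float, ev: EVState,
                p_max_kw: float = 7.0, battery_kwh: float = 60.0,
                dt_hours: float = 1.0) -> float:
    """Clip a proposed (dis)charging power so the EV battery stays within [0, 1] SOC.

    Positive power = charging, negative = discharging (V2B export).
    """
    # Charger power limits.
    power_kw = max(-p_max_kw, min(p_max_kw, power_kw))
    # Energy headroom (charging) and available energy (discharging) over one step.
    max_charge_kw = (1.0 - ev.soc) * battery_kwh / dt_hours
    max_discharge_kw = ev.soc * battery_kwh / dt_hours
    return max(-max_discharge_kw, min(max_charge_kw, power_kw))

if __name__ == "__main__":
    ev = EVState(soc=0.9, arrival_hour=9, departure_hour=17)
    # An agent proposing 7 kW charging is clipped to the remaining headroom.
    print(clip_action(7.0, ev))  # 6.0 kW with the assumed 60 kWh battery
```

In a full DRL setup, a state of this form would be flattened into the policy network's observation vector and the clipped powers would be applied at each simulation step.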
Keywords: deep reinforcement learning, peak shaving, vehicle-to-building, energy management system