Robust Energy Management Policies for Solar Microgrids via Reinforcement Learning
Abstract
1. Introduction
1.1. Literature Review
1.1.1. Common Optimization Objectives
1.1.2. Microgrid Modeling Methods
1.1.3. Energy Management Policy Solution Strategies
1.1.4. Intellectual Contribution
2. Methodology
2.1. Modeling the Microgrid Environment
Stochastic PV Output and Energy Demand Modeling
2.2. EMP as a Markov Decision Process
2.3. A2C and PPO Reinforcement Learning Optimal EMS
2.3.1. EMP as an RL Formulation
2.3.2. Training and Tuning the RL Networks
2.4. Baseline Comparison Methods
Algorithm 1: Demand Greedy EMP Algorithm
3. Problem Description and Formulation
3.1. Analytic Methods
3.2. Model Configuration
4. Results Discussion
4.1. Grid Connected Cost Coverage and Demand Coverage
4.2. Island Mode Performance
4.3. Intermittent Disruptions
4.4. Analysis and Comparison of RL EMS Policies
5. Conclusions, Limitations, and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Description
---|---
DN | energy demand nodes or facilities
DG | distributed generators
ESS | energy storage system
EMP | energy management problem
EMA | energy management agent
EMS | energy management strategy
PV | photovoltaic generator
PVDG | photovoltaic distributed generator
PVDG/ESS | co-located PVDG and ESS
MG | microgrid
DGMG | distributed generation microgrid
ODGSP | optimal distributed generation sizing and placement
RPF | reverse power flow
RES | renewable energy source
References
- Evans, A.; Strezov, V.; Evans, T.J. Assessment of utility energy storage options for increased renewable energy penetration. Renew. Sustain. Energy Rev. 2012, 16, 4141–4147.
- Mardani, A.; Zavadskas, E.K.; Khalifah, Z.; Zakuan, N.; Jusoh, A.; Nor, K.M.; Khoshnoudi, M. A review of multi-criteria decision-making applications to solve energy management problems: Two decades from 1995 to 2015. Renew. Sustain. Energy Rev. 2017, 71, 216–256.
- Das, C.K.; Bass, O.; Kothapalli, G.; Mahmoud, T.S.; Habibi, D. Overview of energy storage systems in distribution networks: Placement, sizing, operation, and power quality. Renew. Sustain. Energy Rev. 2018, 91, 1205–1230.
- NREL. Distributed Solar PV for Electricity System Resiliency: Policy and Regulatory Considerations. 2014. Available online: https://www.nrel.gov (accessed on 1 March 2024).
- Kenward, A.; Raja, U. Blackout: Extreme Weather, Climate Change and Power Outages. Clim. Cent. 2014, 10, 1–23.
- Stenstadvolden, A.; Hansen, L.; Zhao, L.; Kapourchali, M.H.; Lee, W.J. Demand and Sustainability Analysis for a Level-3 Charging Station on the U.S. Highway Based on Actual Smart Meter Data. IEEE Trans. Ind. Appl. 2024, 60, 1310–1321.
- Lee, S.; Choi, D.H. Reinforcement Learning-Based Energy Management of Smart Home with Rooftop Solar Photovoltaic System, Energy Storage System, and Home Appliances. Sensors 2019, 19, 3937.
- Hosseini, S.M.; Carli, R.; Dotoli, M. Robust Optimal Energy Management of a Residential Microgrid Under Uncertainties on Demand and Renewable Power Generation. IEEE Trans. Autom. Sci. Eng. 2021, 18, 618–637.
- Ali, S.; Khan, I.; Jan, S.; Hafeez, G. An Optimization Based Power Usage Scheduling Strategy Using Photovoltaic-Battery System for Demand-Side Management in Smart Grid. Energies 2021, 14, 2201.
- He, X.; Ge, S.; Liu, H.; Xu, Z.; Mi, Y.; Wang, C. Frequency regulation of multi-microgrid with shared energy storage based on deep reinforcement learning. Electr. Power Syst. Res. 2023, 214, 108962.
- Gilani, M.A.; Kazemi, A.; Ghasemi, M. Distribution system resilience enhancement by microgrid formation considering distributed energy resources. Energy 2020, 191, 116442.
- Kuznetsova, E.; Li, Y.F.; Ruiz, C.; Zio, E.; Ault, G.; Bell, K. Reinforcement learning for microgrid energy management. Energy 2013, 59, 133–146.
- Alhasnawi, B.N.; Jasim, B.H.; Sedhom, B.E.; Hossain, E.; Guerrero, J.M. A New Decentralized Control Strategy of Microgrids in the Internet of Energy Paradigm. Energies 2021, 14, 2183.
- Cao, J.; Harrold, D.; Fan, Z.; Morstyn, T.; Healey, D.; Li, K. Deep Reinforcement Learning-Based Energy Storage Arbitrage with Accurate Lithium-Ion Battery Degradation Model. IEEE Trans. Smart Grid 2020, 11, 4513–4521.
- Lan, T.; Jermsittiparsert, K.T.; Alrashood, S.; Rezaei, M.; Al-Ghussain, L.A.; Mohamed, M. An Advanced Machine Learning Based Energy Management of Renewable Microgrids Considering Hybrid Electric Vehicles’ Charging Demand. Energies 2021, 14, 569.
- Suanpang, P.; Jamjuntr, P.; Jermsittiparsert, K.; Kaewyong, P. Autonomous Energy Management by Applying Deep Q-Learning to Enhance Sustainability in Smart Tourism Cities. Energies 2022, 15, 1906.
- Quynh, N.V.; Ali, Z.M.; Alhaider, M.M.; Rezvani, A.; Suzuki, K. Optimal energy management strategy for a renewable-based microgrid considering sizing of battery energy storage with control policies. Int. J. Energy Res. 2021, 45, 5766–5780.
- Li, H.; Eseye, A.T.; Zhang, J.; Zheng, D. Optimal energy management for industrial microgrids with high-penetration renewables. Prot. Control Mod. Power Syst. 2017, 2, 1–14.
- Eseye, A.T.; Zheng, D.; Zhang, J.; Wei, D. Optimal energy management strategy for an isolated industrial microgrid using a Modified Particle Swarm Optimization. In Proceedings of the 2016 IEEE International Conference on Power and Renewable Energy (ICPRE), Shanghai, China, 21–23 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 494–498.
- Contreras, J.; Losi, A.; Russo, M.; Wu, F. Simulation and evaluation of optimization problem solutions in distributed energy management systems. IEEE Trans. Power Syst. 2002, 17, 57–62.
- Tenfen, D.; Finardi, E.C. A mixed integer linear programming model for the energy management problem of microgrids. Electr. Power Syst. Res. 2015, 122, 19–28.
- Tabar, V.S.; Abbasi, V. Energy management in microgrid with considering high penetration of renewable resources and surplus power generation problem. Energy 2019, 189, 116264.
- Singh, U.; Rizwan, M.; Alaraj, M.; Alsaidan, I. A Machine Learning-Based Gradient Boosting Regression Approach for Wind Power Production Forecasting: A Step towards Smart Grid Environments. Energies 2021, 14, 5196.
- Zhang, B.; Hu, W.; Xu, X.; Li, T.; Zhang, Z.; Chen, Z. Physical-model-free intelligent energy management for a grid-connected hybrid wind-microturbine-PV-EV energy system via deep reinforcement learning approach. Renew. Energy 2022, 200, 433–448.
- Gazafroudi, A.S.; Shafie-khah, M.; Heydarian-Forushani, E.; Hajizadeh, A.; Heidari, A.; Corchado, J.M.; Catalão, J.P. Two-stage stochastic model for the price-based domestic energy management problem. Int. J. Electr. Power Energy Syst. 2019, 112, 404–416.
- Guo, C.; Wang, X.; Zheng, Y.; Zhang, F. Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning. Energy 2022, 238, 121873.
- Khawaja, Y.; Qiqieh, I.; Alzubi, J.; Alzubi, O.; Allahham, A.; Giaouris, D. Design of cost-based sizing and energy management framework for standalone microgrid using reinforcement learning. Sol. Energy 2023, 251, 249–260.
- Jiao, F.; Zou, Y.; Zhou, Y.; Zhang, Y.; Zhang, X. Energy management for regional microgrids considering energy transmission of electric vehicles between microgrids. Energy 2023, 283, 128410.
- Lai, B.C.; Chiu, W.Y.; Tsai, Y.P. Multiagent Reinforcement Learning for Community Energy Management to Mitigate Peak Rebounds Under Renewable Energy Uncertainty. IEEE Trans. Emerg. Top. Comput. Intell. 2022, 6, 568–579.
- NREL (National Renewable Energy Lab). Regional PV 4-Minute Output Data. 2021. Available online: https://www.nrel.gov/grid/solar-power-data.html (accessed on 1 March 2024).
- NREL (National Renewable Energy Lab). Commercial and Residential Hourly Load Profiles for All TMY3 Locations in the United States. 2021. Available online: https://catalog.data.gov/dataset/commercial-and-residential-hourly-load-profiles-for-all-tmy3-locations-in-the-united-state-bbc75#sec-dates (accessed on 1 March 2024).
Sets | Description
---|---
 | set of facilities to be energized
 | distributed generation unit
 | set of potential actions
 | states representing ESS charge level {0 (empty), ..., N (full)}
 | set of months the EMS policy will be evaluated
 | time units representing the hour of a day
Variables | Description
---|---
 | expected PV output at time h based on month m
 | ESS output at time h based on action a
 | number of consecutive discharges with no charge at h
 | demand for facility f over hour h
 | load demand over hour h
 | day of month
 | power supply over hour h
 | difference between energy demanded and energy supplied
 | cost or profit for energy bought/sold at time h
 | energy cost/burden at time h
Parameter | Description
---|---
 | peak demand energy price (USD/kWh)
 | ESS nameplate power rating (kWh)
 | maximum consecutive discharges the ESS can provide from a full charge
 | PV nameplate power rating (kW)
 | penalty for exceeding the maximum consecutive discharges
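The nomenclature above maps onto the MDP formulation of Section 2.2: the discrete ESS charge level and the hour of the day form part of the state, the charge/discharge decision is the action, and the hourly energy cost or profit (plus the over-discharge penalty) drives the reward. The following is a minimal illustrative sketch of one such simulation step, not the authors' implementation; all names and numeric values (`step`, `ESS_POWER_KW`, the penalty magnitude, and so on) are assumptions introduced here.

```python
# Illustrative sketch of one hourly transition of the microgrid EMP environment.
# Constants are placeholder values, not the paper's configuration.
N_ESS_LEVELS = 10      # ESS charge states {0 (empty), ..., N (full)}
ESS_POWER_KW = 50.0    # energy moved per charge/discharge step
MAX_DISCHARGES = 4     # max consecutive discharges from a full charge
PENALTY = 10.0         # penalty for exceeding the discharge limit
PEAK_PRICE = 0.25      # peak demand energy price (USD/kWh)


def step(ess_level, consec_discharges, action, pv_kwh, demand_kwh, price=PEAK_PRICE):
    """Apply one hour of operation: ESS action, then settle the balance with the grid."""
    ess_kwh = 0.0
    if action == "discharge" and ess_level > 0:
        ess_kwh = ESS_POWER_KW            # ESS supplies energy this hour
        ess_level -= 1
        consec_discharges += 1
    elif action == "charge" and ess_level < N_ESS_LEVELS:
        ess_kwh = -ESS_POWER_KW           # charging adds to the load
        ess_level += 1
        consec_discharges = 0
    else:                                 # idle, or action infeasible in this state
        consec_discharges = 0

    shortfall = demand_kwh - (pv_kwh + ess_kwh)  # >0: buy from grid, <0: sell surplus
    cost = shortfall * price                     # hourly energy cost (profit if negative)
    if consec_discharges > MAX_DISCHARGES:
        cost += PENALTY                          # discourage over-cycling the ESS
    reward = -cost                               # an RL agent maximizes negative cost
    return ess_level, consec_discharges, reward
```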
Method | Cost Coverage | Demand Coverage |
---|---|---|
RNG | 44% | 47% |
Greedy | 59% | 53% |
PPO-G | 86% | 61% |
A2C-G | 83% | 62% |
PPO-D | 69% | 53% |
A2C-D | 51% | 52% |
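For context on the Greedy row above (the demand-greedy baseline of Algorithm 1 in Section 2.4), a rule of that kind might look like the sketch below. This is an assumed reconstruction for illustration only, since the paper's pseudocode is not reproduced here; it reuses the action labels from the earlier `step` sketch.

```python
def greedy_action(ess_level, pv_kwh, demand_kwh, n_levels=10):
    """Assumed demand-greedy rule: discharge whenever PV alone cannot cover
    the hour's demand, charge on PV surplus, otherwise stay idle."""
    if pv_kwh < demand_kwh and ess_level > 0:
        return "discharge"
    if pv_kwh > demand_kwh and ess_level < n_levels:
        return "charge"
    return "idle"
```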
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).