Electronics
  • Article
  • Open Access

23 December 2025

A Hierarchical Predictive-Adaptive Control Framework for State-of-Charge Balancing in Mini-Grids Using Deep Reinforcement Learning

1 Department of Computer Science and Engineering, European University of Cyprus, Engomi, Nicosia 2404, Cyprus
2 Department of Computer Science, University of Cyprus, Nicosia 1678, Cyprus
3 CYENS Centre of Excellence, Nicosia 1016, Cyprus
4 Graduate School of Advanced Science and Technology, Japan Advanced Institute of Science and Technology, Nomi 923-1292, Ishikawa, Japan
Electronics 2026, 15(1), 61; https://doi.org/10.3390/electronics15010061
This article belongs to the Special Issue Smart Power System Optimization, Operation, and Control

Simple Summary

Mini-grids with multiple battery energy storage systems need intelligent control to keep all batteries at similar states of charge, protect their lifetime, and operate economically. This paper proposes a hierarchical framework that combines a federated Transformer forecasting model with a Soft Actor-Critic reinforcement learning agent to achieve predictive, adaptive, and scalable state-of-charge balancing under high renewable penetration.

Abstract

State-of-charge (SoC) balancing across multiple battery energy storage systems (BESS) is a central challenge in renewable-rich mini-grids. Heterogeneous battery capacities, differing states of health, stochastic renewable generation, and variable loads create a high-dimensional, uncertain control problem. Conventional droop-based SoC balancing strategies are decentralized and computationally light but fundamentally reactive and myopic, whereas model predictive control (MPC) offers predictive capability but is computationally intensive and prone to modeling errors. This paper proposes a Hierarchical Predictive–Adaptive Control (HPAC) framework for SoC balancing in mini-grids using deep reinforcement learning. The framework consists of two synergistic layers operating on different time scales. A long-horizon Predictive Engine, implemented as a federated Transformer network, provides multi-horizon probabilistic forecasts of net load, enabling multiple mini-grids to collaboratively train a high-capacity model without sharing raw data. A fast-timescale Adaptive Controller, implemented as a Soft Actor-Critic (SAC) agent, uses these forecasts to make real-time charge/discharge decisions for each BESS unit. The forecasts are used both to augment the agent’s state representation and to dynamically shape a multi-objective reward function that balances SoC, economic performance, degradation-aware operation, and voltage stability. The paper formulates SoC balancing as a Markov decision process, details the SAC-based control architecture, and presents a comprehensive evaluation in a MATLAB (R2025a)-based digital-twin simulation environment. A rigorous benchmarking study compares HPAC against fourteen representative controllers spanning rule-based, MPC, and DRL paradigms. A sensitivity analysis of reward-weight selection and ablation studies isolating the contributions of forecasting and dynamic reward shaping are also presented. Stress-test scenarios, including high-volatility net-load conditions and communication impairments, demonstrate the robustness of the approach. Results show that HPAC achieves near-minimal operating cost with essentially zero SoC variance and the lowest voltage variance among all compared controllers, while maintaining moderate energy throughput that implicitly preserves battery lifetime. Finally, the paper discusses a pathway from simulation to hardware-in-the-loop testing and a cloud-edge deployment architecture for practical, real-time operation in real-world mini-grids.
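
The following minimal sketch (not taken from the paper) illustrates the interface the abstract describes: multi-horizon net-load forecasts augmenting the SAC agent’s observation and dynamically shaping a multi-objective reward. All function names, variable names, weights, and the volatility-based shaping rule are illustrative assumptions, not the authors’ implementation.

```python
# Minimal illustrative sketch, not the authors' implementation: shows how
# multi-horizon net-load forecasts could (a) augment a SAC agent's observation
# and (b) dynamically shape a multi-objective reward. All names, weights, and
# the volatility-based shaping rule below are assumptions for illustration.
import numpy as np

def build_observation(soc, net_load_forecast, voltage_pu, price):
    """Concatenate per-unit SoC, the forecast horizon, bus voltages, and price."""
    return np.concatenate([soc, net_load_forecast, voltage_pu, [price]])

def shaped_reward(soc, power_kw, voltage_pu, price, net_load_forecast,
                  base_weights=(1.0, 0.1, 0.05, 1.0)):
    """Weighted penalty on SoC imbalance, energy cost, throughput, and voltage deviation."""
    w_soc, w_cost, w_deg, w_volt = base_weights
    # Hypothetical dynamic shaping: weight SoC balance more heavily when the
    # forecast horizon looks volatile.
    volatility = np.std(net_load_forecast)
    w_soc *= 1.0 + volatility / (np.mean(np.abs(net_load_forecast)) + 1e-6)

    soc_imbalance = np.var(soc)                              # drive SoC variance toward zero
    grid_import = max(net_load_forecast[0] - np.sum(power_kw), 0.0)
    cost = price * grid_import                               # cost of net load not covered by BESS
    throughput = np.sum(np.abs(power_kw))                    # simple degradation proxy
    voltage_dev = np.sum((voltage_pu - 1.0) ** 2)            # deviation from 1.0 p.u.

    return -(w_soc * soc_imbalance + w_cost * cost
             + w_deg * throughput + w_volt * voltage_dev)

# Toy example: three heterogeneous BESS units, a 4-step net-load forecast (kW),
# discharge counted as positive power (sign convention assumed).
soc = np.array([0.52, 0.48, 0.61])
power_kw = np.array([5.0, -3.0, 2.0])
voltage_pu = np.array([0.99, 1.01, 1.00])
forecast = np.array([12.0, 15.0, 9.0, 20.0])
obs = build_observation(soc, forecast, voltage_pu, price=0.18)
print(obs.shape, round(shaped_reward(soc, power_kw, voltage_pu, 0.18, forecast), 4))
```

In the HPAC framework these signals would feed the SAC actor-critic pair; the sketch only illustrates the observation/reward interface, not the training loop or the federated Transformer forecaster.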
