Review

Artificial Intelligence and Digital Twin Technologies for Intelligent Lithium-Ion Battery Management Systems: A Comprehensive Review of State Estimation, Lifecycle Optimization, and Cloud-Edge Integration

1 Department of Chemical Engineering, University of Waterloo, Waterloo, ON N2L 3G1, Canada
2 Intelligent Robotic and Energy Systems (IRES), Department of Electronics, Carleton University, Ottawa, ON K1S 5B6, Canada
3 School of Engineering, Swinburne University of Technology, Melbourne, VIC 3122, Australia
4 Institute for Superconducting & Electronic Materials (ISEM), Australian Institute for Innovative Materials (AIIM), University of Wollongong, Wollongong, NSW 2500, Australia
5 Institute of Energy Materials Science, University of Shanghai for Science and Technology, Shanghai 200093, China
6 China Coal Research Institute, CCTEG, Beijing 100013, China
* Author to whom correspondence should be addressed.
Batteries 2025, 11(8), 298; https://doi.org/10.3390/batteries11080298
Submission received: 11 June 2025 / Revised: 21 July 2025 / Accepted: 28 July 2025 / Published: 5 August 2025

Abstract

The rapid growth of electric vehicles (EVs) and new energy systems has placed lithium-ion batteries at the center of the clean energy transition. Nevertheless, achieving optimal battery performance, safety, and sustainability under widely varying operating conditions requires major innovations in Battery Management Systems (BMS). This review explores how artificial intelligence (AI) and digital twin (DT) technologies can be integrated to enable the intelligent BMS of the future. It examines how powerful data-driven approaches such as deep learning, ensemble methods, and physics-informed models improve the accuracy of state of charge (SOC), state of health (SOH), and remaining useful life (RUL) prediction. Additionally, the paper reviews progress in AI-enabled thermal management, fast charging, fault detection, and interpretable AI models. Coupling cloud and edge computing with DTs enables better diagnostics, predictive maintenance, and improved management across EV, stationary storage, and recycling applications. The review highlights recent advances in AI-driven materials research, sustainable battery manufacturing, and end-of-life planning, along with emerging challenges in cybersecurity, data integration, and large-scale deployment. Following careful analysis of up-to-date approaches and systems, we identify important research themes, open problems, and future directions. Uniting physical modeling with AI-based analytics on cloud-edge-DT platforms supports the development of resilient, intelligent, and environmentally responsible battery systems aligned with future mobility and the wider adoption of renewable energy.

1. Introduction

The electrification of transportation has emerged as a pivotal strategy in the global effort to reduce greenhouse gas emissions, accelerate the transition to sustainable energy systems, and achieve ambitious climate goals [1,2]. At the heart of this transformation lie lithium-ion batteries, which power the majority of electric vehicles (EVs) owing to their high energy density, favorable cycle-life characteristics, and efficiency. However, the growing complexity of battery systems has introduced significant challenges across multiple domains, including safety, reliability, cost, and end-of-life sustainability. To ensure optimal battery performance and longevity under varying operational conditions, battery management systems (BMSs) play a central role by enabling real-time monitoring, control, and protection of battery packs. Despite their importance, conventional BMS implementations often rely on resource-constrained microcontrollers that limit the integration of advanced algorithms and real-time analytics [3,4].
In response to these limitations, recent research has advanced both model-based and data-driven techniques for BMS optimization. Equivalent circuit models (ECMs), electrochemical models, and hybrid frameworks have been widely employed for battery behavior representation and control, while machine learning approaches such as support vector regression, random forests, and deep learning have shown considerable promise in improving state estimation accuracy, particularly for SOC, state of health (SOH), and remaining useful life (RUL) prediction [5,6,7]. Hybrid approaches combining signal decomposition with optimized neural networks have achieved high prediction fidelity even under complex degradation dynamics [8]. Physics-informed neural networks (PINNs) have demonstrated the potential to unify mechanistic insights with data-driven learning for generalizable battery aging models [9]. Simultaneously, significant progress has been made in battery thermal management systems (BTMSs), where innovative combinations of phase change materials (PCMs), porous copper structures, and liquid cooling strategies have led to notable reductions in temperature gradients and enhanced energy density without compromising safety [10,11,12].
Fast charging, an essential enabler for the mass adoption of EVs, imposes additional thermal and electrochemical stress on batteries. AI-enabled charging strategies incorporating digital twins (DTs), reinforcement learning, and physics-based control architectures have emerged as viable solutions to mitigate degradation, improve adaptability, and enhance safety during high-rate charging. Among these, digital twin technology has gained particular momentum as a next-generation framework that integrates physical modeling, real-time sensing, internet-of-things (IoT) connectivity, and cloud/edge computing to create dynamic, bidirectional virtual representations of battery systems. DTs enable predictive diagnostics, adaptive control, and intelligent lifecycle management, with applications spanning battery design, manufacturing, repurposing, and recycling [13,14].
The integration of DTs with AI also opens new opportunities for sustainability and circular economy practices. Advanced data analytics and intelligent decision-making frameworks are now being employed to optimize recycling processes for ternary cathode materials such as nickel, cobalt, and manganese, addressing critical issues of resource scarcity and environmental impact [15]. Nevertheless, the full realization of intelligent and sustainable battery systems requires overcoming several persistent barriers, including cybersecurity concerns, the lack of standardization in DT architecture, limited dataset availability, and the need for interpretable, scalable AI models. Addressing these gaps is essential to developing future-proof BMS architectures that are resilient, adaptive, and capable of supporting the evolving demands of electrified transportation.

2. Battery Management Systems

A lithium-ion BMS monitors all critical signals, including cell voltage, current, temperature, and SOC, to guarantee safe, efficient, and reliable battery operation. Its core functions include fault protection, thermal management, cell balancing, and state-of-health monitoring. Modern BMSs increasingly use AI for prediction, digital twins for simulation, and cloud connectivity for remote, real-time analytics. Electric vehicles, e-bikes, grid storage, and portable electronics rely on these systems to keep batteries operating longer, more safely, and with better performance.
Figure 1 illustrates the multiple roles of a BMS and the technologies now being integrated into it. As highlighted in Figure 1a, BMSs are deployed across industries such as electric vehicles, telecommunications, aerospace, and robotics to maintain safety, improve efficiency, and establish reliability. Figure 1b shows how big data platforms, hybrid algorithms, and improved sensor technologies are enhancing battery monitoring and control in future BMSs. Taken together, these illustrations underscore the importance of the BMS in current energy and transportation systems and the trajectory toward data-driven, intelligent management.

3. AI Applications in Lithium-Ion Batteries

As Figure 2 illustrates, different kinds of data from lithium-ion batteries, such as voltage, current, temperature, and pressure, are analyzed with several methods to identify the battery's state. These methods reveal whether a cell is suffering from capacity fade, resistance growth, or material degradation. Such data become particularly valuable when used in AI-based battery systems: from these health indicators, machine learning models can learn a battery's aging pattern and forecast its future performance. AI-supported data analysis thus enables smart BMSs for electric vehicles, stationary storage, and other applications, so that issues are detected early, battery life is extended, and operation remains both safe and efficient.

3.1. AI-Powered Lithium-Ion Battery Management Systems

Artificial intelligence is playing an increasingly vital role in enhancing the functionality, safety, and longevity of lithium-ion batteries within EV applications. One of the most comprehensive assessments in this domain was conducted by Lipu et al. [16], who statistically analyzed 78 publications spanning from 2014 to 2023. Their work identified key research trends, leading contributors, and dominant algorithmic strategies, while outlining the strengths and limitations of various AI techniques applied in BMSs. Their analysis underscores the transformative potential of AI in boosting electric vehicle performance and battery lifecycle, thus aligning with broader goals of decarbonization and sustainable energy systems.
Yavas et al. [17] explored the capabilities of AI-based approaches for enhancing real-time monitoring of state of charge (SOC) and state of health (SOH), as well as for predicting faults and managing thermal events. Their findings demonstrate that AI-enabled BMS outperforms traditional systems by dynamically adapting charging protocols and reducing operational risks. Similarly, Pooyandeh et al. [18] proposed an AI-empowered digital twin framework that leverages a time-series generative adversarial network (TS-GAN) to synthesize accurate SOC profiles, showing improvements in energy efficiency, operational safety, and battery lifespan. Palanichamy et al. [19] introduced an AI-based BMS architecture incorporating X-ray computed tomography and electrochemical impedance spectroscopy to enable model-free predictions of SOC, SOH, and fault dynamics, enhancing fault detection sensitivity by 40% and reducing thermal risk.
Farman et al. [20] emphasized the importance of dynamic SOH and RUL estimation under real-world operating conditions. By integrating cloud connectivity and IoT-based sensing, their framework supports robust thermal management and fault diagnostics, promoting battery reliability and long-term sustainability. Ahwiadi and Wang [21] further advanced the prognostic capabilities of BMS through an AI-driven particle filter (AI-PF) algorithm that employs dynamic degeneracy detection and adaptive mutation to maintain particle diversity. Their method achieved up to a 33% reduction in root mean square error (RMSE) and RUL prediction errors as low as 4.94%, with improved computational efficiency suitable for real-time application.
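To make the particle-filtering workflow concrete, the minimal Python sketch below tracks normalized capacity with a generic bootstrap particle filter; the fade model, noise levels, and the jitter applied at resampling are illustrative placeholders and do not reproduce the AI-PF degeneracy detection or adaptive mutation scheme of [21].

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(capacity, fade_rate=1e-4, q=5e-4):
    # Simple exponential-fade state model with process noise (illustrative values).
    return capacity * (1.0 - fade_rate) + rng.normal(0.0, q, size=capacity.shape)

def likelihood(measured, particles, r=5e-3):
    # Gaussian measurement likelihood around each particle's capacity hypothesis.
    return np.exp(-0.5 * ((measured - particles) / r) ** 2)

n = 500
particles = rng.normal(1.0, 0.01, n)          # normalized-capacity hypotheses
weights = np.full(n, 1.0 / n)

for measured_capacity in [0.998, 0.996, 0.995, 0.993]:   # streaming measurements
    particles = propagate(particles)
    weights *= likelihood(measured_capacity, particles)
    weights /= weights.sum()

    # Degeneracy check: resample when the effective sample size collapses.
    # An AI-driven variant would also mutate particles here to preserve diversity.
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles = particles[idx] + rng.normal(0.0, 1e-4, n)  # small jitter
        weights = np.full(n, 1.0 / n)

    print(f"capacity estimate: {np.sum(weights * particles):.4f}")
```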
Devendra et al. [22] explored machine learning regressors such as XGBoost, LightGBM, and SVM, highlighting the predictive power of ensemble models under variable load profiles. Reinforcement learning strategies have also gained traction; Suanpang and Jamjuntr [23] implemented a Q-learning-based BMS optimization protocol that improved energy efficiency by 15% and extended battery lifespan by 20%.
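The protocol of [23] is not detailed here; the sketch below only illustrates the general tabular Q-learning pattern for charge-rate selection, using a toy environment in which the reward trades charging speed against a crude quadratic degradation penalty (all states, actions, and rewards are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(1)
n_soc_bins, actions = 10, [0.5, 1.0, 2.0]     # coarse SOC bins x candidate C-rates
Q = np.zeros((n_soc_bins, len(actions)))
alpha, gamma, eps = 0.1, 0.95, 0.1

def step(soc_bin, c_rate):
    # Toy environment: a higher C-rate charges faster but incurs a "degradation" penalty.
    next_bin = min(soc_bin + int(np.ceil(c_rate)), n_soc_bins - 1)
    reward = (next_bin - soc_bin) - 0.3 * c_rate ** 2
    done = next_bin == n_soc_bins - 1
    return next_bin, reward, done

for _ in range(2000):                          # training episodes
    s, done = 0, False
    while not done:
        a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, actions[a])
        Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
        s = s2

print("greedy C-rate per SOC bin:", [actions[i] for i in np.argmax(Q, axis=1)])
```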
In the broader context of energy systems, Razmjoo et al. [24] underscored the critical role of energy storage systems (ESS) in enabling large-scale renewable energy integration, highlighting the synergy between battery storage, pumped hydro, and AI optimization for grid flexibility and emission reduction. Their study emphasized that widespread ESS adoption hinges not only on technological innovation but also on supportive regulatory frameworks, strategic financial incentives, and collaborative policy actions.
Miraftabzadeh et al. [25] reviewed hybrid AI models, including artificial neural networks (ANNs), fuzzy logic, and optimization algorithms that deliver high-accuracy estimations of battery health and power quality. Deep learning and reinforcement learning were identified as underutilized yet promising approaches for advancing adaptive control and facilitating renewable integration within electric mobility ecosystems.
Badran and Toha [26] provided a broader perspective by analyzing supervised machine learning techniques across classification and regression tasks in BMS. They emphasized the increasing relevance of IoT, cloud computing, and edge control modules for real-time system optimization, while pointing to persistent challenges such as inadequate modeling of environmental influences and the need for balanced cloud-onboard computation strategies.
Finally, Acharya et al. [27] examined how AI and machine learning are transforming the battery value chain from material discovery to end-of-life recycling. They illustrated how AI tools are accelerating the identification of high-energy-density electrode materials and electrolyte formulations, while enabling sophisticated SOH analysis and imaging-based diagnostics. Furthermore, their work explored AI-driven automation in manufacturing and blockchain-enabled traceability, contributing to enhanced sustainability in battery recycling.

3.2. Artificial Intelligence for Lithium-Ion Battery Optimization

Artificial intelligence has emerged as a pivotal tool in optimizing lithium-ion battery performance, with diverse applications spanning state estimation, material discovery, system control, and real-time prediction. Among recent developments, Oyucu et al. [28] demonstrated the efficacy of machine learning models, particularly LightGBM, in predicting discharge capacity with exceptional precision (MAE: 0.103, MSE: 0.019, R2: 0.887). Their integration of explainable AI (XAI) using SHAP analysis identified temperature as the most influential factor, directly correlating elevated thermal conditions with capacity loss. These insights offer valuable direction for improving charge cycles, pre-empting failure, and extending battery lifespan, though real-time deployment continues to face hurdles in computational efficiency and system integration.
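A minimal sketch of this type of workflow, gradient-boosted regression of discharge capacity followed by SHAP attribution, is given below; the synthetic features and their names are placeholders rather than the dataset or model configuration of [28].

```python
import numpy as np
import pandas as pd
import lightgbm as lgb
import shap

rng = np.random.default_rng(0)
n = 2000
# Placeholder cycling features; in practice these come from charge/discharge logs.
X = pd.DataFrame({
    "cycle":        rng.integers(1, 800, n),
    "mean_temp_C":  rng.normal(35, 8, n),
    "mean_current": rng.normal(2.0, 0.4, n),
    "charge_time":  rng.normal(3600, 300, n),
})
# Toy target: capacity fades with cycling and fade is accelerated by temperature.
y = 2.0 - 0.0008 * X["cycle"] - 0.004 * np.clip(X["mean_temp_C"] - 25, 0, None) \
    + rng.normal(0, 0.02, n)

model = lgb.LGBMRegressor(n_estimators=300, learning_rate=0.05)
model.fit(X, y)

# SHAP attributions: which features drive the predicted capacity loss.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(X.columns, importance), key=lambda t: -t[1]):
    print(f"{name:>12s}: {imp:.4f}")
```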
Expanding AI’s influence beyond operational diagnostics, its application in material optimization presents a transformative path forward. Feng [29] demonstrated a data-driven workflow for the rapid screening of electrode material candidates and the prediction of optimal compositions, illustrating how such methods can bypass manual trial-and-error processes and systematically guide material selection. Similarly, Alzamer et al. [30] presented a combined machine learning and automated text mining approach to map composition–structure–property relationships in electrolytes, addressing data scarcity while enabling the exploration of novel candidates through automated knowledge extraction. While these studies do not report quantitative benchmarking, they exemplify how AI frameworks can enhance the systematic and efficient discovery of materials, supporting sustainable energy storage innovations and broader advancements in materials science [31].
Advancements in time-series modeling have also enhanced the accuracy of SOH predictions. Wang et al. [32] proposed an optimized Bi-directional LSTM (BiLSTM) model trained using an adaptive convergence-factor-enhanced Gold Rush Optimizer (GRO), achieving superior SOH and capacity fade predictions compared to standard feedforward, LSTM, and other metaheuristic models. These improvements are attributed to enhanced global-local balance in parameter optimization, enabling high-fidelity forecasting of battery degradation behavior.
In terms of real-time applicability, Shahriar et al. [33] developed a hybrid CNN-GRU-LSTM model that achieved SOC estimation errors as low as 0.41% across varying temperatures, with minimal memory requirements and fast processing times (0.000113 s/sample). By combining the long-term dependency modeling of LSTM with the efficiency of GRU and the feature extraction capability of CNNs, their system delivers high accuracy with computational efficiency suitable for onboard BMS integration. Additionally, Mumtaz et al. [34] validated the effectiveness of unscented Kalman filters (UKF) for SOC and SOH estimation under dynamic EV operating conditions, demonstrating SOC error reductions to below 1% while maintaining cloud-integrated IoT connectivity for robust tracking and adaptive control.
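The layer dimensions of the model in [33] are not reproduced here; the PyTorch sketch below only illustrates the general CNN-GRU-LSTM stacking pattern for sequence-to-one SOC regression, with arbitrary sizes and a sigmoid output constraining SOC to [0, 1].

```python
import torch
import torch.nn as nn

class CnnGruLstmSoc(nn.Module):
    """CNN front-end for local feature extraction, GRU + LSTM for temporal context,
    and a linear head regressing SOC from the last hidden state (sizes illustrative)."""
    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.gru = nn.GRU(16, hidden, batch_first=True)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                     # x: (batch, time, features) = V, I, T
        z = self.cnn(x.transpose(1, 2)).transpose(1, 2)   # back to (batch, time, ch)
        z, _ = self.gru(z)
        z, _ = self.lstm(z)
        return torch.sigmoid(self.head(z[:, -1]))         # SOC in [0, 1]

model = CnnGruLstmSoc()
dummy = torch.randn(8, 128, 3)                # 8 windows of 128 samples (V, I, T)
print(model(dummy).shape)                     # torch.Size([8, 1])
```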
At the system level, Arévalo et al. [35] conducted a systematic review of AI-integrated energy management systems (EMS) in EVs, showing improvements in predictive maintenance, adaptive route planning, and range extension (by 12–18%) through real-time optimization. However, they emphasized the need for improved cybersecurity, interoperability, and smart grid integration to ensure safe and scalable deployment. Ghazali et al. [36] echoed these concerns, identifying gaps in machine learning (ML) integration, lightweight algorithm design, and sensor cost reduction for real-time safety functions such as SOP and SOE monitoring. Their review calls for interdisciplinary efforts to enhance algorithm adaptability across diverse environmental and user-driven conditions.
Finally, Challoob et al. [37] explored the convergence of lithium-ion batteries and supercapacitors through bidirectional converters as a promising hybrid powertrain configuration. Despite marked progress in thermal stability, internal resistance estimation, and battery longevity, persistent limitations remain in system-level modeling, solar charging integration, and efficient control strategies. Addressing these challenges will be key to realizing the full potential of AI-enhanced BMS in enabling cost-effective, safe, and environmentally sustainable electric mobility solutions.

3.3. AI in Lithium-Ion Battery Life Prediction

Accurately predicting the RUL of lithium-ion batteries is critical for optimizing performance, safety, and cost-efficiency in EV applications. Recent advances in artificial intelligence, particularly in deep learning and ensemble-based regression, have significantly enhanced battery life forecasting capabilities. Zhang et al. [38] explored the use of deep transfer learning, employing a VGG16-based feature extraction framework to model capacity degradation. Their study revealed that Method II, incorporating comprehensive voltage and capacity data, most effectively captured aging dynamics across varying operational conditions. However, challenges such as temperature sensitivity and classification ambiguities remain, reinforcing the need for more robust feature integration techniques, including differential voltage and electrochemical impedance spectroscopy data.
In parallel, Sravanthi and Chandra Sekhar [39] performed a comparative analysis of machine learning regressors for RUL prediction using real-world datasets. Their findings showed that the Bagging Regressor outperformed other methods such as Gradient Boosting, K-Nearest Neighbors, and Extra Trees, achieving near-perfect R2 (0.999) with minimal errors (MSE: 14.307, RMSE: 3.782, MAE: 2.099). These results underscore the practical viability of ensemble models for efficient and precise RUL estimation in battery health management systems, with hybrid learning approaches identified as a promising avenue for further refinement.
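A compact comparison loop in the spirit of [39], contrasting the same regressor families on a synthetic placeholder RUL dataset, might look as follows (features, target, and hyperparameters are illustrative, not those of the cited study).

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor, ExtraTreesRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
n = 1500
# Placeholder health indicators (e.g., capacity, resistance, cycle count).
X = rng.normal(size=(n, 3))
rul = 1000 - 300 * X[:, 0] + 80 * X[:, 1] ** 2 + rng.normal(0, 20, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, rul, test_size=0.2, random_state=42)

models = {
    "Bagging":          BaggingRegressor(n_estimators=100, random_state=0),
    "GradientBoosting": GradientBoostingRegressor(random_state=0),
    "ExtraTrees":       ExtraTreesRegressor(n_estimators=200, random_state=0),
    "KNN":              KNeighborsRegressor(n_neighbors=5),
}
for name, m in models.items():
    pred = m.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name:>16s}  RMSE={rmse:7.2f}  MAE={mean_absolute_error(y_te, pred):6.2f}")
```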
Pushing the boundaries of sequence modeling, Liu et al. [40] introduced the PatchFormer architecture, an advanced patch-based Transformer framework featuring dual attention mechanisms. The Dual Patch-wise Attention Network (DPAN) captures global degradation trends, while the Feature-wise Attention Network (FAN) extracts temporal correlations and short-term anomalies, including capacity regeneration. The model demonstrated superior accuracy across public datasets, offering a scalable, open-source solution for high-fidelity battery life forecasting in real-world deployments.
Complementing these algorithmic innovations, Shaik et al. [41] provided a comprehensive review of advanced BMS frameworks, highlighting their importance in achieving sustainable development goals through intelligent control over charge/discharge cycles, thermal regulation, and SOC estimation. The review identified key technical barriers in current systems and emphasized the necessity for integrated AI methods to bridge the gap between high-performance forecasting and practical deployment.
Khawaja et al. [42] further examined a range of machine learning algorithms for SOC and SOH estimation. Their study found that random forest regression consistently outperformed traditional models, including linear regression, support vector machines, and gradient boosting, achieving R2 values of 0.9999 and low error metrics (MAE: 0.0035, RMSE: 0.0097). While the method proved effective for smaller datasets, the authors recommended future integration of deep learning or federated learning to accommodate larger-scale applications and complex battery behaviors. These findings reaffirm the importance of AI-driven approaches in advancing predictive maintenance and safety strategies for next-generation battery management systems.

3.4. Lithium-Ion Battery Energy Management with Machine Learning

Machine learning is playing an increasingly transformative role in the optimization of lithium-ion battery energy systems, with applications extending across design, manufacturing, real-time control, and lifecycle prediction. Valizadeh and Amirhosseini [43] provided a comprehensive overview of ML’s potential in enhancing battery research and development through efficient exploration of chemical and operational parameters. Their work emphasized key challenges such as data sparsity, computational complexity, and model interpretability, while proposing hybrid strategies, transfer learning, and self-improving algorithms as potential solutions. By bridging first-principle modeling with data-driven approaches, their framework offers a roadmap for scalable, cost-effective, and environmentally sustainable battery innovations.
Complementing this broader perspective, Ardeshiri et al. [44] reviewed the integration of ML into BMS and highlighted its impact on improving state estimation, fault detection, and operational monitoring in electric vehicle and grid applications. Their analysis revealed that various ML techniques, including supervised learning, neural networks, and ensemble models, exhibit different strengths depending on operational requirements. The authors underscored the need to balance accuracy, computational cost, and implementation complexity to achieve effective, adaptable battery control under real-world constraints.
The importance of explainability in battery ML models is increasingly recognized, especially as these models become central to safety-critical systems. Niri et al. [45] explored the application of explainable machine learning (XML) across the battery value chain, from manufacturing to state estimation and performance monitoring. Although feature importance analysis dominates current XML use cases, the study identified a lack of adoption for advanced methods such as Shapley values and counterfactual explanations and their limited use in neural network-based models. These insights underscore the urgent need to develop battery-specific XML frameworks and benchmark datasets to ensure trust, transparency, and informed decision-making in increasingly complex battery systems.
Focusing on RUL estimation, Jin et al. [46] demonstrated that deep neural networks (DNNs) have emerged as the most effective ML architecture for modeling nonlinear degradation behavior and predicting battery life with high generalization accuracy. Their review noted the growing role of RUL prediction in enabling predictive maintenance and optimizing battery utilization across EV and grid storage platforms, while also highlighting challenges in real-world implementation due to model complexity and data variability.
Together, these studies reveal that the synergy between advanced ML techniques, explainability tools, and robust BMS design is central to improving energy efficiency, operational safety, and lifecycle sustainability in lithium-ion battery systems. Continued research into tailored ML frameworks, particularly those capable of handling multi-scale, time-series, and hybrid data, will be essential in driving forward the next generation of smart, adaptive energy storage technologies.

3.5. Intelligent Systems for Lithium-Ion Batteries

The development of intelligent systems for LIBs is vital for enhancing safety, reliability, and digital adaptability across electric mobility and stationary energy storage applications. As thermal events remain one of the most critical barriers to widespread LIB adoption, Li et al. [47] proposed a multi-layered safety framework that integrates early detection, thermal management, and fire suppression strategies. Their study emphasized the synergistic effect of combining real-time monitoring technologies such as optical fiber sensors and ultrasonic imaging with advanced thermal management systems. Water-based fire suppression agents were identified as particularly effective for LIB incidents, and the proposed strategy encompasses both passive and active safety mechanisms, including thermal runaway characterization, intelligent detection, and adaptive cooling [48]. This framework forms a foundational guideline for designing safer battery systems capable of preventing catastrophic failures while ensuring performance continuity.
In parallel, the convergence of electrochemical modeling and artificial intelligence is redefining the landscape of smart battery systems. Amiri et al. [49] reviewed the integration of physics-based models with machine learning algorithms to develop hybrid modeling architectures that combine the mechanistic accuracy of fundamental battery equations with the adaptability and speed of data-driven techniques. These hybrid models demonstrate significant promise across the battery lifecycle, including design optimization, performance estimation, fault diagnosis, and operational control. By merging physics-informed simulations with machine learning’s pattern recognition capabilities, such frameworks address the limitations of standalone approaches, offering both explainability and computational efficiency. Their application in digital twin environments paves the way for real-time, intelligent battery systems capable of continuous monitoring, predictive maintenance, and lifecycle optimization.
Together, these intelligent approaches ranging from safety-oriented system design to hybrid digital architectures illustrate the emerging sophistication of next-generation LIB technologies. As battery systems grow more complex and their applications more critical, the role of AI-enhanced safety and hybrid modeling frameworks will be central in enabling smart, resilient, and scalable energy storage solutions.

3.6. AI-Based Predictive Analytics for Lithium-Ion Battery Systems

Artificial intelligence is reshaping the landscape of EV technology through its application in predictive analytics, enabling more intelligent, adaptive, and efficient battery systems. Cavus et al. [50] emphasized that AI-driven approaches leveraging machine learning, deep neural networks, and reinforcement learning outperform conventional methods in predicting critical battery states such as SOC and SOH, as well as in managing thermal performance. These systems enable dynamic energy optimization, enhance regenerative braking efficiency, and improve overall vehicle control across varying operational environments. Furthermore, the integration of IoT connectivity and big data analytics expands these capabilities to encompass fleet-level optimization and predictive maintenance. Nonetheless, challenges persist regarding real-time processing limitations, interpretability of AI models in safety-critical applications, and the imperative for robust cybersecurity measures. Addressing these constraints will be essential for deploying scalable, reliable AI architectures across heterogeneous EV platforms.
Zhao and Burke [51] extended this perspective by examining how AI enhances battery diagnostics through improved precision in SOH assessments and RUL predictions. Their study highlighted advanced machine learning strategies, including end-to-end architectures, federated learning, and multimodal time-series analysis that facilitate decentralized, privacy-preserving diagnostics while adapting to changing usage patterns. These developments contribute not only to BMS accuracy and responsiveness but also support environmental sustainability by enabling battery lifespan extension and efficient second-life deployment. Moving forward, future research should focus on self-learning algorithms capable of real-time model updating, the incorporation of diverse operational datasets, and enhanced integration of multimodal sensor data to ensure robust generalization across varying battery chemistries and driving conditions.
Collectively, these predictive analytics innovations signal a transformative shift in battery intelligence, where AI not only improves system efficiency and safety but also serves as a catalyst for broader transportation electrification. The convergence of real-time diagnostics, adaptive control, and lifecycle management positions AI as a central enabler of next-generation battery technologies.

3.7. AI in Renewable Energy Storage and Lithium-Ion Batteries

Artificial intelligence is emerging as a powerful enabler for enhancing sustainability across the LIB lifecycle, from environmental impact assessment to end-of-life recovery and recycling. Chen et al. [52] proposed an AI-driven framework for life cycle assessment (LCA) of LIBs, demonstrating how machine learning and pattern recognition techniques can significantly improve the speed, scalability, and granularity of sustainability evaluations. Their SWOT analysis emphasized the strengths of AI in automating data processing and predictive modeling while identifying critical challenges such as data standardization, transparency, and methodological robustness. The study highlights that interdisciplinary collaboration between battery experts and AI specialists is essential to develop unified protocols that ensure accurate and interpretable results. These efforts are crucial for aligning battery production and deployment with global environmental targets, particularly as electric mobility and renewable energy storage systems expand.
End-of-life management remains a pressing concern for LIB sustainability, particularly in the context of ternary cathode recovery. Ren et al. [15] provided a comprehensive analysis of recycling technologies, including mechanical, pyrometallurgical, hydrometallurgical, biotechnological, and direct recovery methods. Despite recent improvements in material purity and process efficiency, challenges such as complex cell architectures and energy-intensive processes persist. The integration of AI into recycling workflows offers promising solutions ranging from process optimization and real-time monitoring to intelligent sorting and predictive control, enabling more efficient and scalable recycling systems. Moreover, AI-guided design-for-recyclability strategies and automated disassembly systems could further advance circular economy initiatives by improving material recovery rates and reducing environmental impact.
Together, these developments underscore the transformative potential of AI in driving sustainable innovation across the LIB lifecycle. From upstream production to downstream recycling, intelligent systems will be pivotal in shaping environmentally responsible energy storage technologies and supporting long-term resource security in the electrification era.
Artificial intelligence is playing a pivotal role in transforming BMS for electric vehicles by enabling precise and adaptive control over energy storage performance. Mamidi et al. [53] highlighted that AI-enhanced BMS platforms significantly advance the real-time estimation of key battery metrics, namely, SOC, SOH, and fault diagnostics through intelligent analysis of charging behavior, current flow dynamics, and degradation signatures. These systems not only improve the accuracy of internal state predictions but also enhance thermal management and early fault detection capabilities, reducing safety risks and prolonging battery service life.
The integration of AI into BMS architecture marks a critical evolution in addressing both technical and environmental challenges associated with lithium-ion battery systems. By continuously optimizing battery usage through learning-based models, these intelligent systems support sustainable electric mobility while aligning with broader objectives in energy efficiency and environmental impact mitigation. As the demand for high-performance, scalable, and environmentally responsible energy storage continues to grow, AI-driven energy management frameworks are becoming indispensable components of next-generation EV platforms.

3.8. AI Techniques for Lithium-Ion Battery System Modeling

Artificial intelligence techniques have become essential in advancing the modeling of lithium-ion battery behavior, particularly for predicting aging and estimating SOC under real-world conditions. Mayemba et al. [54] demonstrated that machine learning models, including extreme learning machine-inspired networks and encoder-coupled architectures, achieve high predictive accuracy for capacity fade, with RMSEs as low as 1.3% and 2.7% across multiple datasets. These models incorporate battery science fundamentals into the input space and have been validated against data reflecting stressors such as temperature fluctuation, calendar aging, and dynamic load profiles. Their ability to generalize across heterogeneous aging scenarios enabled by unified stress factor zoning and temporal integration techniques surpasses the performance of commercial empirical models, offering a versatile framework for expanding into more nuanced degradation indicators such as internal resistance growth and electrode material loss.
Further enhancing battery model adaptability, Wang et al. [55] proposed a temperature-adaptive neural network framework for SOC estimation that dynamically selects between multi-layer neural networks (MNN), long short-term memory (LSTM), and gated recurrent unit (GRU) architectures based on real-time thermal conditions. Their GRU-based model achieved the lowest mean absolute error (2.15%), outperforming traditional approaches by over 50% during validation across simulated driving profiles and four distinct temperature ranges. By integrating temperature-dependent voltage, current, and temporal features, the model addresses critical challenges in battery state estimation, particularly under thermal stress conditions. Moreover, the system’s planned deployment on edge computing platforms like NVIDIA Jetson highlights its practical viability for real-time SOC monitoring in applications ranging from electric vehicles to unmanned aerial systems.
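The temperature-adaptive selection idea can be sketched as a simple dispatch over temperature bands, as below; the band boundaries, the use of a GRU estimator in every band, and all dimensions are placeholders rather than the architecture-switching logic of [55].

```python
import torch
import torch.nn as nn

class GruSoc(nn.Module):
    def __init__(self, n_features=3, hidden=16):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        h, _ = self.rnn(x)
        return torch.sigmoid(self.head(h[:, -1]))

# One estimator per temperature band (bands and architectures are illustrative;
# ref. [55] switches between MNN, LSTM, and GRU variants).
bands = {(-20.0, 0.0): GruSoc(), (0.0, 25.0): GruSoc(), (25.0, 60.0): GruSoc()}

def estimate_soc(window: torch.Tensor, cell_temp_c: float) -> float:
    """Pick the band-specific model for the current pack temperature."""
    for (lo, hi), model in bands.items():
        if lo <= cell_temp_c < hi:
            return float(model(window.unsqueeze(0)))
    raise ValueError("temperature outside supported range")

window = torch.randn(128, 3)                  # 128 samples of (V, I, T)
print(estimate_soc(window, cell_temp_c=31.5))
```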
Together, these developments underscore the growing effectiveness of AI-driven modeling frameworks in capturing the complex, time-dependent behaviors of lithium-ion batteries. By integrating battery physics with intelligent architectures, these models contribute to safer, more efficient, and more adaptive energy storage systems suitable for a wide range of operational environments.

3.9. Intelligent Lithium-Ion Battery Management for Enhanced Performance

Expanding the role of intelligent energy management beyond traditional electric vehicle platforms, recent advancements demonstrate the potential of hybrid power systems that integrate lithium-ion batteries with alternative energy sources for improved dynamic performance. Elkerdany et al. [56] investigated a hybrid propulsion system for electric unmanned aerial vehicles (UAVs) combining polymer membrane fuel cells with Li-ion batteries, managed through an intelligent control architecture employing Fuzzy Logic Control (FLC) and Adaptive Neuro-Fuzzy Inference System (ANFIS) algorithms. Through detailed simulations across multiple flight modes, the study revealed that both control methods effectively regulated energy distribution between power sources while exhibiting distinct dynamic response profiles under varying operational demands.

3.10. Machine Learning Models for Lithium-Ion Battery Optimization

Zou [57] introduced a Bi-LSTM model enhanced with dual attention mechanisms, temporal and spatial, that delivers highly accurate lithium-ion battery SOH estimation, achieving low error margins (RMSE: 0.4%, MAE: 0.3%). The model leverages Differential Thermal Voltammetry (DTV) to link micro-phase transitions with observable battery behavior, enabling more effective feature weighting across time and space. This dual-attention strategy addresses limitations found in conventional single-attention models and improves estimation accuracy by around 10% over non-attention-based approaches. Its consistent performance across varying cycle starting points (error fluctuation <0.1%) and compatibility with cloud-integrated BMS underscore its practical utility in cyber-physical environments. The model marks a major step forward in SOH prediction by combining deep learning with explainable insights into battery degradation processes.

3.11. AI and Digital Twin Technologies for Fast Charging Optimization in Electric Vehicles

Recent studies highlight the integration of artificial intelligence and digital twin technologies for advanced lithium-ion battery management. Zhang et al. [58] developed a fast-charging optimization method using an enhanced deep deterministic policy gradient (DDPG) algorithm, improving efficiency and lifespan but lacking real-world validation. Issa et al. [59] proposed a cloud-based digital twin using ensemble learning and external APIs for highly accurate SOC estimation (NRMSE = 0.00047), with future potential for deep learning integration. Iyer et al. [60] introduced ENDEAVOR, a heuristic algorithm combining digital twins and UKF for SOC estimation and EV charging node optimization, reducing grid load and user wait time. Yang et al. [61] presented a reliability-focused digital twin using stochastic modeling and Bayesian evolution, achieving <5% error in life prediction and over 50% cost savings. Together, these approaches demonstrate complementary advances in fast charging, real-time estimation, and lifecycle management, though further validation and system-level integration are needed. Table 1 summarizes foundational models and architectures developed for Li-ion battery state estimation and control.

4. Digital Twin Modeling for Lithium-Ion Battery Management

A typical approach for implementing battery digital twins is illustrated in Figure 3 and comprises seven stages. First, sensors on the physical battery collect data (Step 1), which are then integrated into digital models representing the system (Step 2). These data are subsequently processed by cloud services (Step 3), enabling machine learning (Step 4) to produce new analyses and predictions. Numerous simulation scenarios are then set up to examine different operating conditions (Step 5), and 3D visualization tools render dynamic battery models to communicate the results (Step 6). Finally, a feedback loop (Step 7) uses the analytics to refine the physical system, supporting better decisions and continuous improvement. This workflow highlights how battery management operates through a layered, interactive digital twin architecture.
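Read as software, the seven stages form a closed monitoring-and-feedback loop; the skeleton below is purely structural, and every class, method, and value in it is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    voltage: float
    current: float
    temperature: float

class BatteryDigitalTwin:
    """Structural sketch of the seven-stage loop in Figure 3 (all names hypothetical)."""

    def acquire(self) -> Telemetry:                   # Step 1: physical sensing
        return Telemetry(voltage=3.7, current=1.2, temperature=28.0)

    def update_virtual_model(self, t: Telemetry):     # Step 2: synchronize digital replica
        self.state = {"soc": 0.8, "soh": 0.95}        # placeholder state

    def push_to_cloud(self, t: Telemetry):            # Step 3: cloud ingestion
        pass

    def run_analytics(self):                          # Step 4: machine learning prediction
        return {"predicted_soh_30d": 0.94}

    def simulate_scenarios(self):                     # Step 5: what-if simulations
        return [{"c_rate": 2.0, "peak_temp": 41.0}]

    def visualize(self, analytics, scenarios):        # Step 6: 3D/visual reporting
        pass

    def feed_back(self, analytics):                   # Step 7: act on the physical BMS
        pass

    def cycle(self):
        t = self.acquire()
        self.update_virtual_model(t)
        self.push_to_cloud(t)
        analytics = self.run_analytics()
        scenarios = self.simulate_scenarios()
        self.visualize(analytics, scenarios)
        self.feed_back(analytics)

BatteryDigitalTwin().cycle()
```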
Figure 4 shows how digital twin modeling supports the management of lithium-ion batteries over the whole product lifetime. During design, digital twins combine physics, AI, and existing test results to optimize the battery's structure and efficiency before any prototype is built. Because the manufacturing line is mirrored by its digital replica, issues are spotted quickly and processes can be adjusted as necessary. In the operations phase, the physical battery pack is linked to its digital twin via sensors collecting voltage, current, and temperature; AI algorithms process these data to monitor battery performance, flag anomalous behavior, and predict impending failure. As usage patterns and battery condition evolve, the models update automatically, allowing the system to adapt its control strategies and prolong battery life. Digital twins thus improve the safety and reliability of lithium-ion batteries while enabling more intelligent energy management for electric vehicles, smart charging infrastructure, and fleet systems.
Digital twin technology offers a transformative paradigm for managing lithium-ion battery systems by creating high-fidelity virtual replicas that integrate multi-scale modeling, real-time sensing, and intelligent analytics. Naseri et al. [14] outlined a layered digital twin architecture composed of connectivity, twin, and service modules, which leverage artificial intelligence, IoT integration, and cloud platforms such as ANSYS and Microsoft Azure. These systems provide actionable insights across the battery lifecycle, enabling functions such as SOC estimation, predictive maintenance, and lifecycle optimization. Reported outcomes include up to 60% reductions in maintenance costs and 15% improvements in battery lifespan through optimized charging strategies. However, technical challenges persist, including the need for standardized protocols, computational efficiency, and justifiable deployment costs. Emerging applications such as battery passports and second-life planning further extend the value of digital twins, supporting multi-stakeholder decision-making despite ongoing concerns related to data processing, interoperability, and model scalability.
Enhancing the practical application of digital twin frameworks, Wang and Li [62] demonstrated an integrated system combining extended Kalman filtering (EKF) for SOC estimation and particle swarm optimization (PSO) for SOH prediction. Their approach, validated through MATLAB/Simulink simulations and experimental setups, effectively addressed the challenges of Gaussian white noise in real-time environments. By fusing model-based estimators with data-driven optimization within an embedded platform, this implementation exemplified the real-world viability of digital twin architectures in monitoring and prognostics. The ability to bridge theoretical models with operational robustness marks a significant advancement in battery digitalization, providing a comprehensive foundation for intelligent performance assessment and long-term system reliability.
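For reference, a minimal EKF for SOC estimation over a first-order Thevenin equivalent circuit is sketched below; the circuit parameters, the linear open-circuit-voltage curve, and the noise settings are illustrative and are not taken from [62], which additionally couples the estimator to PSO-based SOH prediction.

```python
import numpy as np

# First-order Thevenin ECM parameters (illustrative values).
Q_AH, R0, R1, C1, DT = 2.5, 0.015, 0.02, 1800.0, 1.0
A_RC = np.exp(-DT / (R1 * C1))

def ocv(soc):            # placeholder linear OCV curve and its slope
    return 3.0 + 1.2 * soc
def d_ocv(soc):
    return 1.2

def ekf_soc(currents, voltages):
    x = np.array([0.9, 0.0])                     # state: [SOC, RC-branch voltage]
    P = np.diag([0.01, 0.001])
    Qn = np.diag([1e-7, 1e-6])                   # process noise
    Rn = 1e-3                                    # measurement noise
    F = np.array([[1.0, 0.0], [0.0, A_RC]])
    estimates = []
    for i_k, v_meas in zip(currents, voltages):
        # Predict: coulomb counting + RC relaxation.
        x = np.array([x[0] - DT * i_k / (3600 * Q_AH),
                      A_RC * x[1] + R1 * (1 - A_RC) * i_k])
        P = F @ P @ F.T + Qn
        # Update with the terminal-voltage measurement model.
        v_pred = ocv(x[0]) - x[1] - R0 * i_k
        H = np.array([[d_ocv(x[0]), -1.0]])
        S = H @ P @ H.T + Rn
        K = (P @ H.T) / S
        x = x + (K * (v_meas - v_pred)).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

# Tiny synthetic discharge: constant 1 A load with noisy terminal-voltage readings.
t = np.arange(600)
i_true = np.ones_like(t, dtype=float)
soc_true = 0.9 - t / (3600 * Q_AH)
v_meas = ocv(soc_true) - (R0 + R1) * i_true + np.random.default_rng(0).normal(0, 0.005, t.size)
print(ekf_soc(i_true, v_meas)[-1], soc_true[-1])
```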

4.1. AI-Enabled Digital Twin Solutions for Lithium-Ion Battery Management

Artificial intelligence-enabled digital twin frameworks are emerging as powerful tools for optimizing lithium-ion battery management, enabling enhanced state estimation, real-time diagnostics, and intelligent control through the fusion of virtual modeling and physical sensor data. Kang et al. [63] demonstrated how digital twins, supported by mathematical modeling and three-dimensional visualization, enable real-time synchronization between physical and virtual systems. Their study of hybrid powertrains combining lithium-ion batteries and fuel cells revealed the utility of this approach for dynamic energy control and simulation accuracy, providing a scalable framework applicable to broader energy systems and transportation platforms.
Focusing on state estimation, Tang et al. [64] proposed a DT-driven system utilizing a hybrid HIF-PF algorithm, achieving an average SOC estimation error of just 0.14% under dynamic stress tests. This architecture addressed critical limitations in traditional battery management systems, such as data storage constraints, initial value sensitivity, and computational inefficiencies. Real-time monitoring was facilitated via an interactive visualization interface, laying the groundwork for future closed-loop DT systems capable of autonomously optimizing physical battery parameters.
In support of model robustness under thermal and nonlinear dynamics, Song et al. [65] introduced a hybrid CNN-LSTM network for SOC estimation that achieved mean absolute errors below 1.5% and robust performance across various ambient temperatures. The architecture leverages CNNs for spatial feature extraction and LSTMs for capturing temporal dependencies. A periodic parameter update mechanism further enhanced model adaptability in response to battery aging, suggesting practical deployment for real-world applications with minimal recalibration.
Khalid and Sarwat [66] proposed a unified ML framework combining minimized Akaike Information Criterion (m-AIC)-optimized ARIMA with NARX and MLP neural networks. Their model achieved RMSE values as low as 0.1323% and maintained high accuracy under low C-rate conditions. While current implementations are offline, their results support future integration into adaptive observers for real-time SOC forecasting in aging-aware, multi-pack systems.
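The AIC-minimizing model selection step can be illustrated with a small grid search over ARIMA orders using statsmodels, as below; the synthetic capacity series and the searched order ranges are placeholders, and the NARX/MLP components of [66] are not included.

```python
import numpy as np
from itertools import product
from statsmodels.tsa.arima.model import ARIMA

# Placeholder capacity-fade series (the m-AIC/ARIMA setup of [66] used real cell data).
rng = np.random.default_rng(0)
capacity = 2.0 - 0.0006 * np.arange(400) + np.cumsum(rng.normal(0, 0.0005, 400))

best = None
for p, d, q in product(range(3), range(2), range(3)):   # small (p, d, q) grid
    try:
        res = ARIMA(capacity, order=(p, d, q)).fit()
    except Exception:
        continue
    if best is None or res.aic < best[0]:
        best = (res.aic, (p, d, q), res)

aic, order, res = best
print(f"selected ARIMA{order} with AIC={aic:.1f}")
print("5-step capacity forecast:", res.forecast(steps=5))
```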
Digital twins are also playing a transformative role in broader energy and smart grid applications. Das et al. [67] reviewed DT integration with machine learning in power systems, identifying key use cases including battery health prognosis, renewable energy integration, and cost projections for photovoltaic and wind systems. This study critically synthesizes how such integrated frameworks enable robust predictive analytics and optimized decision-making across diverse energy management scenarios, from component-level health to system-wide renewable utilization. Similarly, Kang [68] emphasized DT’s role in edge-cloud architectures and AutoML for software-defined vehicles, supporting real-time EV data analytics and traffic forecasting in resource-constrained environments. Kang’s work highlights the practical utility of these architectures in enhancing the intelligence and responsiveness of future vehicle systems, demonstrating efficient deployment of complex data processing and AI capabilities under computational limitations, which is vital for autonomous driving and smart mobility.
From a systems engineering perspective, Chai et al. [69] discussed DT deployment in intelligent connected vehicles (ICVs), emphasizing its role in lifecycle optimization, data handling, and modular environmental perception in urban traffic. Their study highlighted the benefits of partitioning tasks between edge and cloud infrastructure to improve scalability and processing efficiency. Wang et al. [70] further detailed the broader impact of digital twin frameworks on energy system operations, calling attention to real-time simulation, predictive analytics, and system-wide optimization for smart energy ecosystems.
Focusing specifically on lithium-ion battery applications, Zhao et al. [71] introduced a digital twin architecture integrating LSTM neural networks with EKF for SOC estimation. This hybrid system combines accurate initial SOC estimation from LSTM with EKF’s online correction capability, delivering enhanced robustness, lower error margins, and improved adaptability. The proposed rolling-learning architecture also supports future hierarchical DT models capable of simultaneously addressing SOC and RUL prediction, thereby advancing battery lifecycle management within renewable energy storage systems.
The reviewed studies underscore the critical role of DTs in modern energy and smart grid systems, particularly within ICVs. These works collectively demonstrate how DTs, integrating AI and edge-cloud computing, enable precise battery health prognosis, optimized renewable energy management, and real-time EV data analytics, crucial for enhanced system intelligence and responsiveness. A key insight from this body of work is the demonstrated capability of hybrid DT architectures to provide robust and adaptable battery state estimation (SOC/RUL), significantly advancing lifecycle management for Li-ion batteries in diverse applications, including renewable energy storage.

4.2. Lithium-Ion Battery Lifecycle Management with Digital Twins

Digital twin technology is emerging as a pivotal solution for next-generation battery lifecycle management, offering a scalable and intelligent framework for real-time performance monitoring, predictive maintenance, and system optimization. Elkerdany et al. [13] proposed a DT-based architecture that enhances BMS capabilities by integrating high-fidelity virtual representations with AI-powered analytics. Their study highlighted the potential of DTs to address key limitations in traditional BMS, including fragmented data streams and a lack of predictive functionality. By enabling synchronized physical-virtual monitoring, the framework supports cost-effective implementation across a range of lithium-ion battery applications. However, challenges remain in ensuring accurate data synchronization, maintaining model fidelity under dynamic operating conditions, and reducing computational overhead, requiring further research for full deployment at scale.
Building on these advancements, Lakshmi and Sarma [72] introduced a digital twin framework that fuses artificial intelligence and IoT technologies for electric vehicle energy storage systems. Their approach achieved high predictive accuracy, with a mean absolute error of 0.042 and a root mean square error of 0.055, alongside low system latency (0.12 s) and fast feedback responsiveness (0.45 s). Notably, the system improved energy efficiency by 15%, minimized capacity fade to 0.0025% per cycle, and demonstrated strong anomaly detection performance (95% precision, 76% detection rate). These results validate the effectiveness of digital twins in enhancing battery health and longevity while supporting advanced control and diagnostics. The seamless integration of AI, IoT, and DTs sets a new benchmark for intelligent battery lifecycle management in electric vehicles, offering a robust pathway toward more resilient and adaptive energy storage ecosystems.

4.3. Real-Time Digital Twin Systems for Lithium-Ion Batteries

A novel digital twin-based BMS architecture has recently been proposed, marking a significant advancement in real-time monitoring of lithium-ion batteries for electric vehicle applications. This dual-model framework integrates onboard tracking with cloud-based analytics to enable comprehensive and adaptive SOH estimation throughout a battery’s operational life [73]. Central to the system’s innovation is the application of advanced feature engineering and dynamic model retraining, which collectively address longstanding challenges in conventional data-driven approaches, particularly the inability to accurately capture battery degradation during partial charge cycles, a common but underrepresented condition in real-world usage. Through validation across multiple datasets, the proposed architecture has demonstrated robust performance, maintaining accurate predictions of both SOC and state-of-energy (SOE) while accommodating the evolving degradation patterns of lithium-ion batteries. This fusion of real-time data with long-term predictive analytics establishes a new standard for intelligent BMS design, offering a resilient solution that accounts for the inherent variability in battery behavior under diverse driving conditions. Ultimately, such architectures not only enhance operational reliability and performance but also contribute to broader goals in electric mobility and climate change mitigation by optimizing energy efficiency and lifecycle sustainability [74].

4.4. Integration of AI and Digital Twins in Lithium-Ion EV Batteries

Recent advancements in lithium-ion battery performance prediction have been realized through the integration of artificial intelligence and digital twin frameworks, particularly those combining metaheuristic optimization with ensemble learning techniques. One such approach incorporates improved gray wolf optimization with the AdaBoost algorithm to enhance predictive accuracy of discharge capacity within digital twin environments. Validated through rigorous ten-fold cross-validation on the NASA battery aging dataset, this hybrid model demonstrated exceptional performance, achieving mean absolute and root mean square errors as low as 0.01, substantially outperforming both standard and stacked LSTM architectures [75]. The enhanced predictive fidelity of this AI-driven framework presents a significant breakthrough in addressing persistent challenges related to energy storage reliability and operational efficiency. Furthermore, its applicability extends beyond electric vehicles to broader renewable energy storage domains, enabling refined battery design, predictive maintenance, and improved integration of intermittent renewable resources. These benefits collectively contribute to the development of resilient, low-carbon energy infrastructures through smarter battery management technologies. As emphasized in related studies, however, widespread deployment of such digital twin systems must contend with several practical constraints, including the need for higher fidelity data streams, increased computational scalability, robust cybersecurity protections, and harmonized communication protocols across diverse BMS architectures [76]. Despite these challenges, the potential for AI-integrated digital twins to revolutionize battery lifecycle management remains evident. Their ability to facilitate real-time monitoring, dynamic state estimation, and long-term degradation tracking positions them as foundational components in next-generation battery systems, particularly in electric vehicles and smart energy grids [13].

4.5. Digital Twins for Predictive Maintenance in Lithium-Ion BMS

Recent developments in predictive maintenance for lithium-ion BMS have leveraged digital twin frameworks to achieve greater accuracy in RUL estimation and improved maintenance strategies. One such approach integrates stochastic degradation modeling with Bayesian-based adaptive evolution to dynamically capture the stochastic and nonlinear degradation behaviors of batteries throughout their lifecycle. This method demonstrated high prediction accuracy with error margins maintained around 5%, while enabling cost reductions in predictive maintenance operations by up to 62%, highlighting its practicality for real-world deployment [61]. Further extending these capabilities, Zhao et al. introduced a Hierarchical and Self-Evolving Digital Twin (HSE-DT) model that combines Transformer and convolutional neural network (CNN) architectures with transfer learning to enable dynamic adaptation under diverse operating conditions. The system achieved root mean square error values below 0.9% for SOC and 0.8% for SOH, emphasizing both its precision and real-time responsiveness [77]. In parallel, Jafari and Byun proposed a hybrid digital twin framework combining Extreme Gradient Boosting (XGBoost) with the EKF, offering a balance between nonlinear learning and real-time state estimation. This integrated model demonstrated enhanced robustness in tracking battery aging phenomena, particularly through online model updates and adaptive filtering strategies [78].
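One simple way to combine a gradient-boosted model with recursive filtering, loosely in the spirit of [78] but not its published coupling, is to treat each per-cycle XGBoost SOH prediction as a noisy measurement and smooth it with a scalar Kalman filter, as sketched below on entirely synthetic data.

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
n = 1000
# Placeholder per-cycle features (cycle index, mean temperature, depth of discharge).
X = np.column_stack([np.arange(n), rng.normal(35, 5, n), rng.uniform(0.5, 1.0, n)])
soh_true = 1.0 - 0.00025 * X[:, 0] - 0.0005 * X[:, 2] * X[:, 0] / n
y = soh_true + rng.normal(0, 0.01, n)

model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X[:700], y[:700])                       # train on early cycles
raw_pred = model.predict(X[700:])                 # noisy per-cycle SOH predictions

# Scalar Kalman filter treating the XGBoost output as a noisy SOH measurement.
x_est, p_est, q_proc, r_meas = raw_pred[0], 1e-4, 1e-6, 1e-4
smoothed = []
for z in raw_pred:
    p_est += q_proc                               # predict (SOH assumed slowly varying)
    k = p_est / (p_est + r_meas)                  # Kalman gain
    x_est += k * (z - x_est)                      # update with the ML "measurement"
    p_est *= (1 - k)
    smoothed.append(x_est)

print("last raw vs smoothed SOH:", raw_pred[-1], smoothed[-1])
```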
Despite these advances, concerns surrounding cybersecurity remain a critical barrier to deployment. Pooyandeh and Sohn identified vulnerabilities in digital twin-based SOC estimation systems to timestamp attacks, where malicious disruptions in data chronology compromised the reliability of real-time monitoring. This highlights the need for robust defense protocols to protect battery systems from covert data manipulations and ensure secure operation in connected electric vehicle networks [79]. Additionally, large-scale implementation of ML-driven digital twins introduces further complexities, including the challenges of managing massive data volumes, establishing reliable communication protocols, and securing data privacy. Kaleem et al. emphasized that while ML-enhanced digital twin models employing deep learning, neural networks, and support vector machines offer remarkable improvements in battery performance and longevity, their successful application demands scalable infrastructure and cloud-based systems to support continuous learning and remote diagnostics at fleet-wide scales [80]. Collectively, these innovations underscore the evolving potential of digital twins in predictive maintenance, while pointing toward necessary improvements in system integration, model generalization, and cybersecurity to ensure dependable long-term deployment.

4.6. Digital Twin Simulations for Lithium-Ion Battery Behavior Prediction

Emerging digital twin frameworks are increasingly employing advanced simulations and data-driven methods to enhance the predictive capabilities of lithium-ion battery behavior under varying operational conditions. A notable implementation based on the Industrial Internet-of-Things (IIoT) paradigm utilized Microsoft Azure cloud infrastructure to fuse real-time sensor data with historical driving patterns using three distinct APIs. Through a supervised voting ensemble machine learning algorithm, this architecture achieved high-fidelity estimation of SOC, yielding normalized root mean square errors of 1.1446 in simulation and 0.02385 during experimental validation. By systematically coordinating data acquisition, synchronization, and visualization processes, the model demonstrated robust adaptability to dynamic driving scenarios, offering practical insights for electric vehicle range prediction and predictive maintenance planning [59].
Further enhancing simulation fidelity, Bandara and Halgamuge developed a hybrid digital twin architecture that merges conventional mathematical modeling with an adaptive LSTM network, fine-tuned using Adam optimization and early stopping regularization techniques. Trained on real-world NASA battery cycling data, the model exhibited a 68.42% reduction in capacity estimation error relative to conventional digital twin approaches. Its ability to track subtle patterns of degradation across diverse usage profiles underscores the value of integrating physical and data-driven methodologies for lifecycle-aware battery management [81]. Complementing these efforts, Hang et al. [82] presented a digital twin approach that dramatically reduces experimental load by up to 99% through numerical electrochemical simulations used to synthetically augment limited experimental datasets. This hybrid model achieved capacity prediction errors below 2%, showcasing the potential of combining domain-informed simulations with machine learning to overcome data scarcity. However, widespread adoption is contingent on automating the calibration of electrochemical models, which currently requires expert intervention, thus pointing to future directions for model generalization and deployment scalability. Collectively, these hybrid digital twin approaches establish a promising foundation for predictive modeling that synergistically harnesses both simulation and real-world data to support intelligent battery diagnostics, optimized maintenance, and extended operational life. Table 2 presents emerging sensor technologies and their integration within advanced battery monitoring frameworks.
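As a rough illustration of the adaptive LSTM component described by Bandara and Halgamuge [81], the sketch below trains a small Keras LSTM on windowed capacity sequences with Adam optimization and early stopping; the layer size, window length, patience, and synthetic data are assumptions made for brevity.

```python
# Sketch: windowed LSTM capacity estimator trained with Adam and early stopping.
# Synthetic capacity data replace the NASA cycling records; sizes are assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(2)
capacity = 2.0 * np.exp(-np.arange(600) / 2000.0) + 0.005 * rng.standard_normal(600)

window = 20                                        # past cycles used per prediction
X = np.stack([capacity[i:i + window] for i in range(len(capacity) - window)])[..., None]
y = capacity[window:]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")        # Adam optimization, MSE objective

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True)
model.fit(X, y, validation_split=0.2, epochs=200, callbacks=[early_stop], verbose=0)

print("next-cycle capacity prediction:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```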

5. Cloud-Based Lithium-Ion Battery Management Systems

Figure 5 displays a digital twin framework developed to enhance the performance of lithium-ion BMS. At the physical layer, sensors monitor battery operation in real time, capturing temperature, voltage, current, and state of charge. These data are processed locally by edge computing devices and subsequently archived for later use. Building on this data stream, the BMS runs digital twin simulations that reproduce battery performance and anticipate future problems such as aging or incipient failures. Because these models are continuously updated, the BMS can make progressively better-informed decisions. The resulting analytics and visualizations can be shared with downstream systems in residential, vehicular, or industrial settings. From the battery to the cloud, data travel securely through the successive layers, making battery monitoring more precise, practical, and intelligent.
As shown in Figure 6, smart battery systems can be monitored and controlled through a combination of edge and cloud computing within an advanced BMS. In the first part of the figure, sensors continuously measure temperature, voltage, current, and state of health and transmit this information to nearby edge devices for immediate processing. Tasks that exceed the edge devices' capabilities are forwarded through a controller to the cloud for deeper analysis, model refinement, and prediction. The second part of the figure covers digital twins, in which data from the physical batteries are used to build virtual models that replicate battery behavior from measured data, physical relationships, and observed dynamics. With these models, the BMS can refine its control strategies, detect malfunctions promptly, and extend battery service life. The physical, communication, storage, access, and service layers together keep data secure and processing efficient. This combined cloud and digital twin approach enables intelligent battery management across residential, industrial, and vehicular applications.
The convergence of cloud computing and BMS has emerged as a pivotal enabler for advancing EV performance, scalability, and reliability. Cloud-based BMS architectures offer substantial advantages by overcoming onboard computational constraints, enabling remote diagnostics, real-time data analytics, and scalable software updates. These systems significantly expand the capabilities of conventional BMS by supporting predictive maintenance, fleet-level energy optimization, and enhanced visualization of operational data. However, key technical challenges remain in the implementation of robust online learning algorithms, ensuring secure connectivity, and fully exploiting fleet-scale data streams for adaptive control [83]. Foundational work in BMS design has emphasized the importance of accurate performance modeling, efficient state estimation, and adaptive control strategies, particularly for SOC, SOH, and state-of-function (SOF) tracking to address safety, durability, and cost issues that remain obstacles to widespread EV adoption [84].
Progress toward operationalizing digital twin frameworks within cloud-based systems has been marked by implementations that integrate advanced filtering algorithms and optimization techniques. A notable example is the use of adaptive extended H-infinity filters for SOC estimation and particle swarm optimization to simultaneously track capacity and power fade. This hybrid architecture demonstrated robust accuracy across lithium-ion and lead-acid chemistries under varying initialization and dynamic loading conditions, representing a critical milestone in translating digital twin theory into real-world applications. The system’s successful deployment in both stationary and mobile environments underscores its potential for scalable, cloud-integrated diagnostics and lays the groundwork for future incorporation of machine learning models to enhance lifetime prediction and system optimization [85]. Complementing this, cloud-based condition monitoring platforms utilizing IoT modules and distributed computing, such as those implemented on Raspberry Pi and Google Cloud, have demonstrated cost-effective and accurate health assessments across large-scale lithium-ion battery systems. These platforms successfully overcome traditional scalability limitations and establish a centralized framework for data-driven battery energy storage management, setting a precedent for future industrial-scale deployments [86]. Together, these developments highlight the transformative potential of cloud-BMS ecosystems, where distributed intelligence and centralized analytics enable more resilient, efficient, and intelligent battery system control.

5.1. AI-Powered Cloud Solutions for Lithium-Ion Battery Management

The integration of artificial intelligence within cloud-based BMS presents a transformative shift in electric vehicle and smart grid applications. By merging scalable cloud computing with data-driven machine learning algorithms, this framework addresses the limitations of traditional onboard systems, enabling enhanced state estimation, including charge, health, and safety, and improved thermal management. The combined use of physical battery models and AI techniques allows for more accurate, multi-timescale predictions, supporting proactive control strategies. Leveraging cloud infrastructure, the system efficiently processes extensive operational data streams to deliver real-time insights without the need for costly hardware upgrades, thereby advancing both battery reliability and safety in large-scale energy storage applications [87].

5.2. Edge AI for Real-Time Lithium-Ion Battery Monitoring

Edge artificial intelligence is increasingly emerging as a viable solution for real-time lithium-ion battery monitoring, particularly in scenarios where onboard computational resources are constrained. A notable advancement in this domain involves the deployment of a compact neural network architecture capable of performing simultaneous SOC and SOH estimation using electrochemical impedance spectroscopy (EIS) and temperature data. By systematically optimizing network structure and applying quantization techniques, this framework significantly reduces both memory usage and processing demands, enabling its deployment on edge devices without compromising predictive accuracy. Such innovations effectively bridge the gap between advanced diagnostic capabilities and the physical limitations of embedded platforms, making them particularly suitable for predictive maintenance applications in portable and distributed battery-driven technologies [88].
Expanding this concept, the Intelligent Battery Management System (IBMS) integrates end-edge-cloud connectivity, digital twin modeling, and blockchain security into a multilayered, reconfigurable framework designed to optimize performance, safety, and system-level adaptability. Through dynamic battery reconfiguration, real-time simulations, and secure vehicle-to-grid and vehicle-to-infrastructure data exchange, this architecture facilitates predictive analytics, enhances operational efficiency, and supports next-generation mobility solutions. The inclusion of cybersecurity measures and adaptive learning further reinforces the system’s robustness, positioning it as a comprehensive solution for modern energy and transportation networks [89]. More broadly, the convergence of AI with battery prognostics and health management (PHM) marks a paradigm shift in how multi-scale battery dynamics are understood and controlled. By leveraging big data, IoT, and deep learning techniques, AI-centric PHM frameworks enable predictive modeling that accounts for variability in battery materials, usage conditions, and manufacturing inconsistencies, thus improving accuracy, safety, and system longevity. This evolution reflects the growing influence of Industry 4.0 technologies in transforming battery system intelligence from reactive control toward proactive optimization [90].

5.3. Cloud and Edge Computing for Lithium-Ion EV Batteries

Figure 7 illustrates an architecture that supports both intelligent battery monitoring and control. The Infrastructure-as-a-Service (IaaS) layer passes battery measurements over edge networking to cloud servers. The cloud, spanning Data-as-a-Service (DaaS) and Platform-as-a-Service (PaaS), evaluates these readings with advanced algorithms to estimate and maintain the system's state. The resulting analytics are delivered to users through Software-as-a-Service (SaaS) applications, enabling real-time decisions that make electric vehicle battery systems safer, more reliable, and better performing.
The Cyber-Physical Battery Management System framework shown in Figure 8 integrates the physical battery system of electric vehicles with cloud-based computing and data management, connected through communication channels. The physical layer is handled by the onboard BMS, which continuously tracks cell voltages, currents, and temperatures and transmits these data over the communication system. On the cyber layer, aggregated data from multiple vehicles pass through cleaning modules, cross-validation, and deep learning before being stored in a centralized database as reliable battery models. Control signals flow back to the vehicles while operational data flow to the cloud, providing better reliability, greater scalability, and increased intelligence for monitoring and managing battery systems.
The integration of cloud and edge computing technologies has significantly redefined the architecture of lithium-ion BMS, offering a transformative approach to real-time monitoring, predictive diagnostics, and lifecycle optimization. Cloud-based digital twin frameworks, in particular, enable the creation of virtual replicas of battery systems by fusing real-time sensor streams with historical usage profiles. This fusion supports high-fidelity predictions of SOC, SOH, and degradation trajectories, thereby improving operational reliability, reducing maintenance costs, and extending battery longevity. Such models are especially impactful in industrial applications like electric two- and three-wheeler manufacturing, where early failure prediction and dynamic simulation can mitigate operational risks, although technical implementation hurdles remain [91].
Expanding upon this paradigm, smart cloud-enabled BMS architectures offer a scalable alternative to traditional systems, effectively circumventing limitations related to computational overhead and data storage. These systems enhance fault diagnosis, performance optimization, and condition monitoring while remaining adaptable to emerging storage chemistries such as lithium-sulfur and solid-state batteries. However, challenges including connectivity dependence, cost control, and outage resilience continue to limit their large-scale adoption, necessitating further real-world validation of battery models and algorithms [92]. A complementary direction is the development of cloud-end collaborative BMS frameworks, which leverage cloud infrastructure in conjunction with embedded intelligence. One such architecture integrates GRU neural networks with transfer learning to improve SOC estimation across heterogeneous battery platforms, achieving both computational efficiency and robust scalability for renewable and electric vehicle applications [93].
More broadly, the convergence of cloud computing, artificial intelligence, digital twins, and edge computing technologies is establishing cloud-based BMS (CBMS) as a next-generation solution for energy storage system intelligence. These systems enable real-time analytics and adaptive decision-making while addressing the need for scalable, secure, and efficient data processing pipelines. Nevertheless, key challenges remain in ensuring data integrity, algorithmic robustness, and cyber-resilience, indicating the need for continued innovation in data harmonization, model refinement, and system-level security to fully harness the potential of CBMS technologies [94].

5.4. Distributed AI in Lithium-Ion Battery Systems

Distributed and decentralized architectures are increasingly gaining prominence as viable alternatives to conventional centralized BMS, particularly in applications requiring modular scalability, enhanced reliability, and real-time monitoring capabilities. A distributed BMS based on STM32 microcontrollers exemplifies this trend by reducing computational burdens across the system while retaining high measurement accuracy and performance stability. Experimental validation on lithium iron phosphate battery packs confirmed its efficacy in maintaining precise voltage and temperature readings, as well as achieving effective cell balancing and safety protection. This system demonstrates the practical benefits of decentralized processing in managing energy distribution, monitoring system states, and optimizing battery pack reliability [95].
Building upon this, decentralized smart BMS frameworks have evolved further by integrating cloud-connected controllers, charge regulation systems, and networked sensors capable of continuously assessing key battery parameters, including SOC, SOH, current, voltage, and thermal conditions. Leveraging Petri Net modeling, this architecture enables intelligent energy optimization, predictive diagnostics, and remote-control functions. Its successful deployment in off-grid environments underscores its potential to improve battery lifespan, enhance safety, and support renewable energy utilization by dynamically adapting to fluctuating operational conditions [96]. At the core of these advancements lies the increasing role of AI and machine learning in reshaping battery research and development. These techniques allow for rapid interpretation of complex, high-dimensional datasets, identifying critical material, fabrication, and performance patterns that would otherwise remain inaccessible via traditional methods. While ML has demonstrated promise in areas such as cycle life prediction, degradation diagnostics, and safety forecasting, its full adoption will require interdisciplinary collaboration among experimentalists, data scientists, and domain experts to ensure data quality, model interpretability, and real-world applicability. Nevertheless, AI-driven discovery frameworks are poised to accelerate progress in battery innovation, offering a pathway to address some of the most persistent challenges in energy storage science [97].

5.5. Cloud Platforms for Lithium-Ion Battery Health Prediction

The convergence of cloud computing and AI in BMS represents a pivotal advancement in the predictive modeling and real-time supervision of lithium-ion batteries. By offloading computational workloads to cloud platforms, BMS architectures can process extensive operational datasets, enabling more accurate state estimation, degradation tracking, and predictive maintenance. This cloud-enabled framework supports scalable and efficient management strategies across both electric vehicle fleets and stationary energy storage systems. A key innovation lies in the integration of data-driven models with physics-informed machine learning, which enhances the system’s ability to capture complex nonlinear battery behaviors and long-range interdependencies often missed by traditional methods. Such hybrid approaches offer improved diagnostic fidelity and responsiveness, though their effectiveness depends on the continued refinement of learning algorithms and seamless communication between edge devices and centralized cloud infrastructure. As this technology matures, AI-enhanced cloud platforms not only address longstanding limitations in safety and performance forecasting but also facilitate the transition of academic research into practical, real-world energy storage solutions [87].

5.6. Scalable Cloud Solutions for Lithium-Ion Battery Optimization

Scalable cloud-based architectures are playing an increasingly critical role in advancing lithium-ion battery optimization by integrating real-time monitoring with long-term performance analytics. A recent framework exemplifies this trend by combining automotive-grade hardware with efficient Controller Area Network (CAN) data decoding and visualization capabilities, enabling precise tracking of essential battery parameters such as cell voltage and temperature. This system not only facilitates the early identification of weak cells, thereby enhancing operational safety and performance, but also balances data resolution with computational efficiency, a critical trade-off in cloud-connected BMS implementations. Experimental validation confirms the framework’s robustness, revealing how variations in sampling rate and server location influence latency and resource utilization. These insights are vital for the deployment of digital twin models and real-time analytics in electric vehicle applications, where responsiveness and reliability are paramount. Moreover, the architecture’s capacity for long-term data retention supports second-life battery applications, offering a unified solution that bridges real-time diagnostics with retrospective performance evaluation essential for enabling predictive maintenance, extended lifecycle management, and future reuse strategies in electric mobility [98].

5.7. Low-Latency Edge Computing in Lithium-Ion BMS

Low-latency edge computing has emerged as a crucial enabler for enhancing the responsiveness and accuracy of lithium-ion BMS, particularly in electric vehicle applications where real-time state estimation is essential. A cloud-edge battery management system (CEBMS) exemplifies this integration by combining cloud-based deep learning for multi-source data mining with the localized execution of optimized predictive models at edge nodes. This distributed architecture enhances voltage and energy state estimation while preserving computational efficiency at the vehicle level. Experimental results validate the system’s effectiveness in improving model fidelity and state prediction accuracy relative to conventional onboard systems, highlighting its capacity to support scalable, fleet-wide battery optimization strategies. By bridging the computational strengths of the cloud with the real-time processing needs of the edge, this architecture represents a significant step toward intelligent, adaptive BMS frameworks capable of sustaining high-performance electric mobility through distributed, data-driven decision-making [99].

5.8. AI-Driven Cloud Infrastructure for Lithium-Ion Battery Optimization

Artificial intelligence continues to reshape the landscape of lithium-ion battery research and management by enabling advanced analytics for state estimation, predictive control, and materials discovery. Techniques such as neural networks, reinforcement learning, and federated learning have proven instrumental in enhancing SOC and SOH estimation, detecting early signs of failure, and optimizing charging strategies. These data-driven innovations also accelerate the design of next-generation battery materials through computational frameworks, though challenges persist in data standardization, model transparency, and seamless industrial integration. Nevertheless, emerging approaches like privacy-preserving federated learning and adaptive reinforcement learning offer promising pathways to address these limitations, enabling intelligent, decentralized, and secure optimization of battery systems. As AI frameworks mature, their potential to support renewable energy integration, intelligent grid operation, and sustainable electric mobility becomes increasingly evident, signaling a fundamental shift toward environmentally conscious and efficiency-driven energy storage infrastructures [100].
In parallel, cloud-based methodologies for battery model optimization have demonstrated strong potential for bridging the gap between controlled laboratory studies and real-world electric vehicle deployment. A notable example involves the use of a moving window least squares (MWLS) algorithm integrated within a cloud infrastructure to estimate ECM parameters using long-term operational data. Over an eight-month field trial, this system achieved voltage prediction errors comparable to laboratory benchmarks while also tracking subtle degradation trends through resistance monitoring. By enabling sophisticated offline parameter tuning without the need for high-performance onboard computing, this framework offers a scalable solution for real-time diagnostics and fleet-wide health monitoring. It underscores the practicality of cloud-enabled parameter estimation for enhancing predictive accuracy and operational reliability in modern battery-powered transportation systems [101]. Table 3 details recent advancements in cloud-based and AI-powered battery management systems, emphasizing digital twin implementations and edge-cloud connectivity.
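A minimal sketch of the moving-window least-squares idea is given below: within each window, terminal voltage is regressed on current using a zeroth-order equivalent-circuit model, V = OCV - R0*I, to recover the open-circuit voltage and ohmic resistance. The window length, overlap, and synthetic drive profile are assumptions, not the configuration of [101].

```python
# Sketch: moving-window least squares fit of a zeroth-order ECM, V = OCV - R0*I.
# Window length, overlap, and the synthetic drive profile are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
I = rng.uniform(-20, 40, n)                      # load current (A), discharge positive
OCV_true, R0_true = 3.70, 0.015                  # "unknown" parameters to recover
V = OCV_true - R0_true * I + 0.002 * rng.standard_normal(n)

window = 200
ocv_est, r0_est = [], []
for start in range(0, n - window, window // 2):  # 50% overlapping windows
    sl = slice(start, start + window)
    A = np.c_[np.ones(window), -I[sl]]           # regress V on [1, -I] -> [OCV, R0]
    theta, *_ = np.linalg.lstsq(A, V[sl], rcond=None)
    ocv_est.append(theta[0])
    r0_est.append(theta[1])

print("mean OCV estimate:", round(float(np.mean(ocv_est)), 3), "V")
print("mean R0 estimate :", round(float(np.mean(r0_est)) * 1000, 2), "mOhm")
```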

6. State of Charge Estimation with AI in Lithium-Ion Batteries

The estimation of SOC in lithium-ion batteries has been significantly enhanced through the integration of artificial intelligence with physics-based models (PBMs), offering a robust balance between accuracy and computational efficiency. A promising hybrid framework employs a sequential modeling strategy that combines a reduced-order PBM with an LSTM neural network. In this configuration, key internal battery variables are extracted using the reduced-order model and then passed to the LSTM alongside external measurement data, yielding superior estimation accuracy with reduced computational burden. An innovative feature selection technique, based on regression scoring, further improves efficiency by eliminating redundant inputs from the PBM. Experimental validation under dynamic driving profiles confirms that this hybrid approach not only outperforms full-order PBMs but also provides a scalable and practical solution for real-world battery management systems where rapid, accurate estimation is essential [102].
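The regression-scoring feature selection step can be sketched with scikit-learn's univariate F-test, as below: candidate internal variables produced by the reduced-order model are scored against measured SOC and only the strongest are retained for the LSTM input. The feature names and synthetic data are illustrative assumptions, not the variables used in [102].

```python
# Sketch: univariate regression-score screening of candidate physics-model outputs
# before sequence modeling. Feature meanings and data are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression

rng = np.random.default_rng(4)
n = 1000
soc = rng.uniform(0.1, 1.0, n)
pbm_features = np.c_[
    soc + 0.02 * rng.standard_normal(n),          # e.g., surface concentration proxy
    0.5 * soc + 0.1 * rng.standard_normal(n),     # e.g., overpotential proxy
    rng.standard_normal(n),                       # irrelevant candidate
    rng.standard_normal(n),                       # irrelevant candidate
]

selector = SelectKBest(score_func=f_regression, k=2).fit(pbm_features, soc)
print("regression scores:", np.round(selector.scores_, 1))
print("selected feature indices:", np.flatnonzero(selector.get_support()))
# The retained columns, together with current, voltage, and temperature
# measurements, would then form the input sequence of the LSTM estimator.
```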
Complementing these developments, AI-based methods such as feed-forward ANNs have shown strong potential for SOC estimation in EV applications. When trained on current and voltage inputs, ANN models achieve mean absolute errors ranging from 0.5% to 1.4% under a range of conditions, including variable storage temperatures and high-stress load profiles, outperforming traditional coulomb counting and support vector machine (SVM) methods. Although further validation with real-time hardware data is necessary to confirm implementation feasibility, the demonstrated performance of ANN models in simulated cloud environments underscores their relevance for next-generation BMS. These findings reflect the broader trend toward machine learning-driven battery diagnostics, where AI techniques are increasingly recognized as critical tools for managing the complexities of dynamic EV operation [103].

6.1. SOC Prediction Using Neural Networks for Lithium-Ion Batteries

Neural network-based approaches for lithium-ion battery SOC prediction have demonstrated substantial advancements in both accuracy and generalizability, particularly through the integration of physical knowledge into learning frameworks. A notable development in this domain is the use of a PINN architecture that combines dual-branch processing of current sensor data with forward-prediction of SOC under load profile variations. During training, the model incorporates battery dynamics equations, enabling it to learn constraints rooted in electrochemical behavior. This hybrid design not only enhances prediction fidelity across extended time horizons but also reduces model complexity, outperforming traditional data-driven neural networks and standalone physics-based estimators across multiple public datasets. The result is a compact and robust model capable of supporting real-time and predictive SOC estimation with improved accuracy and adaptability, key capabilities for next-generation power management strategies [104].
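A minimal PyTorch sketch of this physics-informed idea is shown below, assuming a coulomb-counting constraint as the embedded battery dynamics: the data loss is augmented with a residual that forces consecutive SOC predictions to follow the discretized relation dSOC = -I*dt/(3600*Q). The network size, loss weighting, and synthetic data are assumptions rather than the architecture of [104].

```python
# Sketch: SOC network trained with a data loss plus a coulomb-counting residual,
# so consecutive predictions respect dSOC = -I*dt / (3600*Q). Sizes, weighting,
# and data are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
Q_AH, DT = 2.0, 1.0                       # cell capacity (Ah), sample period (s)
n = 512
current = torch.rand(n, 1) * 4.0          # discharge current (A)
soc = 1.0 + torch.cumsum(-current * DT / (3600 * Q_AH), dim=0)   # ground truth
voltage = 3.0 + 1.2 * soc + 0.01 * torch.randn(n, 1)             # crude OCV proxy

net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1), nn.Sigmoid())
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.cat([voltage, current], dim=1)

for epoch in range(500):
    pred = net(x)
    data_loss = nn.functional.mse_loss(pred, soc)
    # Physics residual: predicted SOC increments vs coulomb-counting increments
    d_pred = pred[1:] - pred[:-1]
    d_phys = -current[1:] * DT / (3600 * Q_AH)
    physics_loss = nn.functional.mse_loss(d_pred, d_phys)
    loss = data_loss + 10.0 * physics_loss            # assumed weighting
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final combined loss:", float(loss))
```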
Further reinforcing the efficacy of deep learning, recent studies have shown that deep neural networks (DNNs) offer superior SOC estimation performance when compared to LSTM models, particularly under the influence of aging and temperature-dependent effects. A carefully configured DNN model optimized through flexible choices of activation functions, layer depth, and neuron distribution achieved mean SOC prediction errors below 0.5% and peak errors under 2.5%. By incorporating critical input features such as thermal profiles and cell aging indicators, this approach significantly extends the robust operating range of battery management systems. The findings underscore the importance of accounting for both thermal conditions and degradation dynamics in real-world electric vehicle environments, while also highlighting the potential for extending such frameworks to estimate SOH and RUL across different battery chemistries [105].

6.2. SOC Estimation with Support Vector Machines for Lithium-Ion Batteries

Support vector machines have emerged as powerful tools for lithium-ion battery SOC estimation, particularly when integrated with advanced filtering or optimization strategies to enhance accuracy under complex operational conditions. A hybrid algorithm combining SVM with the adaptive extended Kalman filter (AEKF) has demonstrated superior performance in SOC estimation by leveraging the dynamic noise adaptation of AEKF and the pattern recognition capabilities of SVM. Experimental validation under Hybrid Pulse Power Characterization (HPPC) and Dynamic Stress Test (DST) scenarios confirmed maximum estimation errors as low as 0.037% and 0.335%, respectively, representing significant improvements over standalone AEKF implementations. This fusion approach proves especially effective under the variable load conditions of electric vehicle applications, offering a robust and accurate solution for real-time SOC tracking and improving the overall reliability of battery management systems [106].
Further extending the application of machine learning, the genetic algorithm-optimized support vector regression (GASVR) model demonstrates exceptional robustness across a wide range of thermal and dynamic conditions. By emulating evolutionary processes such as selection, crossover, and mutation, the genetic algorithm systematically tunes the SVR hyperparameters, allowing the kernel mapping into Hilbert space to capture the inherent nonlinearities in SOC prediction. The resulting model outperforms traditional regression techniques, including XGBoost, random forest, and standard SVR, particularly under extreme temperature ranges from −20 °C to 25 °C. This thermal resilience and generalization capacity establish GASVR as a valuable framework for integrated battery state estimation, with potential for future expansion to SOH and RUL prediction. Its demonstrated accuracy and adaptability position it as a critical component in the development of intelligent, climate-resilient battery management systems for next-generation electric vehicles [107].
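A toy version of the GASVR loop is sketched below: a small population of (C, gamma, epsilon) candidates is evolved by selection, uniform crossover, and Gaussian mutation, with five-fold cross-validated R2 as the fitness. Population size, mutation scale, and the synthetic temperature-dependent data are illustrative assumptions.

```python
# Toy genetic-algorithm tuning of SVR hyperparameters (C, gamma, epsilon) for SOC
# regression. Population size, mutation scale, and the data are assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 600
temp = rng.uniform(-20, 25, n)                       # ambient temperature (degC)
voltage = rng.uniform(3.0, 4.2, n)
soc = (voltage - 3.0) / 1.2 + 0.002 * temp + 0.01 * rng.standard_normal(n)
X = np.c_[voltage, temp]

def fitness(ind):
    C, gamma, eps = np.exp(ind)                      # genes live in log-space
    model = SVR(C=C, gamma=gamma, epsilon=eps)
    return cross_val_score(model, X, soc, cv=5, scoring="r2").mean()

pop = rng.uniform([-2, -6, -7], [5, 1, -2], size=(10, 3))    # initial population
for gen in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-5:]]                   # selection
    children = []
    for _ in range(5):
        a, b = parents[rng.integers(5)], parents[rng.integers(5)]
        child = np.where(rng.random(3) < 0.5, a, b)          # uniform crossover
        child = child + 0.3 * rng.standard_normal(3)         # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best (C, gamma, epsilon):", np.round(np.exp(best), 4))
```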

6.3. SOC Prediction Using Fuzzy Logic for Lithium-Ion Batteries

Fuzzy logic-based models have shown considerable promise in addressing the challenges associated with lithium-ion battery SOC estimation, particularly under conditions of uncertainty, noise, and nonlinearity. A notable advancement in this area is the development of the Type-2 Fuzzy Cerebellar Model Neural Network (Type-2 FCMNN), which incorporates type-2 fuzzy inference into the cerebellar model neural network framework to enhance prediction robustness and accuracy. By effectively accounting for system uncertainties and measurement disturbances, this architecture significantly improves estimation performance, reducing the mean absolute error and root mean square error by 43.1% and 36.0%, respectively, relative to conventional FCMNNs. Simulation results in MATLAB/Simulink environments confirm the model’s capability to manage the complex nonlinear dynamics of battery behavior, establishing a new standard for SOC estimation reliability under realistic noise and environmental variability [108].
Complementing this approach, the integration of an extended Kalman filter (EKF) with a Takagi–Sugeno fuzzy model (TSFM) provides a powerful framework for iterative modeling of battery dynamics. Validated under the rigorous Worldwide Harmonized Light Vehicle Test Procedure (WLTP), this hybrid model achieved impressive accuracy, with mean absolute errors below 70 mV and RMSE as low as 21 mV. The system’s adaptive nature enables it to update parameters in real time, thereby accommodating battery aging, temperature fluctuations, and load variability, all critical factors in electric vehicle operation. By refining an initially uncalibrated model through iterative feedback, this approach effectively bridges the gap between theoretical modeling and practical deployment. It also establishes a flexible foundation for advanced state estimation and energy management strategies applicable to a wide range of battery chemistries and configurations, including both full electric and hybrid electric vehicles [109].

6.4. SOC Estimation with Ensemble Learning for Lithium-Ion Batteries

Recent advancements in hybrid modeling techniques have significantly improved the accuracy and robustness of SOC estimation in lithium-ion batteries, particularly under challenging operating conditions. A notable contribution is the fusion of a CNN-Transformer architecture with the square root unscented Kalman filter (SRUKF), which demonstrates stable SOC estimation performance in low-temperature environments ranging from −20 °C to 0 °C. This approach combines the deep learning model’s strong pattern recognition capabilities with SRUKF’s noise-reduction and uncertainty-handling features, effectively mitigating the shortcomings of Coulomb counting, parameter-sensitive model-based estimators, and noise-prone purely data-driven models. Ensemble learning further enhances the system’s adaptability to temperature fluctuations, while empirical testing confirms its resilience to initialization errors as high as 40% SOC deviation. Compared to LSTM and GRU networks, the CNN-Transformer model achieves superior estimation accuracy, and its integration with SRUKF yields performance improvements of over 30%, highlighting its suitability for real-world electric vehicle applications. Additionally, the system’s compatibility with cloud-based deployment and potential for integration with SOH estimators positions it as a versatile platform for next-generation BMS [110].
In parallel, Gaussian process regression (GPR) has emerged as a highly accurate machine learning method for real-time SOC estimation, delivering root mean squared errors as low as 0.8% and mean squared error values of 0.6 when applied to comprehensive field datasets encompassing variable driving conditions, temperatures, and usage profiles. GPR’s non-parametric structure allows it to model complex nonlinear relationships among electrical and environmental variables, such as voltage, current, temperature, and humidity, with exceptional precision. This method consistently outperforms alternative machine learning techniques, including neural networks, support vector machines, and ensemble learners, while maintaining a high degree of generalization across different use cases. Although its deployment in real-time applications requires attention to computational overhead, GPR offers a powerful and practical framework for enhancing SOC estimation and safety monitoring in modern electric vehicle battery management systems [111].
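The sketch below shows a minimal Gaussian process regression SOC estimator in scikit-learn, using an anisotropic RBF kernel over voltage, current, and temperature and returning predictive uncertainty alongside the mean; the kernel settings and synthetic data are assumptions, not the field dataset of [111].

```python
# Sketch: Gaussian process regression SOC estimator with an anisotropic RBF kernel
# over voltage, current, and temperature. Kernel settings and data are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
n = 400
V = rng.uniform(3.0, 4.2, n)
I = rng.uniform(-10, 30, n)
T = rng.uniform(0, 45, n)
soc = (V - 3.0) / 1.2 - 0.001 * I + 0.01 * rng.standard_normal(n)
X = np.c_[V, I, T]

kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=1e-3)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:300], soc[:300])

mean, std = gpr.predict(X[300:], return_std=True)
rmse = np.sqrt(np.mean((mean - soc[300:]) ** 2))
print(f"test RMSE: {rmse:.4f}, mean predictive std: {std.mean():.4f}")
```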

6.5. Advanced Machine Learning for SOC Prediction in Lithium-Ion Batteries

Feedforward neural networks (FNN) have demonstrated considerable promise for lithium-ion battery SOC estimation, offering a reliable solution for both real-time monitoring and long-term degradation analysis. Leveraging the NASA PCoE dataset, the proposed FNN-based method achieves high predictive accuracy while minimizing RMSE, outperforming several traditional machine learning models in comparative evaluations. Importantly, the approach avoids typical pitfalls such as overfitting and underfitting, ensuring consistent performance across varying operational conditions. Its implementation in MATLAB further underscores the algorithm’s practicality, as demonstrated by the close alignment between predicted and actual SOC curves. In addition to accurate SOC estimation, the methodology provides valuable insights into degradation trends, thereby supporting more informed end-of-life prediction and contributing to the advancement of diagnostic tools for ultra-long-life lithium-ion batteries [112].
Extending beyond static prediction models, advanced machine learning algorithms have shown strong capabilities in forecasting SOC under dynamic and highly variable driving conditions typical of urban electric vehicle usage. Unlike classical empirical models, modern ML frameworks are adept at capturing the nonlinear, time-dependent interactions between current, voltage, and operational history. Their ability to incorporate temporal dependencies enables accurate SOC estimation during complex charge/discharge cycles encountered in real-world environments. Validated through multisine signal testing designed to emulate real-time driving loads, these approaches offer significant improvements over static parameterization techniques. As a result, they provide enhanced adaptability and forecasting precision for intelligent battery management systems, highlighting the growing role of ML in elevating energy management strategies across electric mobility platforms [113].

6.6. SOC Estimation Using LSTM and CNNs in Lithium-Ion Batteries

Hybrid deep learning architectures combining convolutional and recurrent neural networks have shown exceptional capability in improving lithium-ion battery SOC and RUL estimation, particularly under nonlinear and time-varying operational conditions. A notable contribution is the attention-based CNN-LSTM (A-CNN-LSTM) model, which integrates convolutional neural networks for feature extraction, LSTM units for temporal sequence modeling, and attention mechanisms to selectively emphasize critical data features. This architecture demonstrates marked improvements in SOC prediction, reducing RMSE, MAE, and mean absolute percentage error (MAPE) by 32.75%, 45.88%, and 53.36%, respectively, when compared to standalone CNN, LSTM, GRU, and earlier hybrid models. Its validation on real-world containerized energy storage systems highlights its applicability to practical battery management scenarios, where it effectively manages the complex electrochemical dynamics and internal variability of lithium-ion cells [114].
In parallel, the CNN-LSTM-ASAN framework offers a high-precision solution for RUL estimation by addressing key challenges such as capacity regeneration and signal noise interference. This architecture incorporates ICEEMDAN for denoising and Pearson correlation coefficient (PCC)-based component reconstruction to preprocess raw battery signals, followed by spatial-temporal feature extraction via CNN-LSTM fusion. An adaptive sparse attention mechanism further enhances feature weighting and model efficiency, resulting in RMSE values below 1.5% across a range of public and proprietary datasets. The model exhibits superior robustness, faster convergence, and strong generalization across battery chemistries, providing a reliable foundation for state monitoring and prognostic tasks. Its modular design facilitates future adaptation to additional battery health indicators, making it an extensible platform for intelligent battery management systems in real-world applications [115].

6.7. Deep Learning for SOC Prediction in Lithium-Ion Batteries

Recent studies underscore the potential of deep learning models in achieving high-precision SOC estimation while maintaining computational feasibility for real-world deployment in embedded BMS. A tri-layered, wide feedforward neural network (FFNN) has been shown to offer an optimal trade-off between accuracy and efficiency, outperforming both classical methods and more complex architectures such as gated recurrent unit recurrent neural networks (GRU-RNNs). Trained on a comprehensive dataset comprising over 835,000 records, the FFNN achieved a maximum estimation error below 1% and a mean squared error (MSE) of 1 × 10⁻⁸, exceeding the performance of Kalman filters and other neural network configurations with significantly lower computational burden. The study also challenges prevailing assumptions about the depth of deep learning models, showing that network width, rather than depth, yields superior performance for SOC estimation, thereby offering a scalable and practical solution for resource-constrained BMS applications [116].
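The width-versus-depth comparison can be mimicked with a brief scikit-learn sketch, below, that fits a wide three-hidden-layer MLP and a narrow eight-layer MLP on the same synthetic SOC data; layer sizes, training settings, and data are illustrative assumptions rather than the configuration of [116].

```python
# Sketch: wide-and-shallow vs narrow-and-deep feedforward SOC estimators.
# Layer sizes, data, and training settings are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 5000
V = rng.uniform(3.0, 4.2, n)
I = rng.uniform(-20, 40, n)
T = rng.uniform(-10, 45, n)
soc = np.clip((V - 3.0) / 1.2 - 0.0005 * I + 0.005 * rng.standard_normal(n), 0, 1)
X = np.c_[V, I, T]
X_tr, X_te, y_tr, y_te = train_test_split(X, soc, test_size=0.2, random_state=0)

wide = MLPRegressor(hidden_layer_sizes=(256, 256, 256), max_iter=500, random_state=0)
deep = MLPRegressor(hidden_layer_sizes=(32,) * 8, max_iter=500, random_state=0)
for name, net in [("wide (3 x 256)", wide), ("deep (8 x 32)", deep)]:
    net.fit(X_tr, y_tr)
    mse = np.mean((net.predict(X_te) - y_te) ** 2)
    print(f"{name:14s} test MSE: {mse:.2e}")
```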
In parallel, interpretable deep learning frameworks are gaining traction in battery state prediction tasks, particularly for RUL estimation. A recent approach integrates feature importance analysis with network pruning to identify and retain only the most critical aging indicators, resulting in a 0.19% increase in prediction accuracy, a 46.88% improvement in computational speed, and reduced processing overhead. This methodology preserves transparency through interpretability algorithms, ensuring that model decisions can be traced back to physical degradation patterns. The dual optimization of input data selection and neural network structure not only improves real-time performance but also provides actionable insights into battery aging mechanisms. Such interpretable AI models represent a promising path forward for intelligent battery diagnostics, enabling precise and explainable prognostics in compact BMS environments without compromising reliability [117].

7. State of Health Prediction Using AI for Lithium-Ion Batteries

Artificial intelligence-driven methods have emerged as powerful tools for predicting the SOH of lithium-ion batteries, particularly by exploiting temporal degradation patterns and data-rich operational profiles. One notable advancement is the Deep Cycle Attention Network (DCAN), which utilizes attention mechanisms to extract critical health indicators from streaming sensor data. By evaluating cycle-level similarities and dynamically weighting important time-series features, DCAN achieves superior SOH prediction accuracy and robustness across a range of datasets, including the NASA PCoE, MIT-Stanford, and Oxford driving cycle sets. Its attention-based architecture enables effective generalization under diverse conditions, outperforming conventional techniques such as support vector regression (SVR) and sample entropy (SampEN) by capturing nuanced degradation signatures. These capabilities make DCAN a highly adaptable and reliable framework for real-time battery health monitoring in electric vehicles and energy storage systems [118].
Complementing these innovations, recent reviews highlight how deep learning models have transformed SOH and RUL prediction by surpassing traditional empirical and model-based approaches in capturing the nonlinear dynamics of battery degradation. Sophisticated architectures such as deep neural networks, convolutional-recurrent hybrids, and temporal feature-based learning models now allow for precise failure forecasting and optimized battery replacement planning across varied use cases. Emphasis on temporal dynamics, multivariate input integration, and robust feature engineering has proven critical for achieving high predictive performance. While challenges remain in interpretability and deployment, the growing body of evidence underscores deep learning as a transformative approach to SOH assessment and signals clear directions for future research focused on explainability and scalable real-world implementation [119].

7.1. SOH Estimation Using Machine Learning for Lithium-Ion Batteries

Machine learning-based SOH estimation for lithium-ion batteries has gained significant momentum due to its ability to capture degradation patterns using practical, data-efficient methods. A recent domain knowledge-guided framework demonstrates how carefully engineered health indicators derived from real-world electric vehicle usage can enable highly accurate capacity estimation with maximum absolute percentage errors ranging from 1.5% to 2.5%. This approach introduces five novel features, including power autocorrelation and energy-based metrics, which effectively characterize energy and power fade while being resilient to incomplete datasets. The framework’s computational simplicity and physical interpretability make it well-suited for real-time SOH monitoring under normal charging and driving cycles. Validation using vehicle-aged cell data confirms its viability for embedded health assessment systems in electric vehicles, establishing a balanced solution between accuracy and operational efficiency [120].
Complementary advancements in recurrent neural architecture further extend the capabilities of machine learning for SOH prediction. A GRU model, optimized with the whale optimization algorithm, achieves average estimation errors below 1% by focusing on minimal but informative voltage interval features. Using gray relational analysis and Spearman’s correlation to identify two key indicators from limited charging data, the approach eliminates the need for extensive preprocessing or large-scale datasets. This innovation ensures robustness across different battery chemistries and operational profiles, supporting generalization under sparse training data scenarios. The combined use of feature-efficient extraction and optimized recurrent modeling establishes a scalable and practical method for battery health diagnostics in real-world applications, especially where aging data are limited or inconsistently recorded [121].

7.2. SOH Prediction with Deep Learning Models for Lithium-Ion Batteries

Deep learning models have increasingly proven effective in accurately predicting lithium-ion battery SOH and RUL, offering enhanced precision and adaptability across a wide range of operational conditions. One of the most advanced frameworks in this domain is the AT-CNN-BiLSTM architecture, which integrates CNNs for hierarchical feature extraction, bidirectional long short-term memory (BiLSTM) networks for capturing forward and backward temporal dependencies, and attention mechanisms for dynamically prioritizing critical parameters. This hybrid model achieves outstanding performance, reporting R2 values exceeding 0.9910 and mean absolute percentage errors below 0.9003%, with perfect absolute error scores observed in specific RUL estimations. Validated on diverse NASA battery datasets, the architecture demonstrates exceptional generalization when processing multidimensional operational data such as voltage, current, and temperature, significantly outperforming conventional LSTM, BiLSTM, and CNN-LSTM approaches. Future enhancements, such as incorporating bidirectional gated recurrent units (BiGRUs), may further improve computational efficiency without sacrificing accuracy, solidifying this model as a leading-edge solution for advanced battery diagnostics [122].
Complementing these deep learning strategies, GPR with an automatic relevance determination kernel offers a probabilistic approach to SOH prediction that excels in managing nonlinear degradation trends and sparse datasets. Achieving a mean absolute error of just 1.33% across three different lithium-ion cell types, the model effectively handles random load patterns and variable operating conditions, demonstrating low uncertainty in predictions even on unseen test data. The unified framework’s adaptability across cell chemistries without requiring retraining for each configuration makes it particularly suitable for real-world deployment scenarios where data may be inconsistent or limited. Although further refinement is needed to extend applicability under extreme conditions, the method’s combination of robustness, computational efficiency, and interpretability underscores its value for scalable, real-time battery health monitoring systems [123].

7.3. SOH Estimation with Neural Networks for Lithium-Ion Batteries

Deep learning has emerged as a dominant approach for SOH estimation in lithium-ion batteries, surpassing traditional machine learning techniques in capturing the complex nonlinearities associated with battery degradation under real-world operating conditions. Recent analyses of deep learning architectures highlight their ability to model intricate dependencies across diverse usage patterns, temperature variations, and inherent cell heterogeneity, factors that significantly impact the accuracy and reliability of SOH prediction. These models offer robust generalization and improved adaptability, enabling more effective integration into battery management systems where accurate health monitoring is critical for ensuring safety, performance, and longevity. As electric vehicle adoption accelerates and energy storage demands increase, future developments are expected to emphasize larger-scale neural networks, advanced architecture innovations, and deployment-efficient frameworks to enhance both predictive fidelity and computational performance. This trajectory positions deep learning as a foundational technology for next-generation battery diagnostics, combining high estimation accuracy with practical feasibility for scalable, real-time applications [124].

7.4. SOH Prediction Using Random Forests for Lithium-Ion Batteries

Random Forest Regression (RFR) has emerged as a promising data-driven method for estimating the SOH of lithium-ion batteries, offering a balance between high predictive accuracy and computational efficiency. By utilizing key operational parameters such as voltage, current, temperature, and charge/discharge rates, RFR effectively models the complex nonlinear degradation behavior inherent in real-world electric vehicle applications. The approach consistently achieves mean absolute errors below 2%, outperforming conventional model-based and experimentally intensive methods, and demonstrates robustness across varying battery chemistries and usage profiles. In addition to its predictive capabilities, RFR facilitates the identification of critical features influencing SOH, providing actionable insights for BMS design and optimization. This interpretability, combined with its adaptability, lays the groundwork for future advancements in hybrid physics-informed AI models, transfer learning across battery types, and seamless integration with RUL prediction and smart charging strategies. As a result, Random Forest-based models contribute to enhancing the reliability, longevity, and sustainability of battery systems in modern electric vehicle fleets [125].
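A compact sketch of Random Forest SOH regression with feature-importance inspection is given below; the aging features (cycle count, mean temperature, mean C-rate, depth of discharge) and the synthetic degradation data are assumptions chosen for illustration.

```python
# Sketch: Random Forest SOH regression with feature-importance inspection.
# Feature names and synthetic aging data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(8)
n = 800
cycles = rng.uniform(0, 1500, n)
mean_temp = rng.uniform(15, 45, n)
mean_crate = rng.uniform(0.2, 2.0, n)
dod = rng.uniform(0.2, 1.0, n)                       # depth of discharge
soh = (1.0 - 2e-4 * cycles
       - 2e-4 * (mean_temp - 25).clip(min=0)
       - 0.02 * mean_crate * dod
       + 0.01 * rng.standard_normal(n))

X = np.c_[cycles, mean_temp, mean_crate, dod]
names = ["cycles", "mean_temp", "mean_C_rate", "DoD"]

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[:600], soh[:600])
mae = np.mean(np.abs(rf.predict(X[600:]) - soh[600:]))
print("test MAE:", round(float(mae), 4))
for name, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:12s} importance: {imp:.3f}")
```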

7.5. AI-Enhanced SOH Estimation Using Cloud Platforms for Lithium-Ion Batteries

Cloud-enabled artificial intelligence platforms are increasingly being leveraged to enhance SOH estimation for lithium-ion batteries, particularly in scenarios where direct data acquisition is limited or cost-prohibitive. A notable advancement in this area is the application of generative learning, which addresses data scarcity challenges by synthetically generating representative datasets for SOH prediction. Validated on a large-scale dataset comprising 2700 retired lithium-ion batteries with diverse cathode chemistries, form factors, and operational histories, the generative learning framework achieves mean absolute percentage errors below 6% even for previously unseen state-of-charge levels. This synthetic data-driven approach circumvents the need for exhaustive physical testing, enabling scalable and accurate SOH assessments across heterogeneous battery inventories. Beyond its technical merits, the method offers substantial environmental and economic advantages, with projected global savings of USD 4.9 billion in electricity costs and 35.8 billion kg of CO2 emissions by 2030 through reduced energy consumption in data collection. As a result, generative learning represents a transformative solution for integrating AI-enhanced SOH estimation into cloud-based battery reuse and recycling infrastructures, promoting sustainable lifecycle management in the context of rapidly growing volumes of retired battery systems [126].

7.6. SOH Monitoring Using Edge AI Systems for Lithium-Ion Batteries

Edge AI systems are emerging as critical enablers for real-time, energy-efficient SOH monitoring in lithium-ion batteries, particularly in electric vehicle applications where responsiveness and resource constraints must be balanced. A recent end-cloud collaborative framework exemplifies this approach by integrating edge-side empirical models with cloud-based deep learning to achieve SOH estimation with root mean square errors as low as 1%. The system leverages a bidirectional long short-term memory (Bi-LSTM) network with spatial-temporal attention in the cloud to extract key degradation signatures from incremental capacity and differential thermal data, while the edge component employs a lightweight exponential model fused with an Extended Kalman Filter for local, low-latency estimation. This dual-layer architecture supports flexible update rates and minimizes bandwidth demands through selective data synchronization, allowing for scalable deployment under varied hardware and network constraints. Moreover, Bayesian hyperparameter optimization ensures efficient computational performance across both tiers. Although current implementations focus on individual cells, the framework demonstrates high adaptability and scalability, with clear potential for extension to pack-level diagnostics, provided future enhancements address inter-cell variability. As such, this collaborative edge-cloud paradigm marks a significant step toward comprehensive and intelligent battery lifecycle management solutions [127].
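The edge-side portion of such a scheme can be sketched as a two-state extended Kalman filter built on an exponential fade model, as below, where the state holds the SOH and its per-cycle fade rate and the measurements stand in for periodically refreshed SOH values; all noise covariances and data are illustrative assumptions, not the design of [127].

```python
# Sketch: two-state EKF with an exponential fade model, x = [soh, lam],
# soh_k = soh_{k-1} * exp(-lam). Noise covariances and data are assumptions.
import numpy as np

rng = np.random.default_rng(9)
true_lam, cycles = 2e-4, 800
true_soh = np.exp(-true_lam * np.arange(cycles))
z = true_soh + 0.01 * rng.standard_normal(cycles)      # noisy SOH "measurements"

x = np.array([1.0, 1e-4])                              # initial [soh, lam] guess
P = np.diag([1e-4, 1e-6])
Q = np.diag([1e-8, 1e-10])                             # process noise
R = 1e-4                                               # measurement noise
H = np.array([[1.0, 0.0]])

for k in range(cycles):
    soh, lam = x
    x = np.array([soh * np.exp(-lam), lam])            # predict: exponential fade
    F = np.array([[np.exp(-lam), -soh * np.exp(-lam)],
                  [0.0, 1.0]])                          # Jacobian of the process model
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                                 # innovation covariance
    K = (P @ H.T) / S                                   # Kalman gain
    x = x + (K * (z[k] - x[0])).ravel()                 # correct with SOH measurement
    P = (np.eye(2) - K @ H) @ P

print("estimated per-cycle fade rate:", float(x[1]), "(true:", true_lam, ")")
print("estimated SOH at the last cycle:", round(float(x[0]), 4))
```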

7.7. Lithium-Ion Battery Aging Dynamics with AI

AI has proven instrumental in elucidating the aging dynamics of lithium-ion batteries by enabling accurate SOH estimation under complex, real-world operating conditions. A notable advancement in this domain is the development of a PINN framework that combines empirical degradation models with deep learning to achieve exceptional accuracy (MAPE = 0.87%) across diverse battery chemistries and operational profiles. Validated on over 310,000 samples from 387 batteries, the approach employs a novel feature extraction technique targeting constant-current/constant-voltage charging segments, thereby minimizing information leakage and circumventing dataset-specific feature engineering. The integration of physical insights not only enhances interpretability but also enables strong generalization in small-sample and transfer learning settings. This establishes PINNs as a robust methodology that bridges electrochemical modeling and data-driven learning for scalable, interpretable battery health diagnostics [9].
In parallel, accelerated aging effects under high C-rate cycling conditions have been effectively modeled using deep learning architectures. A comparative study revealed that LSTM-RNN networks outperform traditional FNNs in predicting cycle life and SOH at elevated discharge rates, particularly above 1.3 C, where capacity fade is significantly intensified. This model captures the nonlinear degradation behavior associated with fast cycling and supports more precise SOH tracking and lifetime estimation. Moreover, its predictive reliability under accelerated conditions offers practical utility for electric vehicle applications, informing strategies that mitigate degradation while supporting second-life viability for moderately aged cells. Together, these AI-enhanced modeling frameworks offer essential insights for optimizing battery usage and extending operational life in both primary and secondary energy storage contexts [128].

7.8. Predicting SOH Degradation in Lithium-Ion Batteries with Machine Learning

Classical and hybrid machine learning approaches continue to play a pivotal role in enhancing SOH prediction for lithium-ion batteries by offering interpretable and computationally efficient solutions for battery degradation modeling. Among classical methods, SVR and multiple linear regression (MLR) have demonstrated reliable performance, particularly when applied to features derived from partial charging time (PCT) intervals. Notably, charging durations between 3.8 V and 4.0 V serve as robust indicators of capacity fade, revealing strong linear correlations with aging under consistent charge protocols. Optimal model performance is achieved using a minimal feature set, often just one or two carefully selected voltage-based features, while broader intervals below 3.7 V degrade predictive accuracy due to diminished feature variance. SVR consistently outperforms alternatives, including random forest and polynomial regression, across various data scales, highlighting its suitability for lightweight SOH estimation systems in real-world battery management applications [129].
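The partial-charging-time idea can be illustrated with a short sketch: the dwell time between 3.8 V and 4.0 V is extracted from simulated constant-current charges and used as a single SVR feature for SOH. The linearized charging curves and SVR settings are assumptions made for brevity, not the protocol of [129].

```python
# Sketch: SOH regression from a single partial-charging-time (3.8-4.0 V) feature.
# The linearized charging curves and SVR settings are illustrative assumptions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(10)

def charge_time_3p8_to_4p0(capacity_ah, current_a=1.0):
    """Simulate a constant-current charge and return the 3.8-4.0 V dwell time (s)."""
    soc = np.linspace(0, 1, 2000)
    voltage = 3.4 + 0.8 * soc                        # crude linear charging curve
    t = soc * capacity_ah * 3600 / current_a         # time axis for this capacity
    mask = (voltage >= 3.8) & (voltage <= 4.0)
    return t[mask][-1] - t[mask][0]

capacities = rng.uniform(1.4, 2.0, 200)              # aged cells, nominal 2.0 Ah
soh = capacities / 2.0
pct = np.array([charge_time_3p8_to_4p0(c) for c in capacities])
pct = pct + 20.0 * rng.standard_normal(200)          # measurement noise

X = pct.reshape(-1, 1)
model = SVR(C=10.0, epsilon=0.005).fit(X[:150], soh[:150])
mae = np.mean(np.abs(model.predict(X[150:]) - soh[150:]))
print("SOH MAE from the partial-charging-time feature:", round(float(mae), 4))
```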
Building on these foundations, the integration of scientific machine learning (SciML) with physical domain knowledge has led to advanced hybrid frameworks that significantly improve the accuracy and interpretability of battery degradation forecasting. A notable implementation combines neural ordinary differential equations (NeuralODE) and universal differential equations (UDE) to capture both short-term health indicators and long-term degradation trends, achieving superior predictive performance with mean squared errors as low as 2.49. By aligning neural network predictions with electrochemical principles, this approach reduces data dependency while preserving physical realism, making it well-suited for real-world electric vehicle deployments. The scalability and scientific rigor of SciML methodologies support their application in sustainable energy management systems, contributing to battery life extension, improved performance optimization, and broader climate mitigation strategies aligned with global sustainability goals [130].

7.9. AI-Assisted SOH Modeling for Lithium-Ion Batteries

Artificial intelligence continues to drive innovation in lithium-ion battery SOH modeling, with recurrent neural network (RNN)-based architectures demonstrating significant advances in both precision and applicability. A recently developed RNN encoder-decoder model achieves exceptional accuracy, with root mean square errors of 0.669% and 0.338% under single and multiple pulse test conditions, respectively. This approach distinguishes itself by leveraging features derived from battery models instead of raw experimental data, enhancing robustness and reliability over traditional algorithms such as XGBoost. Validated on a comprehensive five-year aging dataset of commercial NMC batteries, the model offers practical utility in real-time SOH estimation across a wide state-of-charge range using short-duration pulse response data, thereby supporting safer and more efficient battery management strategies in operational settings [131].
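The pulse-response features underpinning such estimators can be obtained from a simple equivalent-circuit reading of a current step: the instantaneous voltage drop at pulse onset gives the ohmic resistance, and the additional drop accumulated by the end of the pulse gives the polarization resistance. The sketch below demonstrates this extraction on a synthetic pulse; the pulse shape, timing offsets, and parameter values are illustrative and are not taken from Ref. [131].

```python
import numpy as np

def pulse_features(t, v, i, t_pulse_start, t_pulse_end):
    """Extract simple equivalent-circuit features from a discharge current pulse.
    R_ohmic: drop just after the current step; R_total: drop at pulse end;
    R_polarization: their difference (illustrative definitions)."""
    i_pulse = np.interp(t_pulse_start + 0.01, t, i)        # pulse current magnitude
    v_before = np.interp(t_pulse_start - 0.01, t, v)
    v_onset = np.interp(t_pulse_start + 0.1, t, v)
    v_end = np.interp(t_pulse_end, t, v)
    r_ohmic = (v_before - v_onset) / abs(i_pulse)
    r_total = (v_before - v_end) / abs(i_pulse)
    return {"R_ohmic": r_ohmic, "R_polarization": r_total - r_ohmic}

# Synthetic 10 s, 1C discharge pulse on a 2 Ah cell (R0 = 30 mOhm, R1 = 20 mOhm, tau = 3 s)
t = np.linspace(0, 20, 2001)
i = np.where((t >= 5) & (t <= 15), -2.0, 0.0)
v = 3.7 + i * 0.030 + i * 0.020 * (1 - np.exp(-np.clip(t - 5, 0, None) / 3.0))
print(pulse_features(t, v, i, t_pulse_start=5.0, t_pulse_end=15.0))
```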
Complementing this, broader developments in neural network (NN) architectures have reinforced their role as transformative tools for battery health monitoring. Backpropagation networks, CNNs, and LSTM networks have all demonstrated superior capability in extracting meaningful health features from diverse battery parameter inputs. These models provide enhanced scalability, energy efficiency, and generalization across battery types and operational conditions, surpassing traditional methods in accuracy and operational resilience. Continued integration of field data, improved feature selection, and refined parameter optimization are expected to further strengthen the scientific rigor and stability of NN-driven SOH forecasting. Collectively, these advancements solidify neural networks as a foundational methodology for next-generation predictive maintenance and lifecycle management systems in lithium-ion battery applications [132].

8. Thermal Control in Lithium-Ion Batteries Using AI

Thermal control remains a pivotal component of LIB management, particularly in electric vehicles and stationary energy storage systems, where temperature uniformity directly impacts performance, safety, and lifespan. Recent innovations in thermal management strategies combine advanced materials such as PCMs with enhanced thermal conductivity and artificial intelligence-enabled control systems to address persistent challenges, including localized overheating, thermal gradients, and degradation acceleration. Among these, AI-driven BMS equipped with deep reinforcement learning models (e.g., MSCC-DRL) offer dynamic adaptability for real-time thermal optimization, supporting ultra-fast charging protocols and operation under extreme conditions without compromising energy density [133].
Artificial neural networks have further strengthened intelligent thermal management by enabling precise prediction of key battery states such as SOC, SOH, and RUL. A variety of ANN architectures, including feedforward, convolutional, and recurrent networks, have demonstrated the ability to model complex electrochemical-thermal interactions, outperforming conventional techniques in fault detection and safety-critical event anticipation. These learning-based methods enhance BMS reliability and responsiveness by bridging data-driven predictions with actionable control strategies. However, despite their promise, further research is necessary to improve model generalization, ensure robustness under diverse operating conditions, and integrate scalable, cost-effective hardware implementations for real-world adoption in electric mobility and grid applications [134].

8.1. Lithium-Ion Battery Thermal Management Systems Powered by AI

Artificial intelligence has become a critical enabler in the advancement of lithium-ion BTMS, offering efficient, scalable alternatives to computationally intensive physics-based simulations. By leveraging machine learning algorithms, modern BTMS frameworks can accurately predict temperature distribution, thermal gradients, and fire risk in real time, enabling more responsive and energy-efficient system control. ML-driven models incorporate a wide range of design parameters, including cell geometry, cooling medium characteristics, heat generation rates, and thermal boundary conditions, to optimize thermal performance across varying operational states. This data-driven approach not only enhances system-level thermal stability but also supports the integration of digital twin architectures, where continuous real-time monitoring informs predictive diagnostics and adaptive control strategies. Beyond temperature regulation, these AI-enhanced BTMS solutions form a foundational layer for comprehensive battery safety management by enabling early detection of abnormal thermal behavior and facilitating preventive countermeasures in both electric vehicle and stationary energy storage systems [135].
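A typical data-driven surrogate of this kind can be sketched as follows: a gradient-boosted regressor maps design and operating parameters (cell spacing, coolant flow rate, inlet temperature, and heat generation rate) to peak cell temperature. In the example the "simulation" data are generated from a toy analytical relation standing in for CFD or experimental results, so all numerical relationships are illustrative assumptions only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
n = 500
spacing = rng.uniform(2, 10, n)          # mm, cell-to-cell spacing
flow = rng.uniform(0.5, 5.0, n)          # L/min coolant flow rate
t_inlet = rng.uniform(20, 35, n)         # degC coolant inlet temperature
q_gen = rng.uniform(5, 40, n)            # W heat generation per cell

# Toy stand-in for CFD/experimental results: peak temperature rises with heat
# generation and inlet temperature, falls with spacing and flow (illustrative only)
t_peak = t_inlet + q_gen / (0.4 * flow + 0.05 * spacing) + rng.normal(0, 0.5, n)

X = np.column_stack([spacing, flow, t_inlet, q_gen])
X_tr, X_te, y_tr, y_te = train_test_split(X, t_peak, test_size=0.2, random_state=0)

surrogate = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
surrogate.fit(X_tr, y_tr)
print("MAE on held-out designs (degC):",
      round(mean_absolute_error(y_te, surrogate.predict(X_te)), 2))
```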

8.2. AI for Thermal Runaway Prevention in Lithium-Ion Batteries

Preventing thermal runaway in lithium-ion batteries necessitates a multifaceted strategy that integrates intrinsic material-level safeguards with intelligent external management systems. Recent advances in cell-level safety mechanisms, such as positive temperature coefficient (PTC) electrodes, thermoresponsive separator coatings, and thermally polymerizable additives, have shown promising capabilities in interrupting ion and electron transport when internal temperatures exceed critical thresholds. These innovations enable intrinsic circuit-breaking responses that reduce reliance on external controls while preserving electrochemical performance during normal operation. Despite their potential, challenges remain in fine-tuning response temperatures, minimizing trade-offs in cell efficiency, and scaling these technologies for commercial applications in electric vehicles and stationary storage systems [136]. Complementing these internal protections are external thermal mitigation strategies such as mist-based cooling systems, which offer efficient, low-energy heat dissipation, and AI-enhanced battery management systems capable of early detection through predictive internal temperature modeling. Hybrid liquid-solid electrolytes and self-healing polymers further enhance structural resilience, reducing vulnerability to thermal instability. Together, these developments establish a comprehensive and scalable framework for mitigating thermal runaway by combining real-time diagnostic capabilities with fundamental design innovations, ultimately enabling safer, high-performance lithium-ion battery systems aligned with the stringent demands of electric mobility and sustainable energy storage [137].

8.3. Thermal Modeling of Lithium-Ion Batteries with AI

Thermal modeling is critical for ensuring the safety and performance of lithium-ion batteries, particularly under high-power or aviation-specific conditions. Recent advancements demonstrate that nanofluid-based thermal management systems, such as those employing Al2O3-enhanced coolants, offer superior heat dissipation capabilities compared to traditional air cooling methods. These systems have been shown to effectively maintain prismatic battery modules within optimal operating temperatures across varied discharge rates, nanofluid concentrations, and inlet velocities, significantly reducing thermal-induced capacity degradation in high-density hybrid electric aircraft applications [138]. Complementing these empirical strategies, thermal-electrochemical models that incorporate temperature-dependent parameters such as ionic conductivity, salt diffusion coefficients, and thermodynamic factors enable precise prediction of discharge behavior across real-world operational temperature ranges. Validated against experimental data from pouch cells with LiCoO2 cathodes and mesocarbon microbead anodes, such models accurately simulate the thermal-electrochemical interactions during cycling, providing key insights into the impact of thermal gradients on lithium-ion cell performance [139]. Together, these approaches lay the groundwork for AI-enhanced thermal management systems capable of predicting and controlling temperature dynamics under diverse operating conditions, supporting safer and more efficient battery integration in next-generation electric mobility and aerospace platforms.
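As a hedged illustration of how temperature-dependent parameters couple into the energy balance, the sketch below uses an Arrhenius-type internal resistance inside a lumped thermal model, dT/dt = (I^2*R(T) - h*A*(T - T_amb)) / (m*c_p). All parameter values are illustrative and the model is far simpler than the validated pouch-cell model of Ref. [139].

```python
import numpy as np

def simulate_discharge(i_app=4.0, t_amb=298.15, dt=1.0, n_steps=1800):
    """Lumped thermal sketch: Arrhenius-type resistance plus a simple energy balance.
    Parameters are illustrative, not those of the validated model in Ref. [139]."""
    m, cp = 0.9, 1000.0                       # kg, J/(kg K): cell mass and specific heat
    h_a = 0.5                                 # W/K: effective convective conductance h*A
    r_ref, e_a, r_gas = 0.030, 2.0e4, 8.314   # Ohm at 298.15 K, J/mol, J/(mol K)
    temp = t_amb
    history = []
    for _ in range(n_steps):
        r_int = r_ref * np.exp(e_a / r_gas * (1.0 / temp - 1.0 / 298.15))
        q_gen = i_app**2 * r_int              # irreversible (Joule) heat only
        temp += dt * (q_gen - h_a * (temp - t_amb)) / (m * cp)
        history.append(temp)
    return np.array(history)

temps = simulate_discharge()
print(f"temperature rise after 30 min at 4 A: {temps[-1] - 298.15:.2f} K")
```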

8.4. Heat Management Systems for Lithium-Ion Batteries Using AI

The fusion of AI with advanced heat management systems has significantly enhanced thermal stability and safety in lithium-ion batteries, particularly for electric vehicle applications. Mahadevan and Vikraman [140] propose a hybrid architecture combining LSTM-controlled DC-DC converters with ultra-capacitors and a fractional-order PID (FOPID)-based thermal management system. This dual-stage framework achieves 98% operational accuracy and complete fault detection specificity while maintaining precise battery temperature control via adaptive heating, refrigeration, or radiator modules, thus extending battery life under dynamic load conditions. Complementing this, Lyu et al. [141] integrate AI optimization with a novel electro-thermal and boiling heat transfer model for immersion-cooled pouch cells. Their findings identify coolant density, viscosity, and specific heat capacity as the most influential parameters for thermal runaway mitigation. AI algorithms optimize boiling surface heat flux and coolant properties, validating immersion cooling as a viable TR suppression strategy and highlighting the role of AI in accelerating coolant selection and system design. Similarly, Ramasamy [142] employs Random Forest and Gradient Boosting algorithms to accurately predict thermal performance in cylindrical cell-based battery thermal management systems. Their results underscore the influence of battery spacing on heat regulation and demonstrate that AI can uncover complex thermal interactions, guiding the design of more efficient and adaptive BTMS architectures. Collectively, these works establish AI-enhanced thermal control as a transformative paradigm in next-generation battery management, enabling proactive thermal regulation, safer operation, and optimized system design.

8.5. AI-Driven Thermal Optimization in Lithium-Ion BMS

Advanced AI integration into lithium-ion battery thermal management systems offers substantial improvements in thermal uniformity, energy efficiency, and system responsiveness. Zhuang et al. [143] developed an intelligent BTMS incorporating hollow spoiler prisms, reciprocating airflow, and fuzzy model predictive control to dynamically optimize thermal performance in electric vehicle packs. Their system achieved a 76.4% reduction in energy consumption and significantly reduced temperature non-uniformity from 1.5 °C to 0.6 °C while maintaining cell temperatures within optimal operating ranges. The integration of AI-based control with structural flow optimization underscores the effectiveness of combining real-time control strategies with mechanical design enhancements. In a broader context, Briggt and Johansson [144] emphasized the transformative role of deep learning and machine learning techniques in BMS, enabling precise state-of-charge estimation, dynamic thermal management, and early fault detection. These AI-driven frameworks support adaptive charging protocols and real-time thermal mitigation, enhancing both performance and battery longevity. Collectively, these innovations reflect a paradigm shift toward smart, energy-efficient battery management systems that dynamically balance safety, performance, and durability through intelligent, data-driven optimization.
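The control logic of such fuzzy schemes can be illustrated with a very small rule base: maximum temperature and cell-to-cell spread are fuzzified with triangular membership functions, a handful of rules map them to a cooling-fan duty, and a weighted average defuzzifies the result. The breakpoints and rules below are illustrative assumptions and do not represent the controller of Ref. [143].

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fan_duty(t_max, t_spread):
    """Tiny Mamdani-style fuzzy controller for cooling-fan duty (0-1).
    Membership breakpoints and rule weights are illustrative, not those of Ref. [143]."""
    # Fuzzify inputs
    temp_ok = tri(t_max, 15, 25, 32)
    temp_warm = tri(t_max, 28, 34, 40)
    temp_hot = tri(t_max, 36, 45, 60)
    spread_low = tri(t_spread, 0, 0, 1.5)
    spread_high = tri(t_spread, 1.0, 3.0, 6.0)
    # Rules: antecedent strength -> recommended duty level
    rules = [
        (min(temp_ok, spread_low), 0.1),
        (max(temp_warm, spread_high), 0.5),
        (temp_hot, 1.0),
    ]
    num = sum(w * duty for w, duty in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den                          # weighted-average defuzzification

print("duty @ 30 degC, 0.4 K spread:", round(fan_duty(30, 0.4), 2))
print("duty @ 42 degC, 2.5 K spread:", round(fan_duty(42, 2.5), 2))
```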

8.6. Edge AI for Real-Time Thermal Runaway Prevention in Lithium-Ion Batteries

Enhancing real-time safety in lithium-ion battery systems requires innovations that combine advanced materials, intelligent edge computing, and effective thermal containment strategies. Song et al. [145] introduced a roll-to-roll manufactured safety reinforced layer (SRL) composed of engineered polythiophene and carbon additives, which autonomously interrupts current flow during internal short circuits, thereby significantly reducing the risk of thermal runaway. The scalable SRL fabrication process (5 km/day) enables practical commercial deployment, with impact testing on 3.4 Ah pouch cells showing a dramatic reduction in explosion rates from 63% to 10%. The SRL’s performance during nail penetration and impact scenarios validates its potential as a frontline safety mechanism in battery packs. Complementing this materials-level solution, Xu et al. [146] demonstrated that mini-channel cooling systems, though limited in halting thermal runaway within a single cell, play a vital role in preventing thermal propagation across adjacent cells. Their simulations revealed that factors such as coolant flow rate and abuse reaction kinetics determine the onset of failure, but mini-channel structures can effectively isolate and contain cascading effects. Together, these findings support the integration of edge AI-enabled safety systems with material and thermal design innovations to create resilient, real-time thermal runaway prevention strategies crucial for electric vehicle battery safety and lifecycle reliability.
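On the edge-computing side, even a very lightweight rule can provide a first line of defense: monitoring the temperature rise rate over a short rolling window and raising an alarm when it, or the absolute temperature, exceeds a limit. The sketch below shows such a monitor; the window length and thresholds are illustrative assumptions, not values from Refs. [145,146].

```python
from collections import deque

class EdgeThermalMonitor:
    """Lightweight edge-side check on cell temperature rise rate (illustrative thresholds)."""
    def __init__(self, window=10, rate_limit=2.0, temp_limit=60.0):
        self.samples = deque(maxlen=window)   # (time_s, temp_degC) pairs
        self.rate_limit = rate_limit          # degC/min considered abnormal
        self.temp_limit = temp_limit          # absolute cut-off, degC

    def update(self, t_s, temp_c):
        self.samples.append((t_s, temp_c))
        if temp_c >= self.temp_limit:
            return "ALARM: over-temperature"
        if len(self.samples) >= 2:
            (t0, c0), (t1, c1) = self.samples[0], self.samples[-1]
            rate = (c1 - c0) / max(t1 - t0, 1e-6) * 60.0   # degC/min
            if rate >= self.rate_limit:
                return "ALARM: abnormal heating rate"
        return "OK"

monitor = EdgeThermalMonitor()
for t, temp in [(0, 30.0), (10, 30.1), (20, 30.3), (30, 31.5)]:   # synthetic readings
    print(t, "s ->", monitor.update(t, temp))
```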

8.7. AI for Thermal Control in Large-Scale Lithium-Ion Battery Systems

Advanced thermal management strategies are essential for ensuring the safe and efficient operation of large-scale lithium-ion battery ESS, especially under high-capacity configurations and constrained spatial layouts. Ye and Arıcı [147] proposed an optimized forced-air cooling design for a seven-level cabinet ESS, demonstrating that specific geometric parameters such as a 3.0 mm airflow channel width, 10 mm inter-module spacing, and a 20 mm gap between the top module and fans can maintain maximum temperature rise below 4.63 K and ensure uniformity below 2.82 K. Their numerical model, validated with only 3.2% deviation from experimental results, confirms that precise airflow configuration enables thermal compliance (<313.15 K) while maximizing structural compactness for battery integration.
In parallel, Zhang et al. [148] introduced a liquid cooling framework employing streamlined-channel plates tailored for large-format pouch cells. Their study identifies mass flow rate, cooling trigger-time, and glycol concentration as critical control variables. While increased flow rates enhance cooling performance, they also exhibit diminishing thermal returns; conversely, strategic delays in cooling activation improve energy efficiency at the cost of minor uniformity trade-offs. Higher glycol content, while boosting cooling capacity, results in elevated maximum temperatures and higher pressure drops. Experimental validation confirms that the proposed liquid cooling design reliably sustains cell temperatures within the optimal operating window. Together, these findings highlight the role of AI-enhanced modeling and optimization in scaling thermal control strategies for large-format lithium-ion battery systems deployed in grid storage and high-demand mobility applications.

8.8. Machine Learning for Lithium-Ion Battery Temperature Regulation

The application of machine learning, particularly deep learning, has significantly advanced BTMS for electric vehicles by enabling real-time temperature prediction, spatial monitoring, and adaptive control. Qi et al. [149] highlighted the distinct strengths of various neural network architectures: CNNs excel in spatial thermal monitoring critical for preventing thermal runaway, while RNNs and Transformer models are more effective in capturing long-term temporal dynamics, despite higher computational costs. The integration of attention mechanisms and generative adversarial networks (GANs) further enhances performance under data-limited scenarios and improves model adaptability. However, challenges remain in reducing model data dependency, increasing computational efficiency, and improving interpretability. Future advancements are expected to center on developing lightweight, multi-physics informed hybrid models that combine deep learning flexibility with physical laws to enhance generalization and real-world applicability in BTMS design.
Complementing these architectural developments, Bacak [150] demonstrated the integration of computational fluid dynamics (CFD) and ANNs using a Levenberg–Marquardt optimization algorithm to accurately predict battery surface temperatures under a wide range of operating conditions. This hybrid modeling approach addresses critical thermal safety concerns, identifying conditions where inadequate cooling can elevate temperatures to hazardous levels (50–80 °C), increasing the risk of thermal events. The ANN model, trained on simulation data, achieved exceptional predictive accuracy with a mean squared error of 0.00552 and R2 of 0.99, offering a highly efficient alternative to time-intensive CFD simulations. Together, these approaches underscore the growing role of AI in shaping the next generation of BTMS, enabling safer, faster, and more adaptive thermal regulation strategies tailored for high-performance electric vehicles. Table 4 consolidates AI-driven thermal control strategies for Li-ion batteries, including predictive modeling, thermal runaway prevention, and scalable heat management approaches.
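A simplified version of the CFD-plus-ANN workflow is sketched below: an MLP surrogate is trained on data standing in for CFD results and then predicts surface temperature from discharge rate, coolant velocity, and ambient temperature. Because scikit-learn does not provide Levenberg-Marquardt training, the L-BFGS solver is used as a stand-in, and the analytic data-generating relation is an illustrative assumption rather than the CFD model of Ref. [150].

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(2)
n = 400
c_rate = rng.uniform(0.5, 4.0, n)        # discharge rate
velocity = rng.uniform(0.1, 3.0, n)      # m/s coolant/air velocity
t_amb = rng.uniform(15, 40, n)           # degC ambient temperature

# Analytic stand-in for CFD-computed surface temperature (illustrative only)
t_surface = t_amb + 12.0 * c_rate**2 / (1.0 + 2.5 * velocity) + rng.normal(0, 0.3, n)

X = np.column_stack([c_rate, velocity, t_amb])
split = int(0.8 * n)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 20), solver="lbfgs",
                                 max_iter=2000, random_state=0))
ann.fit(X[:split], t_surface[:split])
pred = ann.predict(X[split:])
print("MSE:", round(mean_squared_error(t_surface[split:], pred), 4),
      "R2:", round(r2_score(t_surface[split:], pred), 4))
```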

9. Practical Application Cases of AI and Digital Twins in Li-Ion BMS

The application of AI and digital twin technologies in Li-ion BMS has progressed from laboratory validation to industrial deployment, demonstrating tangible benefits while revealing key limitations. For example, Tesla employs ML models within its BMS to enhance SOC and SOH estimation, leveraging fleet data to refine management under diverse operating conditions. This approach enables more accurate range prediction and improves thermal management during fast charging and high-power operation [151,152]. Similarly, CATL and BYD utilize digital twin models for predictive maintenance and lifecycle optimization, enabling real-time monitoring and anomaly detection across large-scale EV fleets. By creating digital replicas of battery systems, these companies simulate usage scenarios, predict degradation, and schedule maintenance effectively, reducing downtime and improving fleet reliability [153,154]. Overall, these applications underscore the viability of AI and digital twins in BMS while highlighting the need to address data standardization, model interpretability, and network security to enable broader deployment in next-generation EVs.

10. Summary

This review analyzes how AI and DT technologies improve state estimation, lifecycle management, and scalable control of BMS for lithium-ion batteries. Drawing on more than 120 recent studies, it examines how deep learning, hybrid modeling, and data-fusion techniques enable accurate prediction of SOC, SOH, and RUL; the best-performing models reach SOC errors as low as 0.41% and SOH estimation RMSE as low as 0.4%. The review also shows that fast charging, fault detection, and battery temperature regulation are improving through advanced reinforcement learning and control strategies.
For digital twins, the review outlines reference architectures built on real-time sensing, cloud-edge computing, AI, and simulation tools to support adaptive diagnostics, predictive maintenance, and intelligent lifecycle management. DT-integrated systems have reduced maintenance costs by as much as 60%, extended useful battery life by 15%, and improved safety and anomaly detection. Hybrid AI-DT solutions and real-time cloud-edge frameworks are already helping electric vehicles, smart grids, and renewable storage scale more effectively. Nevertheless, challenges remain in data standardization, cybersecurity, real-time processing, and model interpretability. The review concludes by emphasizing the joint role of AI and DT in making BMS more intelligent and resilient, and in aligning battery technology with global sustainability and electrification goals.

11. Conclusions

AI has significantly transformed battery management, modeling, and lifecycle optimization, particularly within lithium-ion battery systems. This review has systematically examined various applications of AI in batteries, including state estimation (SOC, SOH, RUL), thermal management, fault diagnosis, materials discovery, and sustainability. By comparing approaches such as CNNs, LSTMs, and emerging physics-informed AI models, we demonstrate that AI offers superior capabilities in addressing the limitations of conventional battery management systems.
A BiLSTM with dual attention achieved SOH estimation errors of 0.4% RMSE and 0.3% MAE, while a CNN-GRU-LSTM model estimated SOC with 0.41% error at a processing speed of 0.000113 s/sample. Ensemble methods (Bagging Regressor, Random Forest, XGBoost, and LightGBM) demonstrated the highest accuracy in predicting remaining useful life, with R2 exceeding 0.9999 and RMSE and MAE of 3.782 and 2.099, respectively. Transformer-based PatchFormer networks achieved an RUL prediction RMSE below 0.9%, temperature-adaptive GRU frameworks recorded an SOC prediction MAE of 2.15%, and the CNN-Transformer with SRUKF approach reduced SOC estimation error by more than 30% under low-temperature conditions.
Research utilizing digital twin systems has achieved state-of-charge estimation errors below 0.14%, reduced maintenance costs by up to 60%, and extended battery lifespan by 15%. Specific frameworks have improved anomaly detection accuracy by 95%, increased power efficiency by 15%, and reduced capacity fade by 0.0025% per charge cycle. The combination of improved gray wolf optimization with AdaBoost yielded MAE and RMSE values as low as 0.01. Type-2 FCMNN models based on fuzzy logic reduced MAE and RMSE by 43.1% and 36.0%, respectively. A hybrid methodology incorporating both experimental results and simulation data achieved a 68.42% reduction in capacity estimation error and over a 99% decrease in experimental workload.
AI techniques have enhanced the performance, adaptability, and efficiency of battery monitoring and control systems under diverse operating conditions. Furthermore, the integration of cloud and edge computing with digital twin technology enables predictive maintenance, real-time optimization, large-scale fleet management, and reduced device-level computational load. The incorporation of explainable AI, federated learning, and reinforcement learning contributes to the development of intelligent, secure, and decentralized battery networks.
Several critical challenges remain, including enabling models to function with limited data, enhancing their interpretability, validating them in real-world applications, and addressing cybersecurity and standardization concerns. Future research should integrate data-driven approaches with physical models, ensure broad applicability across platforms, and focus on developing batteries with durability spanning both production and recycling phases.
This information supports researchers and engineers in developing environmentally sustainable and efficient battery technologies. The integration of AI-driven modeling, advanced control strategies, and digital tools facilitates the design of safer, longer-lasting, and cleaner energy storage systems.

Author Contributions

Conceptualization, S.S.M. and Y.S.; methodology, S.S.M. and Y.S.; software, S.S.M.; validation, S.S.M. and Y.S.; formal analysis, S.S.M.; investigation, S.S.M. and Y.S.; resources, M.F. and H.C.; data curation, S.S.M.; writing-original draft preparation, S.S.M. and Y.S.; writing-review and editing, M.F., S.P., H.C., S.M., S.X.D. and K.S.; visualization, S.S.M.; supervision, H.C., M.F. and S.M.; project administration, H.C.; funding acquisition, H.C., S.M. and S.X.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

ANFIS: Adaptive Neuro-Fuzzy Inference System
AEKF: Adaptive Extended Kalman Filter
AI: Artificial Intelligence
ANNs: Artificial Neural Networks
BMS: Battery Management Systems
BTMS: Battery Thermal Management System
BiLSTM: Bi-Directional LSTM
CBMS: Cloud-Based BMS
CEBMS: Cloud-Edge Battery Management System
CNN: Convolutional Neural Network
CAN: Controller Area Network
DaaS: Data-as-a-Service
DNNs: Deep Neural Networks
DTV: Differential Thermal Voltammetry
DT: Digital Twin
DPAN: Dual Patch-Wise Attention Network
DST: Dynamic Stress Test
EIS: Electrochemical Impedance Spectroscopy
EV: Electric Vehicles
EMS: Energy Management Systems
ESS: Energy Storage Systems
ECM: Equivalent Circuit Model
EKF: Extended Kalman Filtering
XML: Explainable Machine Learning
XGBoost: Extreme Gradient Boosting
FAN: Feature-Wise Attention Network
FNN: Feedforward Neural Networks
FLC: Fuzzy Logic Control
GASVR: Genetic Algorithm-Optimized Support Vector Regression
GRU: Gated Recurrent Unit
GPR: Gaussian Process Regression
GRO: Gold Rush Optimizer
HSE-DT: Hierarchical and Self-Evolving Digital Twin
HPPC: Hybrid Pulse Power Characterization
IIoT: Industrial Internet-of-Things
IaaS: Infrastructure-as-a-Service
IBMS: Intelligent Battery Management System
ICVs: Intelligent Connected Vehicles
IoT: Internet-of-Things
LCA: Life Cycle Assessment
LIBs: Lithium-Ion Batteries
LSTM: Long Short-Term Memory
ML: Machine Learning
m-AIC: Minimized Akaike Information Criterion
MWLS: Moving Window Least Squares
MNN: Multi-Layer Neural Networks
PSO: Particle Swarm Optimization
PCMs: Phase Change Materials
PBMs: Physics-Based Models
PINN: Physics-Informed Neural Network
PaaS: Platform-as-a-Service
PHM: Prognostics and Health Management
RUL: Remaining Useful Life
RMSEs: Root Mean Square Errors
SaaS: Software-as-a-Service
SOC: State of Charge
SOH: State of Health
SOE: State-of-Energy
SOF: State-of-Function
SVM: Support Vector Machine
TSFM: Takagi-Sugeno Fuzzy Model
SRUKF: Square Root Unscented Kalman Filter
TS-GAN: Time-Series Generative Adversarial Network
Type-2 FCMNN: Type-2 Fuzzy Cerebellar Model Neural Network
UAVs: Unmanned Aerial Vehicles
UKF: Unscented Kalman Filters
FFNN: Wide Feedforward Neural Network
WLTP: Worldwide Harmonized Light Vehicle Test Procedure

References

  1. Moradizadeh, L.; Madhavan, P.V.; Ozden, A.; Li, X.; Shahgaldi, S. Advances in Protective Coatings for Porous Transport Layers in Proton Exchange Membrane Water Electrolyzers: Performance and Durability Insights. Energy Convers. Manag. 2025, 332, 119713. [Google Scholar] [CrossRef]
  2. Madhavan, P.V.; Moradizadeh, L.; Shahgaldi, S.; Li, X. Data-Driven Modelling of Corrosion Behaviour in Coated Porous Transport Layers for PEM Water Electrolyzers. Artif. Intell. Chem. 2025, 3, 100086. [Google Scholar] [CrossRef]
  3. Rahmani, P.; Chakraborty, S.; Mele, I.; Katrašnik, T.; Bernhard, S.; Pruefling, S.; Wilkins, S.; Hegazy, O. Driving the Future: A Comprehensive Review of Automotive Battery Management System Technologies, and Future Trends. J. Power Sources 2025, 629, 235827. [Google Scholar] [CrossRef]
  4. Shabeer, Y.; Madani, S.S.; Panchal, S.; Mousavi, M.; Fowler, M. Different Metal–Air Batteries as Range Extenders for the Electric Vehicle Market: A Comparative Study. Batteries 2025, 11, 35. [Google Scholar] [CrossRef]
  5. Su, L.; Xu, Y.; Dong, Z. State-of-health Estimation of Lithium-ion Batteries: A Comprehensive Literature Review from Cell to Pack Levels. Energy Convers. Econ. 2024, 5, 224–242. [Google Scholar] [CrossRef]
  6. Wang, S.; Zhou, R.; Ren, Y.; Jiao, M.; Liu, H.; Lian, C. Advanced Data-Driven Techniques in AI for Predicting Lithium-Ion Battery Remaining Useful Life: A Comprehensive Review. Green Chem. Eng. 2025, 6, 139–153. [Google Scholar] [CrossRef]
  7. Ghalkhani, M.; Habibi, S. Review of the Li-Ion Battery, Thermal Management, and AI-Based Battery Management System for EV Application. Energies 2023, 16, 185. [Google Scholar] [CrossRef]
  8. Fan, Y.; Lin, Z.; Wang, F.; Zhang, J. A Hybrid Approach for Lithium-Ion Battery Remaining Useful Life Prediction Using Signal Decomposition and Machine Learning. Sci. Rep. 2025, 15, 8161. [Google Scholar] [CrossRef] [PubMed]
  9. Wang, F.; Zhai, Z.; Zhao, Z.; Di, Y.; Chen, X. Physics-Informed Neural Network for Lithium-Ion Battery Degradation Stable Modeling and Prognosis. Nat. Commun. 2024, 15, 4332. [Google Scholar] [CrossRef] [PubMed]
  10. Keyhani-Asl, A.; Perera, N.; Lahr, J.; Hasan, R. Innovative Hybrid Battery Thermal Management System Incorporating Copper Foam Porous Fins and Layers with Phase Change Material and Liquid Cooling. Appl. Therm. Eng. 2025, 268, 125848. [Google Scholar] [CrossRef]
  11. Madani, S.S.; Shabeer, Y.; Allard, F.; Fowler, M.; Ziebert, C.; Wang, Z.; Panchal, S.; Chaoui, H.; Mekhilef, S.; Dou, S.X.; et al. A Comprehensive Review on Lithium-Ion Battery Lifetime Prediction and Aging Mechanism Analysis. Batteries 2025, 11, 127. [Google Scholar] [CrossRef]
  12. Madani, S.S.; Allard, F.; Shabeer, Y.; Fowler, M.; Panchal, S.; Ziebert, C.; Mekhilef, S.; Dou, S.X.; See, K.; Wang, Z. Exploring the Aging Dynamics of Lithium-Ion Batteries for Enhanced Lifespan Understanding. J. Phys. Conf. Ser. 2025, 2968, 012017. [Google Scholar] [CrossRef]
  13. Njoku, J.N.; Nkoro, C.; Medina, R.M.; Ifeanyi Nwakanma, C.; Lee, J.-M.; Kim, D.-S. Leveraging Digital Twin Technology for Battery Management: A Case Study Review. IEEE Access 2025, 13, 21382–21412. [Google Scholar] [CrossRef]
  14. Naseri, F.; Gil, S.; Barbu, C.; Cetkin, E.; Yarimca, G.; Jensen, A.C.; Larsen, P.G.; Gomes, C. Digital Twin of Electric Vehicle Battery Systems: Comprehensive Review of the Use Cases, Requirements, and Platforms. Renew. Sustain. Energy Rev. 2023, 179, 113280. [Google Scholar] [CrossRef]
  15. Ren, T.; Wu, X.; Wang, D.; Ma, X.; Cai, B.; Baskoro, F.; Zou, B.; Kim, J.; Ge, B.; Zhang, Q.; et al. Sustainable Recovery Progress of Ternary Cathodes in Lithium-Ion Batteries in the Artificial Intelligence Era. Mater. Today Energy 2025, 49, 101844. [Google Scholar] [CrossRef]
  16. Lipu, M.S.H.; Miah, S.; Jamal, T.; Rahman, T.; Ansari, S.; Rahman, S.; Ashique, R.H.; Shihavuddin, A.S.M.; Shakib, M.N. Artificial Intelligence Approaches for Advanced Battery Management System in Electric Vehicle Applications: A Statistical Analysis towards Future Research Opportunities. Vehicles 2023, 6, 22–70. [Google Scholar] [CrossRef]
  17. Yavas, U.; Kurtulus, C.; Genc, U. Battery Management with AI for Better and Safer Batteries. ATZelectronics Worldw. 2024, 19, 8–13. [Google Scholar] [CrossRef]
  18. Pooyandeh, M.; Sohn, I. Smart Lithium-Ion Battery Monitoring in Electric Vehicles: An AI-Empowered Digital Twin Approach. Mathematics 2023, 11, 4865. [Google Scholar] [CrossRef]
  19. Palanichamy, K.; Soni, J. Optimizing Electric Vehicle Performance with AI-Driven Battery Management Systems. Educ. Adm. Theory Pract. 2024, 30, 2519–2532. [Google Scholar] [CrossRef]
  20. Farman, M.K.; Nikhila, J.; Sreeja, A.B.; Roopa, B.S.; Sahithi, K.; Gireesh Kumar, D. AI-Enhanced Battery Management Systems for Electric Vehicles: Advancing Safety, Performance, and Longevity. E3S Web Conf. 2024, 591, 040101. [Google Scholar] [CrossRef]
  21. Ahwiadi, M.; Wang, W. An AI-Driven Particle Filter Technology for Battery System State Estimation and RUL Prediction. Batteries 2024, 10, 437. [Google Scholar] [CrossRef]
  22. Vashist, D.; Raj, R.; Sharma, D. Artificial Intelligence in Electric Vehicle Battery Management System: A Technique for Better Energy Storage; 19 September 2024. Available online: https://www.sae.org/publications/technical-papers/content/2024-28-0089/ (accessed on 16 July 2025).
  23. Suanpang, P.; Jamjuntr, P. Optimal Electric Vehicle Battery Management Using Q-Learning for Sustainability. Sustainability 2024, 16, 7180. [Google Scholar] [CrossRef]
  24. Razmjoo, A.; Ghazanfari, A.; Østergaard, P.A.; Jahangiri, M.; Sumper, A.; Ahmadzadeh, S.; Eslamipoor, R. Moving Toward the Expansion of Energy Storage Systems in Renewable Energy Systems—A Techno-Institutional Investigation with Artificial Intelligence Consideration. Sustainability 2024, 16, 9926. [Google Scholar] [CrossRef]
  25. Miraftabzadeh, S.M.; Longo, M.; Di Martino, A.; Saldarini, A.; Faranda, R.S. Exploring the Synergy of Artificial Intelligence in Energy Storage Systems for Electric Vehicles. Electronics 2024, 13, 1973. [Google Scholar] [CrossRef]
  26. Badran, M.A.; Toha, S.F. Employment of Artificial Intelligence (AI) Techniques in Battery Management System (BMS) for Electric Vehicles (EV): Issues and Challenges. Pertanika J. Sci. Technol. 2024, 32, 859–881. [Google Scholar] [CrossRef]
  27. Acharya, S.; Viswesh, P.; Sridhar, M.K.; Pathak, A.D.; Sharma, H.; Nazir, A.; Kasbe, A.; Sahu, K.K. Artificial Intelligence and Machine Learning in Battery Materials and Their Applications. In Nanostructured Materials Engineering and Characterization for Battery Applications; Elsevier: Amsterdam, The Netherlands, 2024; pp. 639–676. [Google Scholar]
  28. Oyucu, S.; Ersöz, B.; Sağıroğlu, Ş.; Aksöz, A.; Biçer, E. Optimizing Lithium-Ion Battery Performance: Integrating Machine Learning and Explainable AI for Enhanced Energy Management. Sustainability 2024, 16, 4755. [Google Scholar] [CrossRef]
  29. Feng, G. Transformation of Lithium Battery Material Design and Optimization Based on Artificial Intelligence. Acad. J. Mater. Chem. 2024, 5, 18–22. [Google Scholar] [CrossRef]
  30. Wang, X.T.; Wang, J.S.; Zhang, S.B.; Liu, X.; Sun, Y.C.; Shang-Guan, Y.P. Capacity Prediction Model for Lithium-Ion Batteries Based on Bi-Directional LSTM Neural Network Optimized by Adaptive Convergence Factor Gold Rush Optimizer. Evol. Intell. 2025, 18, 35. [Google Scholar] [CrossRef]
  31. Madhavan, P.V.; Shahgaldi, S.; Li, X. Long Short-Term Memory Time Series Modelling of Pressure Valves for Hydrogen-Powered Vehicles and Infrastructure. Int. J. Hydrogen Energy 2025, 124, 67–83. [Google Scholar] [CrossRef]
  32. Alzamer, H.; Jaafreh, R.; Kim, J.-G.; Hamad, K. Artificial Intelligence and Li Ion Batteries: Basics and Breakthroughs in Electrolyte Materials Discovery. Crystals 2025, 15, 114. [Google Scholar] [CrossRef]
  33. Shahriar, S.M.; Bhuiyan, E.A.; Nahiduzzaman, M.; Ahsan, M.; Haider, J. State of Charge Estimation for Electric Vehicle Battery Management Systems Using the Hybrid Recurrent Learning Approach with Explainable Artificial Intelligence. Energies 2022, 15, 8003. [Google Scholar] [CrossRef]
  34. Mumtaz Noreen, M.T.; Fouladfar, M.H.; Saeed, N. Evaluation of Battery Management Systems for Electric Vehicles Using Traditional and Modern Estimation Methods. Network 2024, 4, 586–608. [Google Scholar] [CrossRef]
  35. Arévalo, P.; Ochoa-Correa, D.; Villa-Ávila, E. A Systematic Review on the Integration of Artificial Intelligence into Energy Management Systems for Electric Vehicles: Recent Advances and Future Perspectives. World Electr. Veh. J. 2024, 15, 364. [Google Scholar] [CrossRef]
  36. Ghazali, A.K.; Aziz, N.A.A.; Hassan, M.K. Advanced Algorithms in Battery Management Systems for Electric Vehicles: A Comprehensive Review. Symmetry 2025, 17, 321. [Google Scholar] [CrossRef]
  37. Challoob, A.F.; Bin Rahmat, N.A.; A/L Ramachandaramurthy, V.K.; Humaidi, A.J. Energy and Battery Management Systems for Electrical Vehicles: A Comprehensive Review & Recommendations. Energy Explor. Exploit. 2024, 42, 341–372. [Google Scholar] [CrossRef]
  38. Zhang, W.; Pranav, R.S.B.; Wang, R.; Lee, C.; Zeng, J.; Cho, M.; Shim, J. Lithium-Ion Battery Life Prediction Using Deep Transfer Learning. Batteries 2024, 10, 434. [Google Scholar] [CrossRef]
  39. Sravanthi C., L.; Chandra sekhar, J. Predicting Remaining Useful Life of Lithium-Ion Batteries for Electric Vehicles Using Machine Learning Regression Models. J. Eng. Technol. Ind. Appl. 2025, 11, 143–150. [Google Scholar] [CrossRef]
  40. Liu, L.; Huang, J.; Zhao, H.; Li, T.; Li, B. PatchFormer: A Novel Patch-Based Transformer for Accurate Remaining Useful Life Prediction of Lithium-Ion Batteries. J. Power Sources 2025, 631, 236187. [Google Scholar] [CrossRef]
  41. Nyamathulla, S.; Dhanamjayulu, C. A Review of Battery Energy Storage Systems and Advanced Battery Management System for Different Applications: Challenges and Recommendations. J. Energy Storage 2024, 86, 111179. [Google Scholar] [CrossRef]
  42. Khawaja, Y.; Shankar, N.; Qiqieh, I.; Alzubi, J.; Alzubi, O.; Nallakaruppan, M.; Padmanaban, S. Battery Management Solutions for Li-Ion Batteries Based on Artificial Intelligence. Ain Shams Eng. J. 2023, 14, 102213. [Google Scholar] [CrossRef]
  43. Valizadeh, A.; Amirhosseini, M.H. Machine Learning in Lithium-Ion Battery: Applications, Challenges, and Future Trends. SN Comput. Sci. 2024, 5, 717. [Google Scholar] [CrossRef]
  44. Niri, M.F.; Aslansefat, K.; Haghi, S.; Hashemian, M.; Daub, R.; Marco, J. A Review of the Applications of Explainable Machine Learning for Lithium–Ion Batteries: From Production to State and Performance Estimation. Energies 2023, 16, 6360. [Google Scholar] [CrossRef]
  45. Jin, S.; Sui, X.; Huang, X.; Wang, S.; Teodorescu, R.; Stroe, D.-I. Overview of Machine Learning Methods for Lithium-Ion Battery Remaining Useful Lifetime Prediction. Electronics 2021, 10, 3126. [Google Scholar] [CrossRef]
  46. Ardeshiri, R.R.; Balagopal, B.; Alsabbagh, A.; Ma, C.; Chow, M.-Y. Machine Learning Approaches in Battery Management Systems: State of the Art: Remaining Useful Life and Fault Detection. In Proceedings of the 2020 2nd IEEE International Conference on Industrial Electronics for Sustainable Energy Systems (IESES), Cagliari, Italy, 1–3 September 2020; pp. 61–66. [Google Scholar]
  47. Li, Z.; Cong, J.; Ding, Y.; Yang, Y.; Huang, K.; Ge, X.; Chen, K.; Zeng, T.; Huang, Z.; Fang, C.; et al. Strategies for Intelligent Detection and Fire Suppression of Lithium-Ion Batteries. Electrochem. Energy Rev. 2024, 7, 32. [Google Scholar] [CrossRef]
  48. Shabeer, Y.; Madani, S.S.; Panchal, S.; Fowler, M. Performance Optimization of High Energy Density Aluminum-Air Batteries: Effects of Operational Parameters and Electrolyte Composition. Future Batter. 2025, 6, 100082. [Google Scholar] [CrossRef]
  49. Amiri, M.N.; Håkansson, A.; Burheim, O.S.; Lamb, J.J. Lithium-Ion Battery Digitalization: Combining Physics-Based Models and Machine Learning. Renew. Sustain. Energy Rev. 2024, 200, 114577. [Google Scholar] [CrossRef]
  50. Cavus, M.; Dissanayake, D.; Energies, M.B. Next Generation of Electric Vehicles: AI-Driven Approaches for Predictive Maintenance and Battery Management. Energies 2025, 18, 1041. [Google Scholar] [CrossRef]
  51. Zhao, J.; Burke, A. Artificial Intelligence-Driven Electric Vehicle Battery Lifetime Diagnostics. In Vehicle Technology and Automotive Engineering [Working Title]; IntechOpen: Institute of Transportation Studies, University of California Davis: Davis, CA, USA, 2025. [Google Scholar]
  52. Chen, Q.; Lai, X.; Zhang, Y.; Chen, J.; Zheng, Y.; Song, X.; Han, X.; Ouyang, M. AI-Driven Sustainability Assessment for Greener Lithium-Ion Batteries. Carbon Footpr. 2025, 4, 12. [Google Scholar] [CrossRef]
  53. Mamidi, R.; Obulesu, D.; Prajna, B.K.; Dutta, S.R.; Rao, P.U.M.; Selvan, R.S.; Prabha, M. Enhancing Battery Health in Electric Vehicles: AI-Enhanced BMS for Accurate SoC, SoH, and Fault Diagnosis. Metall. Mater. Eng. 2025, 31, 491–500. [Google Scholar] [CrossRef]
  54. Mayemba, Q.; Ducret, G.; Li, A.; Mingant, R.; Venet, P. General Machine Learning Approaches for Lithium-Ion Battery Capacity Fade Compared to Empirical Models. Batteries 2024, 10, 367. [Google Scholar] [CrossRef]
  55. Wang, D.; Lee, J.; Kim, M.; Lee, I. Neural Network-Based State of Charge Estimation Method for Lithium-Ion Batteries Based on Temperature. Intell. Autom. Soft Comput. 2023, 36, 2025–2040. [Google Scholar] [CrossRef]
  56. Elkerdany, M.S.; Safwat, I.M.; Youssef, A.M.M.; Elkhatib, M.M. An Intelligent Energy Management System for Enhanced Performance in Electric UAVs. Aerosp. Syst. 2025, 1–16. [Google Scholar] [CrossRef]
  57. Zou, B.; Wang, H.; Zhang, T.; Xiong, M.; Xiong, C.; Sun, Q.; Wang, W.; Zhang, L.; Zhang, C.; Ruan, H. A Deep Learning Approach for State-of-Health Estimation of Lithium-Ion Batteries Based on Differential Thermal Voltammetry and Attention Mechanism. Front. Energy Res. 2023, 11, 1178151. [Google Scholar] [CrossRef]
  58. Zhang, Z.; Guo, T.; Liu, Y.; Pang, X.; Zheng, Z. Fast-Charging Optimization Method for Lithium-Ion Battery Packs Based on Deep Deterministic Policy Gradient Algorithm. Batteries 2025, 11, 199. [Google Scholar] [CrossRef]
  59. Issa, R.; Badr, M.M.; Shalash, O.; Othman, A.A.; Hamdan, E.; Hamad, M.S.; Abdel-Khalik, A.S.; Ahmed, S.; Imam, S.M. A Data-Driven Digital Twin of Electric Vehicle Li-Ion Battery State-of-Charge Estimation Enabled by Driving Behavior Application Programming Interfaces. Batteries 2023, 9, 521. [Google Scholar] [CrossRef]
  60. Iyer, N.G.; Ponnurangan, S.; Abdul Gafoor, N.A.; Rajendran, A.; Lashab, A.; Saha, D.; Guerrero, J.M. A Novel Heuristic Algorithm Integrating Battery Digital-Twin-Based State-of-Charge Estimation for Optimized Electric Vehicle Charging. Electronics 2024, 13, 4412. [Google Scholar] [CrossRef]
  61. Yang, D.; Cui, Y.; Xia, Q.; Jiang, F.; Ren, Y.; Sun, B.; Feng, Q.; Wang, Z.; Yang, C. A Digital Twin-Driven Life Prediction Method of Lithium-Ion Batteries Based on Adaptive Model Evolution. Materials 2022, 15, 3331. [Google Scholar] [CrossRef]
  62. Zhou, M.; Bai, L.; Lei, J.; Wang, Y.; Li, H. A Digital Twin Model for Battery Management Systems: Concepts, Algorithms, and Platforms. In Proceedings of the International Conference on Image, Vision and Intelligent Systems (ICIVIS 2021), Changsha, China, 15–17 June 2021; pp. 1165–1176. [Google Scholar]
  63. Kang, X.; Wang, Y.; Jiang, C.; Chen, Z. Digital Twin-Enhanced Control for Fuel Cell and Lithium-Ion Battery Hybrid Vehicles. Batteries 2024, 10, 242. [Google Scholar] [CrossRef]
  64. Tang, H.; Wu, Y.; Cai, Y.; Wang, F.; Lin, Z.; Pei, Y. Design of Power Lithium Battery Management System Based on Digital Twin. J. Energy Storage 2022, 47, 103679. [Google Scholar] [CrossRef]
  65. Song, X.; Yang, F.; Wang, D.; Tsui, K.-L. Combined CNN-LSTM Network for State-of-Charge Estimation of Lithium-Ion Batteries. IEEE Access 2019, 7, 88894–88902. [Google Scholar] [CrossRef]
  66. Khalid, A.; Sarwat, A.I. Unified Univariate-Neural Network Models for Lithium-Ion Battery State-of-Charge Forecasting Using Minimized Akaike Information Criterion Algorithm. IEEE Access 2021, 9, 39154–39170. [Google Scholar] [CrossRef]
  67. Das, O.; Zafar, M.H.; Sanfilippo, F.; Rudra, S.; Kolhe, M.L. Advancements in Digital Twin Technology and Machine Learning for Energy Systems: A Comprehensive Review of Applications in Smart Grids, Renewable Energy, and Electric Vehicle Optimisation. Energy Convers. Manag. X 2024, 24, 100715. [Google Scholar] [CrossRef]
  68. Kang, J. Software Practice and Experience on Smart Mobility Digital Twin in Transportation and Automotive Industry: Toward SDV-Empowered Digital Twin Through EV Edge-Cloud and AutoML. J. Web Eng. 2025, 23, 1155–1180. [Google Scholar] [CrossRef]
  69. Chai, X.; Yan, J.; Zhang, W.; Sulowicz, M.; Feng, Y. Recent Progress on Digital Twins in Intelligent Connected Vehicles: A Review. Elektron. Elektrotechnika 2024, 30, 4–17. [Google Scholar] [CrossRef]
  70. Wang, Y.; Kang, X.; Chen, Z. A Survey of Digital Twin Techniques in Smart Manufacturing and Management of Energy Applications. Green Energy Intell. Transp. 2022, 1, 100014. [Google Scholar] [CrossRef]
  71. Zhao, K.; Liu, Y.; Ming, W.; Zhou, Y.; Wu, J. Digital Twin-Driven Estimation of State of Charge for Li-Ion Battery. In Proceedings of the 2022 IEEE 7th International Energy Conference (ENERGYCON), Riga, Latvia, 9–12 May 2022. [Google Scholar]
  72. Lakshmi, M.S.; Sarma, V.A. Energy Storage System Using Digital Twins with AI and IoT for Efficient Energy Management and Prolonged Battery Life in Electric Vehicles. In Proceedings of the International Conference on Multi-Agent Systems for Collaborative Intelligence, ICMSCI 2025, Erode, India, 20–22 January 2025; pp. 177–184. [Google Scholar]
  73. Chaoui, H.; Golbon, N.; Hmouz, I.; Souissi, R.; Tahar, S. Lyapunov-Based Adaptive State of Charge and State of Health Estimation for Lithium-Ion Batteries. IEEE Trans. Ind. Electron. 2014, 62, 1610–1618. [Google Scholar] [CrossRef]
  74. Alamin, K.S.S.; Chen, Y.; Macii, E.; Poncino, M.; Vinco, S. Advancing Electric Vehicle Battery Management: A Data-Driven Digital Twin Approach for Real-Time Monitoring and Performance Enhancement. IEEE Trans. Veh. Technol. 2025, 1–15. [Google Scholar] [CrossRef]
  75. Nair, P.; Vakharia, V.; Shah, M.; Kumar, Y.; Woźniak, M.; Shafi, J.; Fazal Ijaz, M. AI-Driven Digital Twin Model for Reliable Lithium-Ion Battery Discharge Capacity Predictions. Int. J. Intell. Syst. 2024, 2024, 85044. [Google Scholar] [CrossRef]
  76. Darbari, A. Next-Generation Battery Solutions for EVs: The Power of Digital Twins. Glob. J. Eng. Technol. Adv. 2024, 21, 001–008. [Google Scholar] [CrossRef]
  77. Zhao, K.; Liu, Y.; Zhou, Y.; Ming, W.; Wu, J. A Hierarchical and Self-Evolving Digital Twin (HSE-DT) Method for Multi-Faceted Battery Situation Awareness Realisation. Machines 2025, 13, 175. [Google Scholar] [CrossRef]
  78. Jafari, S.; Byun, Y.-C. Prediction of the Battery State Using the Digital Twin Framework Based on the Battery Management System. IEEE Access 2022, 10, 124685–124696. [Google Scholar] [CrossRef]
  79. Pooyandeh, M.; Sohn, I. A Time-Stamp Attack on Digital Twin-Based Lithium-Ion Battery Monitoring for Electric Vehicles. In Proceedings of the 6th International Conference on Artificial Intelligence in Information and Communication, ICAIIC 2024, Osaka, Japan, 19–22 February 2024. [Google Scholar] [CrossRef]
  80. Bin Kaleem, M.; He, W.; Li, H. Machine Learning Driven Digital Twin Model of Li-Ion Batteries in Electric Vehicles: A Review. AI Auton. Syst. 2023, 1, 0003. [Google Scholar] [CrossRef]
  81. Bandara, T.R.; Halgamuge, M.N. Modeling a Digital Twin to Predict Battery Deterioration with Lower Prediction Error in Smart Devices: From the Internet of Things Sensor Devices to Self-Driving Cars. In Proceedings of the IECON 2022–48th Annual Conference of the IEEE Industrial Electronics Society, Brussels, Belgium, 17–20 October 2022. [Google Scholar] [CrossRef]
  82. Li, H.; Huang, J.; Ji, W.; He, Z.; Cheng, J.; Zhang, P.; Zhao, J. Predicting Capacity Fading Behaviors of Lithium Ion Batteries: An Electrochemical Protocol-Integrated Digital-Twin Solution. J. Electrochem. Soc. 2022, 169, 100504. [Google Scholar] [CrossRef]
  83. Ismail, M.; Ahmed, R. A Comprehensive Review of Cloud-Based Lithium-Ion Battery Management Systems for Electric Vehicle Applications. IEEE Access 2024, 12, 116259–116273. [Google Scholar] [CrossRef]
  84. Lu, L.; Han, X.; Li, J.; Hua, J.; Ouyang, M. A Review on the Key Issues for Lithium-Ion Battery Management in Electric Vehicles. J. Power Sources 2013, 226, 272–288. [Google Scholar] [CrossRef]
  85. Li, W.; Rentemeister, M.; Badeda, J.; Jöst, D.; Schulte, D.; Sauer, D.U. Digital Twin for Battery Systems: Cloud Battery Management System with Online State-of-Charge and State-of-Health Estimation. J. Energy Storage 2020, 30, 101557. [Google Scholar] [CrossRef]
  86. Adhikaree, A.; Kim, T.; Vagdoda, J.; Ochoa, A.; Hernandez, P.J.; Lee, Y. Cloud-Based Battery Condition Monitoring Platform for Large-Scale Lithium-Ion Battery Energy Storage Systems Using Internet-of-Things (IoT). In Proceedings of the 2017 IEEE Energy Conversion Congress and Exposition, ECCE, Cincinnati, OH, USA, 1–5 October 2017; pp. 1004–1009. [Google Scholar] [CrossRef]
  87. Shi, D.; Zhao, J.; Eze, C.; Wang, Z.; Wang, J.; Lian, Y.; Burke, A.F. Cloud-Based Artificial Intelligence Framework for Battery Management System. Energies 2023, 16, 4403. [Google Scholar] [CrossRef]
  88. Urgolo, A.; Sabathiel, S. Edge AI for Battery Management: Efficient SoC and SoH Prediction for Li-Ion Batteries with Neural Networks. In Proceedings of the European Conference on EDGE AI Technologies and Applications (EEAI 2024), Cagliari, Sardinia, Italy, 21–23 October 2024. accepted/in press. [Google Scholar]
  89. Mulpuri, S.K.; Sah, B.; Kumar, P. An Intelligent Battery Management System (BMS) with End-Edge-Cloud Connectivity—A Perspective. Sustain. Energy Fuels 2025, 9, 1142–1159. [Google Scholar] [CrossRef]
  90. Li, D.; Nan, J.; Burke, A.F.; Zhao, J. Battery Prognostics and Health Management: AI and Big Data. World Electr. Veh. J. 2025, 16, 10. [Google Scholar] [CrossRef]
  91. Zeeshan, M.; Akre, V. Cloud-Based Monitoring of Lithium-Ion Battery Management Systems for Health Estimation in Manufacturing Industries; SAE Technical Papers; SAE: Warrendale, PA, USA, 2025. [Google Scholar] [CrossRef]
  92. Tran, M.-K.; Panchal, S.; Khang, T.D.; Panchal, K.; Fraser, R.; Fowler, M. Concept Review of a Cloud-Based Smart Battery Management System for Lithium-Ion Batteries: Feasibility, Logistics, and Functionality. Batteries 2022, 8, 19. [Google Scholar] [CrossRef]
  93. Zhu, W.; Zhou, X.; Cao, M.; Wang, Y.; Zhang, T. The Cloud-End Collaboration Battery Management System with Accurate State-of-Charge Estimation for Large-Scale Lithium-Ion Battery System. In Proceedings of the 2022 8th International Conference on Big Data and Information Analytics (BigDIA), Guiyang, China, 24–25 August 2022; pp. 199–204. [Google Scholar] [CrossRef]
  94. Wu, D.; Xu, Z.; Wang, Q.; Jin, Z.; Xu, Y.; Wang, C.; He, X. A Brief Review of Key Technologies for Cloud-Based Battery Management Systems. J. Electron. Mater. 2024, 53, 7334–7354. [Google Scholar] [CrossRef]
  95. Liu, P.; Zhu, H.H.; Chen, J.; Li, G.Y. A Distributed Management System for Lithium Ion Battery Pack. In Proceedings of the 28th Chinese Control and Decision Conference, CCDC 2016, Yinchuan, China, 28–30 May 2016. [Google Scholar] [CrossRef]
  96. García, E.; Quiles, E.; Correcher, A. Distributed Intelligent Battery Management System Using a Real-World Cloud Computing System. Sensors 2023, 23, 3417. [Google Scholar] [CrossRef] [PubMed]
  97. Zhu, Z.W.; Qiu, J.Y.; Wang, L.; Cao, G.P.; He, X.M.; Wang, J.; Zhang, H. Application of Artificial Intelligence to Lithium-Ion Battery Research and Development. J. Electrochem. 2022, 28, 1. [Google Scholar] [CrossRef]
  98. Samanta, A.; Sharma, M.; Locke, W.; Williamson, S. Cloud-Enhanced Battery Management System Architecture for Real-Time Data Visualization, Decision Making, and Long-Term Storage. IEEE J. Emerg. Sel. Top. Ind. Electron. 2025, 1–12. [Google Scholar] [CrossRef]
  99. Li, S.; He, H.; Wei, Z.; Zhao, P. Edge Computing for Vehicle Battery Management: Cloud-Based Online State Estimation. J. Energy Storage 2022, 55, 105502. [Google Scholar] [CrossRef]
  100. Felix Omojola, A.; Omata Ilabija, C.; Ifeanyi Onyeka, C.; Ishiwu, J.I.; Olaleye, T.G.; Juliet Ozoemena, I.; Nzereogu, P.U. Artificial Intelligence-Driven Strategies for Advancing Lithium-Ion Battery Performance and Safety. Int. J. Adv. Eng. Manag. 2024, 6, 452–484. [Google Scholar] [CrossRef]
  101. Di Rienzo, R.; Nicodemo, N.; Roncella, R.; Saletti, R.; Vennettilli, N.; Asaro, S.; Tola, R.; Baronti, F. Cloud-Based Optimization of a Battery Model Parameter Identification Algorithm for Battery State-of-Health Estimation in Electric Vehicles. Batteries 2023, 9, 486. [Google Scholar] [CrossRef]
  102. Yeregui, J.; Oca, L.; Lopetegi, I.; Garayalde, E.; Aizpurua, M.; Iraola, U. State of Charge Estimation Combining Physics-Based and Artificial Intelligence Models for Lithium-Ion Batteries. J. Energy Storage 2023, 73, 108883. [Google Scholar] [CrossRef]
  103. Tejaswini, P.; Sivraj, P. Artificial Intelligence Based State of Charge Estimation of Li-Ion Battery for EV Applications. In Proceedings of the 2020 5th International Conference on Communication and Electronics Systems (ICCES), Coimbatore, India, 10–12 June 2020. [Google Scholar]
  104. Pollo, G.; Burrello, A.; Macii, E.; Poncino, M.; Vinco, S.; Pagliari, D.J. Coupling Neural Networks and Physics Equations For Li-Ion Battery State-of-Charge Prediction. arXiv 2024. [Google Scholar] [CrossRef]
  105. El Fallah, S.; Kharbach, J.; Vanagas, J.; Vilkelytė, Ž.; Tolvaišienė, S.; Gudžius, S.; Kalvaitis, A.; Lehmam, O.; Masrour, R.; Hammouch, Z.; et al. Advanced State of Charge Estimation Using Deep Neural Network, Gated Recurrent Unit, and Long Short-Term Memory Models for Lithium-Ion Batteries under Aging and Temperature Conditions. Appl. Sci. 2024, 14, 6648. [Google Scholar] [CrossRef]
  106. Zhang, L.; Song, Q.; Li, J. SOC Estimation for Lithium-Ion Batteries Based on the AEKF-SVM Algorithm. Acad. J. Comput. Inf. Sci. 2024, 7, 190–198. [Google Scholar] [CrossRef]
  107. Xie, J.; Wei, X.; Bo, X.; Zhang, P.; Chen, P.; Hao, W.; Yuan, M. State of Charge Estimation of Lithium-Ion Battery Based on Extended Kalman Filter Algorithm. Front. Energy Res. 2023, 11, 881. [Google Scholar] [CrossRef]
  108. Zhao, J.; Ge, M.; Huang, Q.; Zhong, X. Lithium Battery SOC Estimation Based on Type-2 Fuzzy Cerebellar Model Neural Network. Electronics 2024, 13, 4999. [Google Scholar] [CrossRef]
  109. Andújar, J.M.; Barragán, A.J.; Vivas, F.J.; Enrique, J.M.; Segura, F. Iterative Nonlinear Fuzzy Modeling of Lithium-Ion Batteries. Batteries 2023, 9, 100. [Google Scholar] [CrossRef]
  110. Korkmaz, M. SoC Estimation of Lithium-Ion Batteries Based on Machine Learning Techniques: A Filtered Approach. J. Energy Storage 2023, 72, 108268. [Google Scholar] [CrossRef]
  111. Obuli Pranav, D.; Babu, P.S.; Indragandhi, V.; Ashok, B.; Vedhanayaki, S.; Kavitha, C. Enhanced SOC Estimation of Lithium Ion Batteries with RealTime Data Using Machine Learning Algorithms. Sci. Rep. 2024, 14, 16036. [Google Scholar] [CrossRef]
  112. Cervellieri, A. Advanced SOC Prediction for Lithium-Ion Batteries Using FNN Machine Learning Techniques: A Bayesian Regularization Training Approach. J. Electr. Syst. 2024, 20, 6586–6596. [Google Scholar]
  113. Dineva, A. Advanced Machine Learning Approaches for State-of-Charge Prediction of Li-Ion Batteries under Multisine Excitation. In Proceedings of the 2021 17th Conference on Electrical Machines, Drives and Power Systems, ELMA 2021, Sofia, Bulgaria, 1–4 July 2021. [Google Scholar] [CrossRef]
  114. Zou, M.; Wang, J.; Yan, D.; Li, Y.; Tang, X. Novel SOC Estimation For Lithium–Ion Batteries in Containerized Energy Storage System Based on A–Cnn–Lstm Network. SSRN 2024. [Google Scholar] [CrossRef]
  115. Li, Y.; Qin, X.; Ma, F.; Wu, H.; Chai, M.; Zhang, F.; Jiang, F.; Lei, X. Fusion Technology-Based CNN-LSTM-ASAN for RUL Estimation of Lithium-Ion Batteries. Sustainability 2024, 16, 9233. [Google Scholar] [CrossRef]
  116. Ofoegbu, E.O. State of Charge (SOC) Estimation in Electric Vehicle (EV) Battery Management Systems Using Ensemble Methods and Neural Networks. J. Energy Storage 2025, 114, 115833. [Google Scholar] [CrossRef]
  117. Zhao, B.; Zhang, W.; Zhang, Y.; Zhang, C.; Zhang, C.; Zhang, J. Lithium-Ion Battery Remaining Useful Life Prediction Based on Interpretable Deep Learning and Network Parameter Optimization. Appl. Energy 2025, 379, 124713. [Google Scholar] [CrossRef]
  118. Zou, C.; Chen, X.; Zhang, Y. State of Health Prediction of Lithium-Ion Batteries Based on Temporal Degeneration Feature Extraction with Deep Cycle Attention Network. J. Energy Storage 2023, 65, 107367. [Google Scholar] [CrossRef]
  119. Pham, T.; Bui, H.; Nguyen, M.; Pham, Q.; Vu, V.; Le, T.; Quan, T. A Comprehensive Review on Data-Driven Methods of Lithium-Ion Batteries State-of-Health Forecasting. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2025, 15, e70009. [Google Scholar] [CrossRef]
  120. Lanubile, A.; Bosoni, P.; Pozzato, G.; Allam, A.; Acquarone, M.; Onori, S. Domain Knowledge-Guided Machine Learning Framework for State of Health Estimation in Lithium-Ion Batteries. Commun. Eng. 2024, 3, 168. [Google Scholar] [CrossRef]
  121. Chen, Z.; Peng, Y.; Shen, J.; Zhang, Q.; Liu, Y.; Zhang, Y.; Xia, X.; Liu, Y. State of Health Estimation for Lithium-Ion Batteries Based on Fragmented Charging Data and Improved Gated Recurrent Unit Neural Network. J. Energy Storage 2025, 115, 115952. [Google Scholar] [CrossRef]
  122. Zhao, F.M.; Gao, D.X.; Cheng, Y.M.; Yang, Q. Application of State of Health Estimation and Remaining Useful Life Prediction for Lithium-Ion Batteries Based on AT-CNN-BiLSTM. Sci. Rep. 2024, 14, 29026. [Google Scholar] [CrossRef]
  123. Pohlmann, S.; Mashayekh, A.; Stroebl, F.; Karnehm, D.; Kuder, M.; Neve, A.; Weyh, T. State-of-Health Prediction of Lithium-Ion Batteries Based on a Low Dimensional Gaussian Process Regression. J. Energy Storage 2024, 88, 111649. [Google Scholar] [CrossRef]
  124. Liu, C.; Li, H.; Li, K.; Wu, Y.; Lv, B. Deep Learning for State of Health Estimation of Lithium-Ion Batteries in Electric Vehicles: A Systematic Review. Energies 2025, 18, 1463. [Google Scholar] [CrossRef]
  125. Anitha, K.; Szymański, J.R.; Zurek-Mortka, M.; Sathiyanarayanan, M. Data-Driven Approach Using Random Forest Regression for Accurate Electric Vehicle Battery State of Health Estimation. SSRG Int. J. Electr. Electron. Eng. 2025, 12, 104–113. [Google Scholar] [CrossRef]
  126. Tao, S.; Ma, R.; Zhao, Z.; Ma, G.; Su, L.; Chang, H.; Chen, Y.; Liu, H.; Liang, Z.; Cao, T.; et al. Generative Learning Assisted State-of-Health Estimation for Sustainable Battery Recycling with Random Retirement Conditions. Nat. Commun. 2024, 15, 10154. [Google Scholar] [CrossRef]
  127. Jiang, P.; Zhang, T.; Huang, G.; Hua, W.; Zhang, Y.; Wang, W.; Zhu, T. An End-Cloud Collaboration Approach for State-of-Health Estimation of Lithium-Ion Batteries Based on Bi-LSTM with Collaboration of Multi-Feature and Attention Mechanism. Int. J. Green Energy 2024, 21, 2205–2217. [Google Scholar] [CrossRef]
  128. Hoque, M.A.; Hassan, M.K.; Hajjo, A.; Tokhi, M.O. Neural Network-Based Li-Ion Battery Aging Model at Accelerated C-Rate. Batteries 2023, 9, 93. [Google Scholar] [CrossRef]
  129. Marri, I.; Petkovski, E.; Cristaldi, L.; Faifer, M. Comparing Machine Learning Strategies for SoH Estimation of Lithium-Ion Batteries Using a Feature-Based Approach. Energies 2023, 16, 4423. [Google Scholar] [CrossRef]
  130. Murgai, S.; Bhagwat, H.; Dandekar, R.A.; Dandekar, R.; Panat, S. A Scientific Machine Learning Approach for Predicting and Forecasting Battery Degradation in Electric Vehicles. arXiv 2024. [Google Scholar] [CrossRef]
  131. Wang, Y.; Wang, Y.; Li, H.; Zhang, R.; Plett, G.L.; Ryan, C.; Nelson, M. Machine Learning-Assisted Instant State of Health Estimation of Lithium-Ion Battery. SSRN 2024. [Google Scholar] [CrossRef]
  132. Qi, N.; Yan, K.; Yu, Y.; Li, R.; Huang, R.; Chen, L.; Su, Y. Machine Learning and Neural Network Supported State of Health Simulation and Forecasting Model for Lithium-Ion Battery. Front. Energy 2023, 18, 223–240. [Google Scholar] [CrossRef]
  133. Ortiz, Y.; Arévalo, P.; Peña, D.; Jurado, F. Recent Advances in Thermal Management Strategies for Lithium-Ion Batteries: A Comprehensive Review. Batteries 2024, 10, 83. [Google Scholar] [CrossRef]
  134. Kurucan, M.; Özbaltan, M.; Yetgin, Z.; Alkaya, A. Applications of Artificial Neural Network Based Battery Management Systems: A Literature Review. Renew. Sustain. Energy Rev. 2024, 192, 114262. [Google Scholar] [CrossRef]
  135. Li, A.; Weng, J.; Yuen, A.C.Y.; Wang, W.; Liu, H.; Lee, E.W.M.; Wang, J.; Kook, S.; Yeoh, G.H. Machine Learning Assisted Advanced Battery Thermal Management System: A State-of-the-Art Review. J. Energy Storage 2023, 60, 106688. [Google Scholar] [CrossRef]
  136. Hui, L.; Weixiao, J.; Yuliang, C.; Hui, Z.; Hanxi, Y.; Xinping, A. Thermal Runaway-Preventing Technologies for Lithium-Ion Batteries. Energy Storage Sci. Technol. 2018, 7, 376. [Google Scholar] [CrossRef]
  137. McKerracher, R.D.; Guzman-Guemez, J.; Wills, R.G.A.; Sharkh, S.M.; Kramer, D. Advances in Prevention of Thermal Runaway in Lithium-Ion Batteries. Adv. Energy Sustain. Res. 2021, 2, 2000059. [Google Scholar] [CrossRef]
  138. Yetik, O.; Yilmaz, N.; Karakoc, T.H. Computational Modeling of a Lithium-Ion Battery Thermal Management System with Al2O3-Based Nanofluids. Int. J. Energy Res. 2021, 45, 13851–13864. [Google Scholar] [CrossRef]
  139. Kumaresan, K.; Sikha, G.; White, R.E. Thermal Model for a Li-Ion Cell. J. Electrochem. Soc. 2008, 155, 164–171. [Google Scholar] [CrossRef]
  140. Mahadevan, V.; Vikraman, B.P. LSTM Based Battery Management Systems and FOPID Based Cooling System Strategy for Electric Vehicles Application. J. Energy Storage 2025, 116, 115937. [Google Scholar] [CrossRef]
  141. Lyu, P.; An, Z.; Li, M.; Liu, X.; Feng, X.; Rao, Z. Artificial Intelligence Algorithms Optimize Immersion Boiling Heat Transfer Strategies to Mitigate Thermal Runaway of Lithium-Ion Batteries. eTransportation 2025, 24, 100395. [Google Scholar] [CrossRef]
  142. Ramasamy, D.; Subramaniam, S.; Bin Ezani, M.D.Z. Prediction of Battery Thermal Management System Performance by AI Method. Nexus Sustain. Energy Technol. J. 2025, 1, 7–17. [Google Scholar] [CrossRef]
  143. Zhuang, W.; Liu, Z.; Su, H.; Chen, G. An Intelligent Thermal Management System for Optimized Lithium-Ion Battery Pack. Appl. Therm. Eng. 2021, 189, 116767. [Google Scholar] [CrossRef]
  144. Briggt, M.; Johansson, G. Optimizing Electric Vehicle Performance through Advanced AI-Driven Battery Management Systems (BMS). 2023. Available online: https://ssrn.com/abstract=5232461 (accessed on 28 December 2023). [CrossRef]
  145. Song, I.T.; Kang, J.; Koh, J.; Choi, H.; Yang, H.; Park, E.; Lee, J.; Cho, W.; Lee, Y.M.; Lee, S.; et al. Thermal Runaway Prevention through Scalable Fabrication of Safety Reinforced Layer in Practical Li-Ion Batteries. Nat. Commun. 2024, 15, 8294. [Google Scholar] [CrossRef]
  146. Xu, J.; Lan, C.; Qiao, Y.; Ma, Y. Prevent Thermal Runaway of Lithium-Ion Batteries with Minichannel Cooling. Appl. Therm. Eng. 2017, 110, 883–890. [Google Scholar] [CrossRef]
  147. Ye, W.B.; Arıcı, M. Numerical Thermal Control Design for Applicability to a Large-Scale High-Capacity Lithium-Ion Energy Storage System Subjected to Forced Cooling. Numer. Heat Transf. Part A Appl. 2024, 1–15. [Google Scholar] [CrossRef]
  148. Zhang, Z.; Fu, L.; Sheng, L.; Sun, Y. Method of Liquid-Cooled Thermal Control to a Large-Scale Pouch Lithium-Ion Battery. Appl. Therm. Eng. 2022, 211, 118417. [Google Scholar] [CrossRef]
  149. Qi, S.; Cheng, Y.; Li, Z.; Wang, J.; Li, H.; Zhang, C. Advanced Deep Learning Techniques for Battery Thermal Management in New Energy Vehicles. Energies 2024, 17, 4132. [Google Scholar] [CrossRef]
  150. Bacak, A. Investigation of Maximum Temperatures in Lithium-Ion Batteries by CFD and Machine Learning. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2024, 239, 2461–2471. [Google Scholar] [CrossRef]
  151. Rinaldi, S.; Bridgelall, R. Classifying Invention Objectives of Electric Vehicle Chargers through Natural Language Processing and Machine Learning. Inventions 2023, 8, 149. [Google Scholar] [CrossRef]
  152. Birkl, C.R.; Roberts, M.R.; McTurk, E.; Bruce, P.G.; Howey, D.A. Degradation Diagnostics for Lithium Ion Cells. J. Power Sources 2017, 341, 373–386. [Google Scholar] [CrossRef]
  153. Using Battery Digital Twins to Elevate EV Performance and Life-Industry Articles. Available online: https://eepower.com/industry-articles/using-battery-digital-twins-to-elevate-ev-performance-and-life/# (accessed on 16 July 2025).
  154. 5 Ways BYD Is Using AI [Case Study] [2025]-DigitalDefynd. Available online: https://digitaldefynd.com/IQ/byd-using-ai-case-study/ (accessed on 16 July 2025).
Figure 1. Applications and technological enablers of Battery Management Systems. (a) General BMS applications across various industries such as consumer electronics, electric vehicles, and energy storage. (b) Key components and advancements in modern BMS, including hybrid algorithms, state monitoring, and integration with big data platforms.
Figure 2. Use of on-board battery data, analysis techniques, and output information to support AI-driven diagnostics and health estimation in lithium-ion batteries.
Figure 3. Standardized workflow for developing battery digital twins. The figure illustrates an end-to-end integration of data collection, digital representation, storage, predictive analytics, simulation, visualization, and feedback mechanisms for optimized battery management.
Figure 4. Digital twin modeling framework for lithium-ion battery management across the design, manufacturing, and operational phases, enabling intelligent control, predictive maintenance, and lifecycle optimization.
Figure 5. Digital twin-based architecture for Battery Management Systems (BMS), illustrating how real-time battery data flow through multiple layers for monitoring, analysis, and optimization.
Figure 6. Edge-cloud and digital twin-based architecture for Battery Management Systems (BMS). The system enables real-time battery monitoring, analysis, and control through local processing, cloud computing, and virtual modeling for smart energy applications.
Figure 7. Cloud-based BMS architecture integrating IaaS, PaaS, SaaS, and DaaS for real-time battery monitoring, analytics, and user interaction in electric vehicles.
Figure 8. Architecture of the Cyber-Physical Battery Management System (CP-BMS).
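To make the layered data flow depicted in Figures 3, 5, and 8 concrete, the following minimal Python sketch emulates one monitoring-and-feedback cycle of a battery digital twin. The twin model, thresholds, and control actions are illustrative assumptions for this review, not components of any specific system cited above.

```python
# Hypothetical sketch of the closed monitoring/feedback loop in Figures 3, 5, and 8.
# All names, parameters, and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Measurement:
    current_a: float      # discharge positive [A]
    voltage_v: float      # terminal voltage [V]
    temperature_c: float  # surface temperature [degC]
    dt_s: float           # sample interval [s]

class BatteryTwin:
    """Toy digital twin: coulomb-counting SOC plus a lumped thermal state."""
    def __init__(self, capacity_ah: float = 50.0, soc: float = 0.9):
        self.capacity_ah = capacity_ah
        self.soc = soc
        self.temp_c = 25.0

    def update(self, m: Measurement) -> None:
        # Coulomb counting: SOC decreases when discharging.
        self.soc -= m.current_a * m.dt_s / (self.capacity_ah * 3600.0)
        self.soc = min(max(self.soc, 0.0), 1.0)
        # First-order tracking of the measured temperature (stand-in for a thermal model).
        self.temp_c += 0.1 * (m.temperature_c - self.temp_c)

    def advise(self) -> str:
        # Feedback layer: map twin state to a simplified control decision.
        if self.temp_c > 45.0:
            return "derate_and_cool"
        if self.soc < 0.1:
            return "limit_discharge"
        return "normal"

if __name__ == "__main__":
    twin = BatteryTwin()
    stream = [Measurement(40.0, 3.6, 30.0 + i, 1.0) for i in range(20)]
    for m in stream:  # data collection -> twin update -> feedback
        twin.update(m)
        print(f"SOC={twin.soc:.4f}  T={twin.temp_c:.1f}C  action={twin.advise()}")
```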
Table 1. Summary of AI Models and Applications in Lithium-Ion Battery Systems.
| Model/Algorithm | Application Area | Key Contributions | Reference |
|---|---|---|---|
| TS-GAN (Time-Series GAN) | AI-empowered DT for SOC/SOH estimation | Synthesized accurate SOC profiles; improved energy efficiency, operational safety, and lifespan in BMS | [18] |
| X-ray CT and EIS-integrated AI BMS | SOC, SOH, fault detection | Model-free predictions; 40% improved fault detection sensitivity; reduced thermal risk | [19] |
| AI-PF (Particle Filter with adaptive mutation) | SOH, RUL prediction | 33% RMSE reduction; RUL prediction error ~4.94%; real-time suitability | [21] |
| XGBoost, LightGBM, SVM | Capacity prediction under variable load | Ensemble models for predictive power in dynamic conditions | [22] |
| Q-Learning | BMS optimization for EVs | Improved energy efficiency by 15%, battery lifespan by 20% | [23] |
| Hybrid AI Models (ANNs, Fuzzy Logic) | Battery health and power quality estimation | High-accuracy health estimation; identified DL/RL potential for adaptive control | [25] |
| Supervised ML Techniques (Classification/Regression) | Real-time BMS optimization | Enabled IoT/cloud-edge integration for adaptive management | [26] |
| AI Tools for Material Discovery | Electrode/electrolyte design | Accelerated high-energy-density material discovery, recycling optimization | [27] |
| LightGBM with SHAP (Explainable AI) | Capacity prediction and feature analysis | MAE: 0.103, MSE: 0.019, R2: 0.887; identified temperature as key degradation driver | [28] |
| Data-driven workflow (ML and text mining) | Material optimization | Enabled systematic electrolyte design; addressed data scarcity | [29] |
| Bi-LSTM with GRO (Gold Rush Optimizer) | SOH and capacity fade prediction | Outperformed LSTM and metaheuristics; enhanced global-local parameter tuning | [32] |
| Hybrid CNN-GRU-LSTM | SOC estimation | Error as low as 0.41%; fast computation (0.000113 s/sample) for onboard use | [33] |
| UKF (Unscented Kalman Filter) | SOC, SOH estimation | <1% SOC error under dynamic EV conditions; cloud-integrated tracking | [34] |
| AI-integrated EMS | Predictive maintenance and range extension | Improved predictive maintenance, adaptive routing; 12–18% range extension | [35] |
| Ensemble ML and lightweight ML | SOP, SOE, safety monitoring | Identified gaps in real-time safety management for BMS | [36] |
| Bidirectional Converter with Hybrid Power | Li-ion and supercapacitor hybrid systems | Improved thermal stability, resistance estimation, longevity | [37] |
| VGG16-based Deep Transfer Learning | RUL prediction | Modeled capacity degradation with voltage/capacity data; highlighted feature integration needs | [38] |
| Bagging Regressor | RUL prediction | R2: 0.999; MAE: 2.099; outperformed GB, KNN, ET | [39] |
| PatchFormer (Patch-based Transformer) | RUL prediction | Dual attention networks for global/temporal degradation tracking; high-fidelity forecasting | [40] |
| Random Forest Regression | SOC, SOH estimation | R2: 0.9999; MAE: 0.0035; effective for small datasets | [42] |
| DNN (Deep Neural Networks) | RUL prediction | High generalization for nonlinear degradation; predictive maintenance enablement | [46] |
| XML (Explainable Machine Learning) | Battery state/performance monitoring | Advanced explainability in battery ML pipelines; need for further development | [45] |
| Temperature-Adaptive NN (GRU, LSTM, MNN) | SOC estimation under thermal variation | MAE: 2.15%; 50% error reduction vs. traditional; edge-deployable | [55] |
| Extreme Learning Machine-Inspired Networks | Capacity fade prediction | RMSE: 1.3–2.7%; effective across thermal/aging profiles | [54] |
| Bi-LSTM with Dual Attention + DTV | SOH estimation | RMSE: 0.4%, MAE: 0.3%; ~10% accuracy improvement; robust across cycles | [57] |
| DDPG-enhanced Algorithm | Fast charging optimization | Improved efficiency/lifespan in simulations | [58] |
| Ensemble Learning + Cloud DT | SOC estimation for EV charging | NRMSE: 0.00047; future DL integration potential | [59] |
| ENDEAVOR (DT + UKF) | SOC estimation and charging optimization | Reduced grid load, user wait times, efficient node selection | [60] |
| Stochastic Modeling + Bayesian Evolution | Reliability-focused DT for EVs | <5% error in life prediction; >50% cost savings | [61] |
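As a non-authoritative illustration of the ensemble-regression style of state estimation summarized in Table 1 (e.g., the random-forest and gradient-boosting entries), the sketch below trains a random-forest SOC estimator on synthetic voltage, current, and temperature features. The data generator and hyperparameters are placeholder assumptions, not those of the cited studies; a scikit-learn environment is assumed.

```python
# Minimal sketch of ensemble-based SOC estimation (cf. Table 1).
# Synthetic data and hyperparameters are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 5000
soc = rng.uniform(0.05, 0.95, n)                 # ground-truth SOC
current = rng.uniform(-2.0, 2.0, n)              # load (C-rate style)
temperature = rng.uniform(5.0, 45.0, n)          # degC
# Crude open-circuit-voltage surrogate plus ohmic drop and measurement noise.
voltage = (3.0 + 1.2 * soc - 0.05 * current
           + 0.001 * (temperature - 25) + rng.normal(0, 0.005, n))

X = np.column_stack([voltage, current, temperature])
X_train, X_test, y_train, y_test = train_test_split(X, soc, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)
print(f"SOC MAE on held-out synthetic data: {mean_absolute_error(y_test, pred):.4f}")
```

In practice, the cited works replace the synthetic generator with logged drive-cycle or laboratory cycling data and tune features and hyperparameters accordingly.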
Table 2. AI-Enabled Digital Twin Models for Lithium-Ion Battery Management.
| Model/Algorithm | Application Area | Key Contributions | Reference |
|---|---|---|---|
| EKF + PSO hybrid system | SOC and SOH estimation in digital twins | Demonstrated effective SOC/SOH estimation under Gaussian noise; operational viability in embedded platforms | [62] |
| Hybrid HIF-PF algorithm | SOC estimation under dynamic conditions | Achieved SOC estimation error of 0.14%; addressed storage, sensitivity, and computational issues | [64] |
| Hybrid CNN-LSTM | SOC estimation across thermal and nonlinear dynamics | MAE < 1.5%; robust performance across temperatures; adaptive updates for aging | [65] |
| m-AIC optimized ARIMA + NARX + MLP | SOC forecasting under low C-rates | RMSE as low as 0.1323%; supports future adaptive observer integration | [66] |
| LSTM + EKF hybrid | SOC estimation with online correction | Improved robustness, lower error margins, adaptable learning for SOC/RUL | [70] |
| Improved Gray Wolf Optimization + AdaBoost | Discharge capacity prediction | MAE and RMSE as low as 0.01; outperformed LSTM models | [71] |
| Stochastic degradation modeling + Bayesian adaptive evolution | RUL estimation and predictive maintenance | 5% prediction error; 62% cost reduction in maintenance | [61] |
| Hierarchical Self-Evolving DT (Transformer + CNN + Transfer Learning) | SOC and SOH estimation | RMSE < 0.9% (SOC) and <0.8% (SOH); high precision and responsiveness | [77] |
| XGBoost + EKF hybrid | Online aging tracking and SOC estimation | Improved robustness in aging tracking with adaptive filtering | [78] |
| Supervised voting ensemble ML (Azure IIoT) | SOC estimation under dynamic driving | NRMSE 1.1446 (sim), 0.02385 (exp); robust EV range prediction | [59] |
| Adaptive LSTM with Adam optimization | Capacity estimation with hybrid DT | 68.42% reduction in capacity estimation error vs. conventional DT | [81] |
| Numerical electrochemical simulation + ML augmentation | Capacity prediction with reduced experimentation | <2% capacity prediction error; reduced experimental load by 99% | [82] |
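Several Table 2 entries pair a physics-based predictor with a filter-based correction (EKF-, HIF-, or LSTM+EKF-style trackers). The following sketch shows the underlying idea with a single-state Kalman filter that corrects coulomb counting against a linearized open-circuit-voltage measurement; the OCV curve, noise variances, and capacity are illustrative assumptions rather than values from the cited work.

```python
# Minimal sketch of a model/filter hybrid SOC tracker (cf. Table 2).
# All constants are illustrative assumptions.
import numpy as np

CAP_AS = 50.0 * 3600.0        # capacity [A*s]
OCV_A, OCV_B = 3.0, 1.2       # toy OCV(soc) = OCV_A + OCV_B * soc
Q, R = 1e-7, 1e-4             # process / measurement noise variances

def kf_soc(currents, voltages, dt=1.0, soc0=0.8, p0=1e-2):
    soc, p, history = soc0, p0, []
    for i_a, v in zip(currents, voltages):
        # Predict: coulomb counting (discharge current positive).
        soc -= i_a * dt / CAP_AS
        p += Q
        # Update: innovation on the (linear) OCV measurement.
        h = OCV_B                        # d(voltage)/d(soc)
        k = p * h / (h * p * h + R)      # Kalman gain
        soc += k * (v - (OCV_A + OCV_B * soc))
        p *= (1.0 - k * h)
        history.append(soc)
    return np.array(history)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_soc = 0.8 - np.cumsum(np.full(600, 25.0)) / CAP_AS
    volts = OCV_A + OCV_B * true_soc + rng.normal(0, 0.01, 600)
    est = kf_soc(np.full(600, 25.0), volts)
    print(f"final estimate {est[-1]:.4f} vs true {true_soc[-1]:.4f}")
```

The digital-twin variants in Table 2 extend this pattern by replacing the toy OCV relation with an electrochemical or learned model and by adapting the filter online as the cell ages.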
Table 3. Cloud-Based and Edge AI Solutions for Lithium-Ion Battery Management.
| Model/Architecture | Application Area | Key Contributions/Outcomes | Reference |
|---|---|---|---|
| Adaptive Extended H-infinity Filter + PSO hybrid | SOC estimation and capacity/power fade tracking | Robust accuracy across chemistries under dynamic loads; operational viability in stationary/mobile deployments | [85] |
| IoT-enabled cloud condition monitoring platforms (Raspberry Pi, Google Cloud) | Health assessment of large-scale Li-ion systems | Cost-effective, scalable, accurate health monitoring | [86] |
| AI-powered cloud BMS frameworks | State estimation, thermal management, predictive maintenance | Multi-timescale predictions; enhances reliability and safety without costly hardware upgrades | [87] |
| Compact NN with EIS + temperature data | Real-time SOC/SOH estimation on edge devices | Low memory/processing; edge deployment feasibility for predictive maintenance | [88] |
| IBMS (end-edge-cloud + DT + blockchain) | Predictive analytics and adaptive control | Secure, multilayered predictive system for EV/energy networks | [89] |
| GRU + transfer learning hybrid | SOC estimation in cloud-end collaborative BMS | Computationally efficient, scalable for renewables and EVs | [93] |
| Distributed BMS (STM32-based) | Decentralized real-time monitoring | High measurement accuracy; stable cell balancing and protection | [95] |
| Decentralized smart BMS + Petri Net modeling | Intelligent energy optimization and remote diagnostics | Effective off-grid deployment; enhanced safety and lifetime | [96] |
| Moving Window Least Squares (MWLS) in cloud | ECM parameter estimation for EV BMS | Lab-comparable voltage prediction in field; degradation tracking | [101] |
| CEBMS (cloud deep learning + edge predictive models) | Real-time SOC/SOH in EVs | Improved accuracy vs. onboard systems; scalable fleet-wide | [99] |
| Neural networks, RL, federated learning in cloud | SOC/SOH estimation, charging optimization | Enhanced diagnostics, privacy-preserving federated learning, adaptive RL | [100] |
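The edge/cloud division of labour described in Table 3 can be sketched as follows: the edge layer reduces raw telemetry to compact features, and the cloud layer aggregates them fleet-wide for diagnostics. The payload fields and the simple list standing in for the message transport (MQTT, Kafka, or a cloud IoT hub in the cited architectures) are illustrative assumptions.

```python
# Hypothetical sketch of an edge-to-cloud telemetry split (cf. Table 3).
# Transport and payload schema are illustrative assumptions.
import json
import statistics
from typing import Dict, List

def edge_summarize(cell_voltages_mv: List[int], pack_temp_c: float, vehicle_id: str) -> str:
    """Edge node: compress one monitoring window into a small JSON payload."""
    features = {
        "vehicle_id": vehicle_id,
        "v_min_mv": min(cell_voltages_mv),
        "v_max_mv": max(cell_voltages_mv),
        "v_spread_mv": max(cell_voltages_mv) - min(cell_voltages_mv),
        "temp_c": pack_temp_c,
    }
    return json.dumps(features)

def cloud_aggregate(payloads: List[str]) -> Dict[str, float]:
    """Cloud service: fleet-level statistics for diagnostics dashboards."""
    records = [json.loads(p) for p in payloads]
    return {
        "fleet_mean_spread_mv": statistics.mean(r["v_spread_mv"] for r in records),
        "fleet_max_temp_c": max(r["temp_c"] for r in records),
        "vehicles_reporting": len({r["vehicle_id"] for r in records}),
    }

if __name__ == "__main__":
    uplink = [
        edge_summarize([3612, 3598, 3605, 3580], 31.5, "EV-001"),
        edge_summarize([3701, 3690, 3655, 3699], 36.0, "EV-002"),
    ]
    print(cloud_aggregate(uplink))
```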
Table 4. AI-Enabled Thermal Control and Management in Lithium-Ion Batteries.
| Model/Technology | Application Area | Key Contributions | Reference |
|---|---|---|---|
| MSCC-DRL (Deep Reinforcement Learning) | Real-time thermal optimization under extreme conditions | Dynamic adaptability; supports ultra-fast charging; maintains energy density | [133] |
| Artificial Neural Networks (ANNs: Feedforward, CNN, RNN) | Prediction of SOC, SOH, RUL; fault detection | Accurate thermal-electrochemical state prediction; enhanced safety event anticipation | [134] |
| ML-driven BTMS with digital twins | Thermal distribution, gradient prediction, fire risk control | Real-time predictive diagnostics; adaptive control for EV and stationary systems | [135] |
| Material-level safety mechanisms (PTC electrodes, thermoresponsive coatings) | Intrinsic thermal runaway prevention | Interrupts ion/electron flow at critical temps; reduces reliance on external controls | [136] |
| AI-enhanced mist-based cooling and hybrid electrolytes | External thermal mitigation | Efficient heat dissipation; self-healing polymers improve resilience | [137] |
| Nanofluid-based coolant thermal management | High-density hybrid electric aircraft batteries | Superior heat dissipation; reduced capacity degradation at varied conditions | [138] |
| Thermal-electrochemical modeling (temp-dependent parameters) | Discharge behavior prediction | Accurate thermal-electrochemical interaction simulation; validated with pouch cells | [139] |
| Hybrid LSTM + FOPID BTMS | Battery temperature control in EVs | 98% accuracy; adaptive heating/cooling; extends battery life under dynamic loads | [140] |
| AI optimization of boiling heat transfer in immersion cooling | Thermal runaway mitigation | Identifies key coolant properties; validates immersion cooling viability | [141] |
| Random Forest and Gradient Boosting models | Predicting thermal performance in cylindrical cells | Highlights battery spacing impact; guides efficient BTMS design | [142] |
| AI + fuzzy MPC with airflow optimization | Thermal uniformity and energy efficiency in EV battery packs | 76.4% energy reduction; temp non-uniformity reduced from 1.5 °C to 0.6 °C | [143] |
| Deep learning for SOC and fault detection in BMS | Dynamic thermal management | Early fault detection; supports adaptive charging; improves longevity | [144] |
| Safety Reinforced Layer (SRL) with polythiophene/carbon | Edge AI for thermal runaway prevention | Reduces explosion rates (63% to 10%); scalable fabrication; effective in impact tests | [145] |
| Mini-channel cooling systems | Prevent thermal propagation across cells | Limits cascade failure; influenced by coolant flow and abuse kinetics | [146] |
| Forced-air cooling design for large ESS cabinets | Large-scale ESS thermal uniformity | Maintains temp rise <4.63 K; uniformity <2.82 K; compact design validated experimentally | [147] |
| Liquid cooling with streamlined channels for pouch cells | Cooling efficiency in large-format batteries | Mass flow rate and glycol content critical; trade-offs between cooling and pressure drop | [148] |
| Neural networks (CNN, RNN, Transformer) + attention and GANs | Real-time temperature prediction and regulation | Enhanced spatial and temporal modeling; challenges in efficiency and data needs | [149] |
| Hybrid CFD + ANN (Levenberg-Marquardt optimization) | Battery surface temperature prediction | High accuracy (MSE = 0.00552, R2 = 0.99); efficient alternative to CFD simulations | [150] |
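The thermal-control problem addressed by the BTMS entries in Table 4 can be reduced, for intuition, to a lumped-parameter cell heated by I²R losses and cooled by a switchable heat-transfer path. The following sketch simulates that loop with a simple bang-bang cooling law; the cell parameters, coolant effect, and control law are illustrative assumptions, not values from the cited studies, and stand in for the CFD, ANN, or MPC controllers they actually employ.

```python
# Minimal lumped-parameter sketch of the BTMS control problem (cf. Table 4).
# All parameters and the bang-bang control law are illustrative assumptions.
import numpy as np

MASS_KG, CP_J_PER_KG_K = 1.0, 900.0      # cell mass and specific heat
R_INT_OHM = 0.02                          # internal resistance
H_PASSIVE, H_ACTIVE = 0.5, 5.0            # heat-transfer coefficients [W/K]
T_AMB, T_TARGET = 25.0, 35.0              # ambient and control target [degC]

def simulate(current_a: float, hours: float = 1.0, dt: float = 1.0) -> np.ndarray:
    steps = int(hours * 3600 / dt)
    temp = T_AMB
    trace = np.empty(steps)
    for k in range(steps):
        q_gen = current_a ** 2 * R_INT_OHM               # I^2 R heating [W]
        h = H_ACTIVE if temp > T_TARGET else H_PASSIVE   # switch cooling on/off
        q_out = h * (temp - T_AMB)                       # heat rejected [W]
        temp += (q_gen - q_out) * dt / (MASS_KG * CP_J_PER_KG_K)
        trace[k] = temp
    return trace

if __name__ == "__main__":
    trace = simulate(current_a=50.0)
    print(f"peak temperature {trace.max():.1f} degC, final {trace[-1]:.1f} degC")
```

The AI-based approaches in Table 4 effectively replace the thermal model, the control law, or both with learned surrogates trained on CFD or experimental data.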
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
