1. Introduction
Global climate change and the rapid increase in greenhouse gas emissions pose critical challenges to sustainable development [1,2,3]. The transportation sector is a major contributor to carbon emissions, prompting a global transition toward cleaner energy solutions [4,5,6]. In this context, electric vehicles (EVs) have emerged as a promising alternative to conventional internal combustion engine vehicles, with the potential to significantly reduce emissions and improve energy efficiency [7,8,9]. As the core energy storage component of EVs, lithium-ion batteries (LIBs) have gained widespread adoption owing to their high energy density, long cycle life, and favourable electrochemical performance [10,11,12,13,14].
Despite these advantages, LIBs are highly sensitive to temperature variations during operation, particularly under high-load conditions such as fast charging and discharging [15,16,17,18]. To ensure peak performance and safety, LIBs must operate within an optimal temperature range of 25–40 °C, with the maximum temperature difference (ΔT) maintained below 5 °C [19,20,21,22]. Excessive temperature rise can lead to performance degradation, reduced lifespan, and even safety risks such as thermal runaway [23,24,25,26]. Moreover, non-uniform temperature distribution within battery modules can accelerate cell imbalance and capacity fading [27,28,29,30]. Beyond these immediate thermal safety concerns, battery lifespan degradation has emerged as a critical challenge for fast-charging applications. Elevated temperatures and non-uniform temperature distributions accelerate degradation mechanisms, including lithium plating, solid electrolyte interphase (SEI) growth, and electrode structural deterioration, leading to capacity fading and a reduced State of Health (SoH) [31,32]. Recent studies have proposed adaptive battery thermal management strategies that dynamically adjust thermal control parameters, such as coolant flow rate, in response to battery aging as reflected by the SoH [33,34]. These approaches enable more accurate quantification of the relationship between temperature evolution and degradation mechanisms, thereby improving long-term durability. In particular, SoH-aware models provide a theoretical framework for linking thermal behavior with capacity fading, offering significant potential for optimizing dynamic thermal management strategies in LIB systems. In short, battery lifespan is strongly affected by temperature sensitivity and dynamic fast-charging conditions; integrating an SoH-based adaptive flow control strategy can further reduce degradation caused by thermal stress and improve the long-term robustness of the cooling system. Although the present study focuses on thermal–hydraulic optimization, controlling the maximum temperature and temperature gradients can be regarded as key indirect indicators for mitigating long-term degradation. Therefore, the development of an efficient battery thermal management system (BTMS) is essential not only for performance and safety but also for extending battery lifetime [35,36,37,38].
Various BTMS strategies have been proposed in the literature, including air cooling, phase change material (PCM) cooling, heat pipe cooling, and liquid cooling [39,40,41,42]. Air cooling systems are simple and cost-effective but suffer from low heat transfer efficiency, making them less suitable for high-power applications [43,44,45,46]. PCM-based cooling can effectively absorb heat through latent heat storage; however, its limited thermal conductivity and challenges with regeneration constrain its performance [47,48,49,50]. Heat pipe systems offer high thermal conductivity and passive operation, but their integration complexity and cost can limit practical implementation [51,52,53,54]. Compared with these methods, liquid cooling systems provide superior heat dissipation owing to higher heat transfer coefficients, making them particularly suitable for high-performance battery systems [55,56,57,58,59].
Liquid cooling can be broadly classified into direct and indirect methods [60,61,62]. In direct liquid cooling, the coolant is in direct contact with the battery, offering excellent thermal performance [63,64,65]; however, this approach poses risks of leakage and material incompatibility [66,67,68]. In contrast, indirect liquid cooling, in which the coolant flows through cooling plates attached to the battery surface, offers a safer and more reliable solution while maintaining high cooling efficiency [69,70,71]. Indirect liquid cooling has therefore become a widely adopted approach in practical EV battery systems and is the focus of the present study.
Several studies have investigated the thermal performance of liquid-cooled battery systems with a cooling plate by analysing key design and operating parameters. You et al. examined a cooling plate-based BTMS for LIBs, analysing the effects of flow rate, channel width, and channel number. Numerical results showed that increasing the coolant flow rate from 2 to 6 LPM reduced the average battery temperature from 53.8 °C to 50.7 °C, while the pumping power (Ppump) rose sharply from 0.036 to 0.808 W. Additionally, increasing the channel width and the number of channels improved cooling performance and reduced the pressure drop (ΔP). Under optimized conditions, a six-channel configuration achieved enhanced temperature uniformity and lower energy consumption, and a combined top-and-bottom cooling strategy reduced the average temperature to 35.8 °C, demonstrating a substantial improvement in thermal management effectiveness [72]. Similarly, Song et al. investigated the thermal performance of liquid-cooled battery cooling plates using numerical simulations combined with grey correlation analysis, considering the runner structure, plate thickness, inlet temperature, and flow rate. The results showed that serpentine channels provide the best heat dissipation but incur the highest ΔP, while leaf-vein structures achieve a better balance between thermal performance and hydraulic losses. Increasing the plate thickness slightly reduces the maximum temperature (Tmax) and ΔP, whereas higher flow rates improve cooling but significantly increase ΔP. Optimal performance was obtained at a plate thickness of 4 mm, an inlet temperature of 25 °C, and a flow rate of 0.5 m/s, achieving a Tmax of 29.99 °C and a ΔT of 4.97 °C [73].
Building on these parametric studies, recent research has emphasized the importance of flow distribution and structural design. Tang et al. investigated the thermal performance of cooling plates for LIB modules through a validated three-dimensional numerical model. The results demonstrated that the main/side-wall cooling design significantly enhances temperature uniformity, reducing ΔT from 2.68 °C to 0.68 °C under comparable conditions. Further optimization of the flow channel revealed that a serpentine main channel combined with shortened side-wall channels achieved the best performance by improving coolant distribution, with approximately 81% of the flow directed to the main wall. Under high discharge rates, the optimized structure maintained ΔT below 5 °C, satisfying thermal safety requirements [74]. Likewise, Ding et al. proposed a cross-flow microchannel cold plate to address non-uniform flow distribution, local hotspots, and high ΔP in BTMSs. An L16 orthogonal experimental design was employed to investigate the effects of coolant type, inlet temperature, and flow velocity. The results showed that lower inlet temperatures and higher flow velocities significantly enhanced cooling performance but increased ΔP. The optimal configuration reduced the heat source temperature by 2.43 °C while decreasing ΔP by approximately 16.3% compared with reference cases. Additionally, a Nusselt number correlation with an R2 of 0.978 was developed, demonstrating high predictive accuracy for thermal performance [75].
More advanced design approaches have focused on structural and topology optimization of cooling plates. Ren et al. developed a liquid-cooling plate design based on topology optimization and a biomimetic simplification approach to enhance both thermal performance and manufacturability. A two-dimensional topology optimization method was first employed to generate an optimal flow channel configuration, which was subsequently simplified into a bionic cooling plate (BCP) inspired by streamlined natural structures. Numerical simulations demonstrated that both the topology-optimized cooling plate (TCP) and the BCP significantly outperformed conventional straight mini-channel plates. Specifically, within a flow rate range of 300–600 mLPM, Tmax was maintained below 35 °C, while ΔT was reduced to 1.17–2.35 °C. The optimized designs also improved temperature uniformity by over 70% and reduced Tmax by approximately 11–12% compared with traditional designs, while maintaining comparable ΔP characteristics [76]. Xie et al. proposed a simplified liquid-cooling structure for large-format LIB modules that employs two liquid-cooling plates. A three-dimensional CFD model was developed to evaluate thermal performance under a 3C discharge condition. Systematic optimization of the inlet velocity, channel width, and plate thickness showed that the flow channel configuration plays a critical role in temperature uniformity. In particular, a redesigned dual-inlet, single-outlet channel effectively mitigated edge overcooling and improved coolant distribution. The optimized configuration achieved a Tmax below 31.8 °C and a ΔT of approximately 3.7 °C while maintaining a lightweight structure [77]. In addition, multi-objective optimization techniques have increasingly been adopted to balance thermal and hydraulic performance. Yang et al. investigated the multi-objective topology optimization of liquid cooling plates for battery thermal management under 5C discharge conditions using a combined RSM-NSGA-II-TOPSIS framework, optimizing the inlet and outlet configurations to minimize Tmax, ΔT, and ΔP. The optimized design reduced Tmax by approximately 0.13–0.22 K compared with conventional layouts and by up to 2.6 K relative to straight-channel designs. In addition, ΔP decreased by up to 18.75%, while the performance evaluation criterion (PEC) improved by up to 79.4% at higher inlet velocities [78].
Despite the significant progress in cooling plate design and optimization, several limitations remain in the existing literature. First, many studies focus on either geometric parameters or operating conditions in isolation, without systematically investigating their coupled effects on thermal and hydraulic performance. Second, although multi-objective optimization methods have been applied, the integration of high-fidelity CFD models with data-driven surrogate models, such as artificial neural networks (ANNs), remains relatively limited, particularly for large-scale and complex battery module configurations. Third, most existing works are conducted at the single-cell or simplified module level, with limited consideration of realistic battery pack configurations and high-rate charging scenarios. Notably, while several studies have investigated high discharge rates, thermal behavior under fast-charging conditions has received comparatively less attention, despite its increasing importance in practical EV applications. To address these gaps, the present study proposes a comprehensive multi-objective optimization framework for a cooling plate-based indirect liquid cooling system applied to a large-scale 6S2P LIB module under 3C fast-charging conditions. A coupled CFD-MSMD-NTGK model is developed and experimentally validated to accurately capture battery heat generation and thermal behavior. Furthermore, an ANN-based surrogate model is constructed to efficiently predict system performance, and a hybrid NSGA-II-TOPSIS approach is employed to simultaneously optimise Tmax, ΔT, and Ppump. By systematically considering the combined effects of channel number, channel width, channel spacing, and coolant mass flow rate, this study provides a more comprehensive and practical design methodology for advanced BTMSs. The remainder of this paper is organised as follows.
Section 2 describes the model development and numerical methodology, including the battery model, boundary conditions, and experimental validation. Section 3 presents the results and discussion, including parametric analysis and multi-objective optimization results. Finally, Section 4 summarises the study’s key findings and conclusions.
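To make the design-space sampling step behind such surrogate models concrete, the sketch below draws a Latin hypercube sample over three design variables of the kind considered here (channel width, channel spacing, coolant mass flow rate). The bounds and sample size are illustrative placeholders, not the values used in this study; the property enforced is the defining one, that each variable receives exactly one sample per equal-width stratum.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin hypercube sample: one point per equal-width stratum in each
    dimension, with stratum order shuffled independently per dimension."""
    rng = np.random.default_rng(rng)
    d = len(bounds)
    # one independent permutation of strata per dimension, jittered within strata
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
    u = (strata + rng.random((n_samples, d))) / n_samples  # unit-cube sample
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    return lo + u * (hi - lo)

# hypothetical bounds: channel width [mm], channel spacing [mm], flow rate [LPM]
bounds = [(2.0, 8.0), (2.0, 6.0), (0.5, 3.0)]
X = latin_hypercube(50, bounds, rng=42)   # 50 CFD design points to evaluate
```

Each row of `X` would then be evaluated with the CFD model to produce the (Tmax, ΔT, Ppump) training targets for the ANN.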
4. Conclusions
In this study, a comprehensive multi-objective optimization framework was developed to enhance the thermal–hydraulic performance of a cooling plate-based indirect liquid cooling system for a 6S2P LIB module under 3C fast charging conditions. A coupled CFD-MSMD-NTGK model was established and experimentally validated, demonstrating good agreement with measured data and confirming the numerical approach’s reliability.
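For reference, the NTGK submodel of the MSMD framework expresses the cell source terms empirically as a current transfer rate j = Y(DoD)·(U(DoD) − V) and a heat generation rate q = j·[U − V − T dU/dT]. The sketch below evaluates this structure with made-up polynomial fits for Y and U (geometric scale factors folded into Y); the coefficients are hypothetical placeholders, not the validated parameters of this study.

```python
import numpy as np

def ntgk_heat_source(V, T, dod, Y_fit, U_fit, dUdT=-1e-4):
    """Empirical NTGK source terms: current transfer j = Y(DoD)*(U(DoD) - V)
    and heat generation q = j*(U - V - T*dU/dT) (irreversible + reversible)."""
    Y = np.polyval(Y_fit, dod)   # effective conductance, fitted vs. depth of discharge
    U = np.polyval(U_fit, dod)   # open-circuit potential [V], fitted vs. DoD
    j = Y * (U - V)              # current transfer rate
    q = j * (U - V - T * dUdT)   # heat generation rate
    return j, q

# hypothetical quadratic fits in DoD (illustrative only)
Y_fit = [-5.0, 2.0, 10.0]
U_fit = [-0.8, 0.2, 4.1]
j, q = ntgk_heat_source(V=3.6, T=300.0, dod=0.5, Y_fit=Y_fit, U_fit=U_fit)
```

In the coupled simulation, `q` feeds the CFD energy equation as the cell's volumetric heat source at each time step, which is how the electrochemical and thermal fields are linked.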
A detailed parametric analysis revealed that the coolant mass flow rate is the dominant factor affecting system performance. Increasing the flow rate significantly reduced the Tmax and ΔT but resulted in a substantial increase in ΔP and Ppump. In contrast, geometric parameters such as channel number, channel width, and channel spacing exhibited comparatively smaller but still notable effects. Among these, increasing the number of channels improved both thermal performance and flow uniformity, while channel width showed a clear trade-off between heat transfer enhancement and hydraulic losses. Channel spacing was found to have a relatively minor influence on thermal performance but affected ΔP behavior.
To efficiently explore the design space, Latin hypercube sampling was employed to generate training data for ANN-based surrogate models. The developed ANN models demonstrated excellent predictive capability, with high R2 values and low RMSE across all performance metrics. Subsequently, a hybrid NSGA-II-TOPSIS optimization approach was applied to identify the optimal design configuration. The optimal solution was obtained with a coolant channel width of 6.22 mm, a channel spacing of 4.84 mm, and a coolant mass flow rate of 2.55 LPM. Under these conditions, the system achieved a Tmax of 30.47 °C, a ΔT of 4.50 °C, and a Ppump of 0.05879 W, satisfying both thermal safety requirements and energy efficiency criteria.
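To make the TOPSIS selection step concrete, the following sketch ranks a small set of hypothetical Pareto-front candidates, given as (Tmax [°C], ΔT [°C], Ppump [W]) triples chosen purely for illustration, by relative closeness to the ideal point, assuming equal objective weights and all objectives minimised.

```python
import numpy as np

def topsis(F, weights=None):
    """Rank candidates by relative closeness to the ideal point.
    F: (n, m) objective matrix with all objectives to be minimised."""
    F = np.asarray(F, dtype=float)
    w = np.ones(F.shape[1]) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    V = w * F / np.linalg.norm(F, axis=0)        # vector-normalised, weighted
    ideal, anti = V.min(axis=0), V.max(axis=0)   # best / worst per objective
    d_pos = np.linalg.norm(V - ideal, axis=1)    # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)     # distance to anti-ideal
    return d_neg / (d_pos + d_neg)               # closeness: higher is better

# hypothetical Pareto candidates: [Tmax, dT, Ppump]
F = np.array([[30.5, 4.5, 0.059],
              [29.8, 4.9, 0.210],
              [32.1, 3.8, 0.031]])
scores = topsis(F)
best = int(np.argmax(scores))
```

With these illustrative numbers the high-Ppump candidate is penalised heavily, showing how TOPSIS converts the Pareto front into a single compromise design.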
Furthermore, validation of the optimized results showed that the deviations between ANN predictions and CFD simulations were all below 1%, confirming the robustness and accuracy of the proposed optimization framework. Overall, this study demonstrates that integrating CFD simulation, experimental validation, surrogate modelling, and multi-objective optimization provides an effective and practical methodology for the design of advanced BTMSs. Although the present study focuses on thermal–hydraulic optimization, the obtained improvements in maximum temperature and temperature uniformity directly mitigate thermally induced degradation mechanisms, since battery lifespan is strongly affected by temperature sensitivity under dynamic fast-charging conditions. Integrating an SoH-based adaptive flow control strategy could further reduce degradation caused by thermal stress, thereby improving the long-term robustness of the BTMS. In future work, the proposed framework can be extended by incorporating electrochemical aging models and SoH-based adaptive thermal management strategies; in particular, coupling thermal–hydraulic optimization with SoH-aware dynamic flow control could enable real-time adjustment of coolant flow rates to mitigate degradation under varying operating conditions, enhancing battery lifespan and overall system efficiency. In addition, it should be noted that the present study evaluates energy consumption based on component-level pumping power; a comprehensive assessment of total energy consumption would also include auxiliary systems such as chillers, radiators, and compressors. Future work will therefore integrate the proposed cooling plate design with system-level thermal management models to evaluate overall vehicle energy efficiency and thermal–electric coupling effects.