In recent years, the rapid advancement of renewable energy sources has underscored the critical need for efficient and reliable energy storage technologies to address imbalances in energy supply and demand. Among various solutions, liquid-cooled energy storage battery PACKs have gained prominence due to their high energy density and superior thermal management capabilities, making them indispensable in modern energy storage systems. The introduction of stringent national standards, such as GB/T 36276—2023, which imposes stricter requirements on the energy efficiency of battery PACKs and clusters compared to previous versions, has further emphasized the importance of optimizing performance. In this study, we investigate the factors influencing the charge-discharge energy efficiency of liquid-cooled energy storage battery PACKs through systematic experimental testing. Our focus is on analyzing the effects of key parameters, including the automatic cooling temperature of the chiller unit, the initial average temperature of energy storage cells, and the charge-discharge voltage range, to provide theoretical insights for the design and management of energy storage systems.
The experimental setup involved a liquid-cooled energy storage battery PACK comprising 52 series-connected lithium iron phosphate energy storage cells, each with a capacity of 314 Ah. The operating voltage range for each energy storage cell was 2.5 V to 3.65 V, with a nominal voltage of 3.2 V. Standard charge and discharge tests were conducted at a constant power of 0.5P (502.4 W per cell, i.e., approximately 26.1 kW for the 52-cell PACK), following the guidelines outlined in GB/T 36276—2023. A high-precision charge-discharge tester was employed to record energy data, while a custom-built chiller unit was used for cooling the battery PACK. The energy efficiency was calculated using the formula: $$\eta = \frac{\int_0^{t_d} U_d(t) I_d(t)\, dt}{\int_0^{t_c} U_c(t) I_c(t)\, dt} \times 100\%$$ where \(U_c(t)\) and \(U_d(t)\) are the instantaneous voltages during charging and discharging, respectively, \(I_c(t)\) and \(I_d(t)\) are the corresponding currents, and \(t_c\) and \(t_d\) are the charging and discharging times. Each test condition was repeated three times, and the average values were used for analysis to ensure reliability.
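As an illustration of how this integral is evaluated in practice, the sketch below computes \(\eta\) from logged voltage and current samples with the trapezoidal rule. The array names, the synthetic ramp profiles, and the 1 s sampling interval are assumptions for demonstration, not details of the actual tester.

```python
import numpy as np

def energy_wh(voltage_v: np.ndarray, current_a: np.ndarray, dt_s: float) -> float:
    """Trapezoidal integration of instantaneous power U(t)*I(t), in Wh."""
    power_w = voltage_v * current_a
    return float(np.sum(0.5 * (power_w[1:] + power_w[:-1])) * dt_s / 3600.0)

def charge_discharge_efficiency(u_c, i_c, u_d, i_d, dt_s=1.0) -> float:
    """eta = (discharge energy / charge energy) * 100%."""
    return 100.0 * energy_wh(u_d, i_d, dt_s) / energy_wh(u_c, i_c, dt_s)

# Synthetic single-cell example (assumed 1 s sampling; current held near
# 157 A, i.e. 502.4 W / 3.2 V, to mimic a constant-power 0.5P test):
n = 7200                              # ~2 h of samples, consistent with 0.5P
u_c = np.linspace(3.20, 3.65, n)      # illustrative charging voltage ramp
u_d = np.linspace(3.40, 2.90, n)      # illustrative discharging voltage ramp
i = np.full(n, 157.0)
print(f"eta = {charge_discharge_efficiency(u_c, i, u_d, i):.2f} %")
```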

To evaluate the impact of the chiller’s automatic cooling temperature on energy efficiency, tests were performed at temperatures of 18°C, 20°C, 22°C, 24°C, and 26°C. The battery PACK was stabilized at each temperature for over 8 hours before conducting charge-discharge cycles. The results, summarized in Table 1, indicate a gradual improvement in energy efficiency as the cooling temperature increased. However, this was accompanied by a rise in the voltage difference at the end of charge and discharge, which eventually limited further gains in efficiency. For instance, at 18°C, the energy efficiency was 93.68%, while at 24°C, it reached 95.07%. Beyond this point, the increase in energy efficiency plateaued, with only a marginal improvement to 95.16% at 26°C. This behavior can be attributed to reduced polarization effects in the energy storage cells at higher temperatures, which minimize energy losses. Nonetheless, excessive temperatures exacerbate voltage inconsistencies, leading to premature termination of charge-discharge cycles and reduced energy output. Thus, an optimal chiller automatic cooling temperature of approximately 24°C is recommended for balancing energy efficiency and voltage stability.

Table 1. Effect of chiller automatic cooling temperature on charge-discharge performance

| Cooling Temperature (°C) | Charging Energy (kWh) | Discharging Energy (kWh) | Energy Efficiency | Charging End Voltage Difference (mV) | Discharging End Voltage Difference (mV) |
|---|---|---|---|---|---|
| 18 | 55.72 | 52.20 | 93.68% | 110 | 138 |
| 20 | 55.62 | 52.55 | 94.48% | 125 | 162 |
| 22 | 55.83 | 52.96 | 94.86% | 136 | 153 |
| 24 | 56.03 | 53.27 | 95.07% | 151 | 207 |
| 26 | 55.95 | 53.24 | 95.16% | 211 | 293 |
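
To quantify where the gains plateau, the short sketch below recomputes the efficiencies from the energy columns of Table 1 and reports the per-step improvement. The 0.10 percentage-point cutoff is an arbitrary screening threshold chosen for illustration, not a value taken from GB/T 36276—2023.

```python
import numpy as np

# Table 1: cooling temperature (°C), charging/discharging energy (kWh),
# discharge-end voltage difference (mV)
temps = np.array([18, 20, 22, 24, 26])
e_chg = np.array([55.72, 55.62, 55.83, 56.03, 55.95])
e_dis = np.array([52.20, 52.55, 52.96, 53.27, 53.24])
dv_dis = np.array([138, 162, 153, 207, 293])

eta = 100.0 * e_dis / e_chg      # matches the Energy Efficiency column
gain = np.diff(eta)              # efficiency gain per 2 °C step
dv_rise = np.diff(dv_dis)        # growth of end-voltage spread per step

for j in range(1, temps.size):
    print(f"{temps[j-1]}→{temps[j]} °C: +{gain[j-1]:.2f} pct-pt efficiency, "
          f"{dv_rise[j-1]:+d} mV spread")

# Screening rule (the 0.10 pct-pt cutoff is an illustrative assumption):
idx = int(np.argmax(gain < 0.10))
if gain[idx] < 0.10:
    print(f"diminishing returns above {temps[idx]} °C")   # → 24 °C
```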
Next, we examined the influence of the initial average temperature of the energy storage cells on energy efficiency. The initial average temperature refers to the mean temperature of all 52 energy storage cells before charging, and it was adjusted by modifying the chiller's temperature setpoint. Tests were conducted at initial average temperatures of 25°C, 29°C, and 33°C, with the results presented in Table 2. The data reveal a non-linear relationship: energy efficiency first increased with temperature, peaked at 29°C, and then decreased. At 25°C, the energy efficiency was 93.95%; it rose to 94.47% at 29°C but dropped to 94.03% at 33°C. This trend stems from the internal temperature dynamics of the cells during cycling: because the cells heat up as charging proceeds, the influence of the initial temperature fades over the cycle, so it mainly affects the charging energy and has little impact on the discharging energy. The optimal initial average temperature for maximizing energy efficiency was therefore around 29°C (a simple quadratic interpolation of the three test points, sketched after Table 2, places the optimum near 29°C as well), highlighting the importance of thermal management in enhancing the performance of energy storage cells.

Table 2. Effect of initial average cell temperature on charge-discharge performance

| Initial Average Temperature (°C) | Charging Energy (kWh) | Discharging Energy (kWh) | Energy Efficiency | Charging End Voltage Difference (mV) | Discharging End Voltage Difference (mV) |
|---|---|---|---|---|---|
| 25 | 56.40 | 52.99 | 93.95% | 121 | 138 |
| 29 | 56.07 | 52.97 | 94.47% | 118 | 153 |
| 33 | 56.43 | 53.06 | 94.03% | 116 | 118 |
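
Because only three temperatures were tested, the optimum can be estimated by interpolating the three rows of Table 2 with a parabola, as sketched below. This is an illustrative interpolation, not a claim about the true shape of the efficiency curve.

```python
import numpy as np

# Table 2: initial average cell temperature (°C) vs energy efficiency (%)
t_init = np.array([25.0, 29.0, 33.0])
eta = np.array([93.95, 94.47, 94.03])

# With three points, a parabola is the simplest interpolant that can
# capture the observed rise-then-fall behavior.
a, b, c = np.polyfit(t_init, eta, deg=2)
t_opt = -b / (2.0 * a)                 # vertex of the fitted parabola
eta_opt = np.polyval([a, b, c], t_opt)
print(f"estimated optimum: {t_opt:.1f} °C, eta ~ {eta_opt:.2f} %")
# Output: estimated optimum: 29.2 °C, eta ~ 94.47 %
```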
Furthermore, we investigated the effect of the charge-discharge voltage range on energy efficiency. Using the standard range of 2.5 V to 3.65 V as a baseline, we tested narrower intervals: 2.5 V to 3.55 V, 2.9 V to 3.65 V, and 2.9 V to 3.55 V. The tests were conducted under the optimal conditions identified above: a chiller automatic cooling temperature of 24°C and an initial average cell temperature of 29°C. The results, detailed in Table 3, demonstrate that narrowing the voltage range improves energy efficiency and decreases the voltage difference at the end of charge and discharge, at the cost of only a minor reduction in total energy. For example, in the 2.9 V to 3.55 V range, energy efficiency reached 94.91%, compared with 94.47% over the baseline range, while the charging and discharging end voltage differences fell by more than half (from 118 mV to 52 mV and from 153 mV to 53 mV, respectively); the discharging energy dropped by only about 2.4%, from 52.97 kWh to 51.70 kWh. This improvement is linked to the voltage plateau characteristic of lithium iron phosphate cells: the middle stage of the charge-discharge cycle exhibits minimal voltage variation and accounts for over 95% of the energy, whereas the extremes of the voltage range show sharp increases in voltage difference, intensifying internal resistance polarization and reducing energy efficiency. The relationship between voltage range and energy efficiency can be modeled as: $$\eta(V) = \eta_0 - k \cdot (V_{\text{max}} - V_{\text{min}})^2$$ where \(\eta_0\) is the extrapolated efficiency for a vanishing voltage window, \(k\) is a positive empirical constant, and \(V_{\text{max}}\) and \(V_{\text{min}}\) are the upper and lower voltage limits. This emphasizes the benefit of operating within an optimized voltage window to enhance the overall performance of energy storage cells; a rough least-squares fit of this model to the Table 3 data is sketched after the table.

Table 3. Effect of charge-discharge voltage range on charge-discharge performance

| Charge-Discharge Range (V) | Charging Energy (kWh) | Discharging Energy (kWh) | Energy Efficiency | Charging End Voltage Difference (mV) | Discharging End Voltage Difference (mV) |
|---|---|---|---|---|---|
| 2.5–3.65 | 56.07 | 52.97 | 94.47% | 118 | 153 |
| 2.5–3.55 | 55.76 | 52.81 | 94.71% | 50 | 151 |
| 2.9–3.65 | 55.61 | 52.53 | 94.46% | 114 | 51 |
| 2.9–3.55 | 54.47 | 51.70 | 94.91% | 52 | 53 |
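
As a check on the quadratic window model above, the sketch below fits \(\eta_0\) and \(k\) to the four rows of Table 3 by least squares on the squared window width. With only four points, and efficiency that is not strictly monotonic in window width, this illustrates the fitting procedure rather than a validated model.

```python
import numpy as np

# Table 3: (V_min, V_max) window and measured energy efficiency (%)
windows = np.array([(2.5, 3.65), (2.5, 3.55), (2.9, 3.65), (2.9, 3.55)])
eta = np.array([94.47, 94.71, 94.46, 94.91])

width_sq = (windows[:, 1] - windows[:, 0]) ** 2   # (V_max - V_min)^2

# eta(V) = eta0 - k * width^2 is linear in width^2, so ordinary least
# squares suffices; polyfit returns [slope, intercept] = [-k, eta0].
slope, eta0 = np.polyfit(width_sq, eta, deg=1)
k = -slope
print(f"eta0 ~ {eta0:.2f} %, k ~ {k:.3f} %/V^2")

# Predicted vs measured, to eyeball the (rough) fit quality:
for (vmin, vmax), e_meas in zip(windows, eta):
    e_pred = eta0 - k * (vmax - vmin) ** 2
    print(f"{vmin:.1f}-{vmax:.1f} V: measured {e_meas:.2f} %, "
          f"model {e_pred:.2f} %")
```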
In-depth analysis of the polarization effects in energy storage cells reveals that internal resistance plays a crucial role in energy efficiency. The polarization voltage \(V_p\) can be expressed as: $$V_p = I \cdot R_i$$ where \(I\) is the current and \(R_i\) is the internal resistance, which is temperature-dependent. As temperature increases, \(R_i\) decreases, reducing polarization and improving efficiency. However, at higher temperatures, the inconsistency among energy storage cells leads to increased voltage differences, as observed in our experiments. This inconsistency can be quantified by the standard deviation of cell voltages: $$\sigma_V = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (V_i - \bar{V})^2}$$ where \(N\) is the number of energy storage cells, \(V_i\) is the voltage of the \(i\)-th cell, and \(\bar{V}\) is the average voltage. Our data show that \(\sigma_V\) increases with temperature, correlating with the rise in end voltage differences. Additionally, the energy loss due to polarization can be calculated as: $$E_{\text{loss}} = \int I^2 R_i \, dt$$ which highlights the trade-off between reduced internal resistance and increased voltage inconsistency. To optimize energy efficiency, we propose a comprehensive model that considers both thermal and electrical factors: $$\eta_{\text{opt}} = f(T, V_{\text{range}}, I)$$ where \(T\) is temperature, \(V_{\text{range}}\) is the voltage range, and \(I\) is current. This model can guide the design of control strategies for energy storage systems, ensuring that energy storage cells operate within their optimal parameters.
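To make these quantities concrete, the sketch below computes the voltage spread \(\sigma_V\) and the polarization loss integral from sampled data. The cell-voltage matrix, current trace, and temperature-dependent resistance model are all assumptions for illustration; in particular, the exponential \(R_i(T)\) decay is a generic placeholder, not a model fitted to the tested 314 Ah cells.

```python
import numpy as np

def voltage_spread(cell_voltages: np.ndarray) -> np.ndarray:
    """Population standard deviation across N cells at each time step.

    cell_voltages: array of shape (time_steps, n_cells).
    """
    return cell_voltages.std(axis=1, ddof=0)

def polarization_loss_wh(current_a: np.ndarray, r_internal_ohm: np.ndarray,
                         dt_s: float) -> float:
    """E_loss = integral of I^2 * R_i dt (trapezoidal rule), in Wh."""
    p_loss = current_a ** 2 * r_internal_ohm
    return float(np.sum(0.5 * (p_loss[1:] + p_loss[:-1])) * dt_s / 3600.0)

def r_internal(temp_c: np.ndarray, r25_ohm: float = 0.4e-3,
               alpha: float = 0.02) -> np.ndarray:
    """Placeholder resistance model: R_i falls with temperature."""
    return r25_ohm * np.exp(-alpha * (temp_c - 25.0))

# Illustrative data: 52 cells, 2 h at an assumed 1 s sampling interval.
rng = np.random.default_rng(0)
t = np.arange(0, 7200)
cells = 3.2 + 0.01 * rng.standard_normal((t.size, 52))
current = np.full(t.size, 157.0)           # ~0.5P current at 3.2 V
temp = np.linspace(25.0, 31.0, t.size)     # cells warm up during the cycle

print(f"mean sigma_V = {voltage_spread(cells).mean() * 1e3:.1f} mV")
print(f"E_loss ~ {polarization_loss_wh(current, r_internal(temp), 1.0):.1f} Wh")
```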
In conclusion, our experimental research on liquid-cooled energy storage battery PACKs demonstrates that energy efficiency is significantly influenced by the chiller automatic cooling temperature, the initial average temperature of energy storage cells, and the charge-discharge voltage range. We found that an optimal chiller temperature of around 24°C balances efficiency gains with voltage stability. The initial average temperature of energy storage cells should be maintained at approximately 29°C to maximize efficiency, since efficiency declines at higher initial temperatures. Moreover, narrowing the charge-discharge voltage range effectively enhances energy efficiency by minimizing polarization effects at the voltage extremes. These findings provide valuable insights for improving the performance of energy storage systems, emphasizing the importance of integrated thermal and electrical management. Future work could explore dynamic control algorithms that adapt these parameters in real time to further optimize the efficiency and longevity of energy storage cells in practical applications.
