Factors Influencing Energy Efficiency of Energy Storage Battery Cabinets

In the context of the global energy transition towards renewable sources, lithium-ion energy storage systems have become pivotal for enhancing grid flexibility due to their high energy density, rapid response capabilities, and modular deployment. However, in practical engineering applications, the energy efficiency of energy storage battery cabinets often falls below theoretical expectations, posing a significant challenge to their economic viability and large-scale adoption. Studies indicate that efficiency losses over the lifecycle of energy storage systems can range from 10% to 20%, with factors such as the charge-discharge voltage range, thermal management strategies, and ambient temperature being particularly critical. This paper aims to investigate the synergistic effects of these parameters on the energy efficiency of energy storage cells under complex operational conditions, with the goal of developing optimized strategies based on dynamic parameter calibration.

Our research focuses on a 372.736 kWh outdoor liquid-cooled energy storage battery cabinet operating at 1500V, utilizing a 1P52S configuration with lithium iron phosphate (LiFePO4) energy storage cells of 280Ah capacity. The nominal voltage per cell is 3.2V, with an operational range of 2.6V to 3.6V. The experimental setup included key equipment such as an energy storage converter, a walk-in temperature chamber, a dry-type transformer, and a soft-start cabinet, as summarized in Table 1. The tests were conducted to evaluate initial charge-discharge energy and DC internal resistance under varying conditions, with energy efficiency calculated as the ratio of initial discharge energy to initial charge energy, measured at the DC side without accounting for auxiliary power consumption.

Table 1: Experimental Equipment and Key Parameters
| No. | Equipment | Brand | Model | Key Parameters |
|-----|-----------|-------|-------|----------------|
| 1 | Energy Storage Converter | TBEA | TE186K-HV | Rated power: 187 kW |
| 2 | 30 m³ Walk-in Chamber | Jiangsu Ruilan | RLH-30MW-L55H85-ICE | Temperature range: -55°C to 85°C |
| 3 | Dry-Type Transformer | TBEA | XCS500-0.4-0.69 | Input voltage: 400 V; output voltage: 690 V |
| 4 | Soft-Start Cabinet | TBEA | GGD | Rated power: 500 kVA |
| 5 | Liquid Cooler | Envicool | EMW600HCNC1B | Cooling capacity: 8 kW |
| 6 | Temperature Acquisition | Keysight | | Control accuracy: < ±0.1°C |

The experimental methodology involved a series of steps to assess energy efficiency and DC internal resistance. For the initial charge-discharge energy tests, the energy storage battery cabinet was placed in the temperature chamber, which was set to a specified ambient temperature. After thermal stabilization, the cabinet was charged at a constant power of 186.368 kW until any cell reached the charge termination voltage (U1), rested for 10 minutes, and then discharged at the same power until any cell reached the discharge termination voltage (U2). This cycle was repeated to calculate energy efficiency, with power, current, temperature, and energy values recorded throughout. DC internal resistance tests were conducted at 50% state of charge (SOC) by applying constant-current pulses and measuring the resulting voltage changes, using the formulas: charging DC internal resistance $$ R_{rc} = \frac{U_4 - U_3}{I_{rc}} $$ and discharging DC internal resistance $$ R_{rd} = \frac{U_6 - U_5}{I_{rd}} $$, where both I_{rc} and I_{rd} are 280 A.
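
The two calculations described above can be sketched in a few lines: DC-side energy efficiency as a ratio of energies, and DC internal resistance from a constant-current voltage pulse. The voltage and energy values below are illustrative placeholders, not measured data from the tests.

```python
def energy_efficiency(discharge_energy_kwh: float, charge_energy_kwh: float) -> float:
    """Efficiency = initial discharge energy / initial charge energy (DC side)."""
    return discharge_energy_kwh / charge_energy_kwh

def dc_internal_resistance(u_start: float, u_end: float, pulse_current_a: float) -> float:
    """R = (U_end - U_start) / I for a constant-current pulse at 50% SOC."""
    return (u_end - u_start) / pulse_current_a

# Charging pulse (voltage rises from U3 to U4 under I_rc = 280 A):
r_rc = dc_internal_resistance(3.30, 3.38, 280.0)       # ohms
# Discharging pulse (voltage drops from U5 to U6 under I_rd = 280 A);
# the magnitude is taken so the resistance is reported as a positive value:
r_rd = abs(dc_internal_resistance(3.30, 3.22, 280.0))  # ohms

eff = energy_efficiency(373.64, 392.3)  # placeholder kWh figures
```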

Thermal management strategies were implemented using a liquid cooling system, with operational modes including cooling, heating, self-circulation, and shutdown, as detailed in Table 2. The strategies were designed to maintain optimal temperature conditions for the energy storage cells, with thresholds based on maximum (T_max), average (T_avg), and minimum (T_min) cell temperatures. For instance, the heating mode was activated when T_min fell below specific setpoints, with outlet water temperatures (T1, T2, T3) adjusted to optimize performance.

Table 2: Thermal Management Strategy Logic
| Mode | Activation Condition | Deactivation Condition | Command to Cooler |
|------|----------------------|------------------------|-------------------|
| Cooling | T_max ≥ 25°C and T_avg ≥ 24°C | T_max < 22°C and T_avg < 21°C | Set outlet temperature to 30°C for cooling |
| Heating | T_min < 14°C | T_min ≥ 21°C | Set outlet temperature to T1 for heating |
| Heating | 14°C ≤ T_min < 17°C | T_min ≥ 21°C | Set outlet temperature to T2 for heating |
| Heating | 17°C ≤ T_min < 20°C | T_min ≥ 21°C | Set outlet temperature to T3 for heating |
| Self-Circulation | Heating/cooling deactivated and ΔT ≥ 5°C | | Activate pump only |
| Shutdown | No active mode conditions | | Send shutdown command |
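
The activation side of the Table 2 logic can be sketched as a simple priority check. This is a deliberate reduction: the real controller also applies the separate deactivation (hysteresis) thresholds from the table, which are omitted here for brevity, and the command strings are placeholders.

```python
def select_cooler_command(t_max: float, t_avg: float, t_min: float, delta_t: float) -> str:
    """Map cell temperature statistics (deg C) to a liquid-cooler command.

    Activation conditions only; hysteresis on deactivation is not modeled.
    """
    if t_max >= 25.0 and t_avg >= 24.0:
        return "cooling: set outlet to 30C"
    if t_min < 14.0:
        return "heating: set outlet to T1"
    if t_min < 17.0:
        return "heating: set outlet to T2"
    if t_min < 20.0:
        return "heating: set outlet to T3"
    if delta_t >= 5.0:          # cell temperature spread triggers circulation
        return "self-circulation: pump only"
    return "shutdown"
```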

The impact of ambient temperature on the energy efficiency of energy storage cells was evaluated by testing the battery cabinet across a range from 5°C to 60°C, with charge-discharge voltages set to 2.8V and 3.6V, and the liquid cooler deactivated. Results demonstrated that energy efficiency increased with temperature in the 5°C to 50°C range, but declined beyond 50°C due to accelerated side reactions in the electrolyte. The relationship between ambient temperature (X) and energy efficiency (Y) was modeled using a polynomial equation: $$ Y = 0.86129 + 0.00779X - 2.91487 \times 10^{-4} X^2 + 8.03825 \times 10^{-6} X^3 - 1.63113 \times 10^{-7} X^4 + 2.12115 \times 10^{-9} X^5 - 1.25571 \times 10^{-11} X^6 $$ with a coefficient of determination R² = 0.99606. Concurrently, DC internal resistance measurements showed a decrease with rising temperature up to 50°C, followed by an increase, attributed to electrode expansion and electrolyte loss at higher temperatures.
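
The fitted polynomial can be evaluated directly from the reported coefficients, for example to interpolate the expected efficiency at an intermediate ambient temperature. A minimal sketch (valid only inside the tested 5°C to 60°C range):

```python
# Coefficients a0..a6 of the reported fit Y(X), with X in deg C and
# Y the energy efficiency as a fraction.
COEFFS = [0.86129, 0.00779, -2.91487e-4, 8.03825e-6,
          -1.63113e-7, 2.12115e-9, -1.25571e-11]

def efficiency_model(ambient_temp_c: float) -> float:
    """Evaluate the 6th-order polynomial fit via Horner's scheme."""
    y = 0.0
    for a in reversed(COEFFS):
        y = y * ambient_temp_c + a
    return y
```

Evaluating the fit confirms the reported trend: efficiency rises with temperature up to about 50°C and then declines toward 60°C.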

To assess the influence of charge-discharge voltage range, multiple energy storage battery cabinets were tested with varying voltage windows: 2.6V–3.6V, 2.7V–3.6V, 2.8V–3.6V, and 2.85V–3.6V. Findings indicated that as the discharge termination voltage decreased, energy efficiency declined, but discharge energy increased. For instance, the 2.8V–3.6V range yielded an energy efficiency of 95.24% and discharge energy of 373.64 kWh, meeting design targets of ≥95% efficiency and ≥372.736 kWh discharge energy. In contrast, narrower ranges like 2.85V–3.6V resulted in lower discharge energy (372.8 kWh), while wider ranges such as 2.6V–3.6V led to efficiencies below 94%. This behavior is linked to increased polarization losses at lower voltages, where Li+ diffusion in the FePO4 phase becomes hindered, elevating internal resistance. Thus, the 2.8V–3.6V range was identified as optimal for balancing efficiency and energy output in energy storage cells.
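
The window-selection trade-off above can be expressed as a small screening step: keep only the windows meeting both design targets, then prefer the one with the larger discharge energy. The figures for the 2.8V–3.6V and 2.85V–3.6V windows are taken from the text; the other two entries are placeholders consistent with the reported trends, not measured values.

```python
TARGET_EFFICIENCY = 0.95
TARGET_ENERGY_KWH = 372.736

# (energy efficiency, discharge energy in kWh) per voltage window
windows = {
    "2.6-3.6 V":  (0.938, 374.9),    # placeholder: efficiency below 94%
    "2.7-3.6 V":  (0.947, 374.3),    # placeholder
    "2.8-3.6 V":  (0.9524, 373.64),  # reported
    "2.85-3.6 V": (0.9560, 372.80),  # reported energy; efficiency a placeholder
}

# Screen out windows missing either target, then pick the highest energy.
qualifying = {name: vals for name, vals in windows.items()
              if vals[0] >= TARGET_EFFICIENCY and vals[1] >= TARGET_ENERGY_KWH}
best = max(qualifying, key=lambda name: qualifying[name][1])
```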

Thermal management strategies were further investigated by testing the battery cabinet under different heating mode outlet temperatures (T1 values from 24°C to 32°C), with charge-discharge voltages fixed at 2.8V–3.6V. Results revealed that higher initial battery temperatures improved energy efficiency and discharge energy due to enhanced reaction kinetics, as described by the Arrhenius equation. However, excessive temperatures risked reducing cycle life; specifically, temperatures above 40°C can degrade LiFePO4 energy storage cells by promoting SEI film decomposition and electrode side reactions. Therefore, a T1 setting of 30°C was recommended to maintain maximum cell temperatures below 40°C, ensuring a balance between efficiency and longevity. This approach aligns with literature indicating that cycle life peaks around 40°C for LiFePO4 cells.
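
The Arrhenius dependence invoked above can be sketched numerically to show why a warmer initial cell temperature speeds reaction kinetics. The activation energy used here is an illustrative placeholder, not a fitted value for these cells:

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol*K)

def relative_rate(temp_c: float, ea_j_per_mol: float = 3.0e4) -> float:
    """Arrhenius factor exp(-Ea / (R*T)); the prefactor is omitted,
    so only ratios between temperatures are meaningful."""
    return math.exp(-ea_j_per_mol / (R_GAS * (temp_c + 273.15)))

# Kinetics speed up with temperature: the rate at 30 C exceeds that at 15 C.
speedup = relative_rate(30.0) / relative_rate(15.0)
```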

In summary, our study demonstrates that the energy efficiency of energy storage battery cabinets is significantly influenced by ambient temperature, charge-discharge voltage range, and thermal management strategies. The optimal multi-parameter strategy—comprising ambient temperature ≤50°C, voltage window of 2.8V–3.6V, and heating mode outlet temperature T1 = 30°C—achieved an energy efficiency of 95.29% in engineering tests. This integrated “thermal-electric” optimization framework provides a robust foundation for designing high-efficiency energy storage systems, supporting the integration of large-scale renewable energy into power grids. Future work could explore dynamic adjustments based on real-time operational data to further enhance the performance and lifespan of energy storage cells.
