In the contemporary energy landscape, the integration of renewable sources has necessitated robust and reliable energy storage solutions. Among these, lithium-ion battery energy storage systems have emerged as a dominant technology due to their high energy density, scalability, and declining costs. However, the operational safety and longevity of a battery energy storage system are intrinsically linked to its thermal management. The electrochemical processes within lithium-ion cells are highly temperature-sensitive. Elevated temperatures accelerate degradation mechanisms and pose safety risks, while low temperatures increase internal resistance, reduce efficiency, and can lead to lithium plating. Therefore, ensuring the thermal stability of a battery energy storage system across a wide range of environmental conditions is a critical engineering challenge. This study focuses on evaluating the thermal adaptability of a liquid-cooled battery pack, a core subunit within a larger battery energy storage system, under extreme temperature scenarios through a combined approach of numerical simulation and experimental validation.
The design and validation of thermal management systems for a battery energy storage system often rely on computational fluid dynamics and heat transfer simulations. These tools allow for the prediction of temperature distributions under various loads and environments before physical prototyping. However, the accuracy of such simulations hinges on the fidelity of the model and the correctness of input parameters, such as material properties and boundary conditions. In this work, we first establish a credible simulation model by correlating its predictions with empirical data from a controlled laboratory test at room temperature. Once validated, this model is employed to extrapolate the thermal behavior of the battery energy storage system pack to extreme environmental conditions—specifically, a high-temperature scenario of 45°C and a low-temperature scenario of -20°C. This methodology provides a cost-effective means to assess the resilience of the battery energy storage system’s thermal design without the prohibitive expense and logistical difficulty of full-scale environmental chamber testing.

The fundamental heat transfer processes within a battery energy storage system pack can be described by the energy conservation equation. For a control volume representing a battery cell, the governing equation is:
$$
\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T) + \dot{q}_{gen}
$$
where \( \rho \) is the density, \( c_p \) is the specific heat capacity, \( T \) is the temperature, \( t \) is time, \( k \) is the thermal conductivity tensor, and \( \dot{q}_{gen} \) is the volumetric heat generation rate. The heat generation within a lithium-ion cell during operation arises from irreversible (ohmic) and reversible (entropic) sources. A common simplified model for the total heat generation rate \( Q_{cell} \) is:
$$
Q_{cell} = I(V_{ocv} - V) + I T \frac{\partial V_{ocv}}{\partial T}
$$
Here, \( I \) is the current, \( V_{ocv} \) is the open-circuit voltage, \( V \) is the terminal voltage, and \( \frac{\partial V_{ocv}}{\partial T} \) is the entropy coefficient. For our simulation, we used an averaged constant heat generation value derived from experimental measurements during the final steady-state phase of discharge, which is a conservative approach for evaluating peak temperatures.
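To illustrate the magnitude implied by this model for the cells and operating rate considered below (Table 1), the short calculation sketch evaluates \( Q_{cell} \) for a 280 Ah cell at 0.5C. The overpotential and entropy coefficient are assumed values chosen only to show the order of magnitude; they are not measured parameters from this study.

```python
# Illustrative evaluation of Q_cell = I*(V_ocv - V) + I*T*dV_ocv/dT for one cell.
# The overpotential and entropy coefficient below are assumed values for
# demonstration; they are not measurements from this study.

capacity_ah = 280.0                    # nominal cell capacity (Ah)
c_rate = 0.5                           # operating C-rate
current = capacity_ah * c_rate         # per-cell current at 0.5C -> 140 A

overpotential = 0.09                   # assumed V_ocv - V near end of discharge (V)
entropy_coeff = -2e-5                  # assumed dV_ocv/dT (V/K)
T_cell = 308.0                         # assumed cell temperature (K), about 35 degC

q_irreversible = current * overpotential          # ohmic/polarization heating
q_reversible = current * T_cell * entropy_coeff   # entropic heating (sign can vary)
q_total = q_irreversible + q_reversible

print(f"Per-cell current at 0.5C: {current:.0f} A")
print(f"Irreversible heat: {q_irreversible:.1f} W")
print(f"Reversible heat:   {q_reversible:.1f} W")
print(f"Total heat:        {q_total:.1f} W")     # same order as the value used here
```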
The subject of this analysis is a commercial liquid-cooled battery pack designed for a large-scale battery energy storage system. The pack contains 48 prismatic lithium iron phosphate (LFP) cells arranged in a 4 series by 12 parallel configuration (4S12P). Each cell has a nominal capacity of 280 Ah. The pack employs a bottom-mounted liquid cooling plate. The cooling plate surface is coated with an insulating layer and bonded to the cells using thermally conductive adhesive. The busbars interconnect the cells and are also significant sources of joule heating. The primary cooling medium is a 50% ethylene glycol-water mixture. The key operational parameters for the baseline (room temperature) analysis are summarized in Table 1.
| Parameter | Value | Description |
|---|---|---|
| Cell Arrangement | 4S12P | Series-parallel configuration |
| Cell Nominal Capacity | 280 Ah | Per lithium iron phosphate cell |
| Charge/Discharge Rate (C-rate) | 0.5C | Standard operational cycle |
| Cell Heat Generation (Steady Discharge) | 12 W | Average per cell during final discharge phase |
| Busbar Current | 140 A | Current through interconnecting busbars |
| Coolant Flow Rate | 5 L/min | Through the liquid cooling plate |
| Coolant Inlet Temperature (Baseline) | 20 °C | Controlled by external chiller |
| Ambient Temperature (Baseline) | 25 °C | Room temperature condition |
| Ambient Convective Coefficient | 2-5 W/(m²·K) | Natural convection on pack surfaces |
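A simple energy balance on the coolant loop, sketched below, indicates the inlet-to-outlet temperature rise implied by the parameters in Tables 1 and 2. It assumes the full cell heat load is absorbed by the coolant and neglects busbar heating and exchange with the ambient, so it should be read as a sanity check rather than a simulation result.

```python
# Sanity check: expected coolant temperature rise across the cooling plate,
# assuming all cell heat is absorbed by the coolant (busbar heating and
# ambient exchange neglected). Property values are taken from Tables 1-2.

n_cells = 48
q_cell = 12.0                     # heat generation per cell (W, Table 1)
q_total = n_cells * q_cell        # ~576 W pack heat load

flow_lpm = 5.0                    # coolant flow rate (L/min, Table 1)
rho = 1071.1                      # coolant density (kg/m^3, Table 2)
cp = 3300.0                       # coolant specific heat (J/(kg.K), Table 2)

m_dot = flow_lpm / 1000.0 / 60.0 * rho      # mass flow rate (kg/s)
dT_coolant = q_total / (m_dot * cp)         # inlet-to-outlet temperature rise (K)

print(f"Mass flow rate: {m_dot:.3f} kg/s")
print(f"Coolant temperature rise: {dT_coolant:.2f} K")
```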
To perform the thermal simulation, a three-dimensional computational model of the battery energy storage system pack was created. Geometrically non-essential components such as wiring harnesses were omitted, and small fillets/chamfers were simplified to reduce mesh complexity without sacrificing accuracy for major heat paths. The model included detailed representations of cells, busbars, copper terminals, the liquid cooling plate, and the structural components. The material properties assigned to these components are critical inputs and are listed in Table 2. These values were sourced from material datasheets and literature, with some later adjusted within physically reasonable bounds during the model validation phase.
| Component | Thermal Conductivity (W/(m·K)) | Density (kg/m³) | Specific Heat Capacity (J/(kg·K)) | Notes |
|---|---|---|---|---|
| Lithium-ion Cell (LFP) | 20 (in-plane), 1.2 (through-plane) | 2800 | 1000 | Anisotropic; higher in-plane conductivity |
| Aluminum Busbar | 237 | 2702 | 903 | Primary electrical interconnect |
| Copper Terminal | 400 | 8960 | 385 | High conductivity for current collection |
| Aluminum Cooling Plate | 193 | 2800 | 880 | Extruded aluminum with internal channels |
| Coolant (50% EG-Water) | 0.384 | 1071.1 | 3300 | Temperature-dependent properties averaged |
| Thermal Interface Material (Adhesive) | 1.2 | 1900 | 270 | Fills gap between cell and cooling plate |
| Electrical Insulation Layer | 0.1 | 900 | 1000 | Thin coating on cooling plate |
| FR4 PCB (for BMS) | 0.3 | 1200 | 1100 | Printed circuit board material |
| Epoxy Resin (Structural) | 0.2 | 2000 | 1100 | Used in module framing |
| Foam (Spacer/Insulation) | 0.05 | 10 | 1300 | Very low density, high thermal resistance |
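The conduction path from the cell base through the adhesive and insulation layer to the cooling plate can be illustrated with a simple series-resistance estimate. In the sketch below, only the conductivities come from Table 2; the layer thicknesses and contact area are hypothetical assumptions, since they are not specified above.

```python
# Illustrative series thermal resistance of the cell-to-plate conduction path.
# Conductivities are from Table 2; the thicknesses and footprint area are
# hypothetical assumptions chosen only to show the calculation.

layers = [
    # (name, thermal conductivity W/(m.K), assumed thickness m)
    ("thermally conductive adhesive", 1.2, 1.0e-3),
    ("electrical insulation layer",   0.1, 0.2e-3),
]
area = 0.035  # assumed cell footprint on the cooling plate (m^2)

r_total = sum(t / (k * area) for _, k, t in layers)   # series conduction resistance (K/W)
dT_at_12W = 12.0 * r_total                            # temperature drop per cell at 12 W

print(f"Cell-to-plate interface resistance: {r_total * 1000:.0f} mK/W per cell")
print(f"Temperature drop across the interface at 12 W: {dT_at_12W:.2f} K")
```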
The boundary conditions for the simulation were set as follows. A constant volumetric heat source was applied to each cell domain, corresponding to the 12 W heat generation per cell. Joule heating in the busbars and copper terminals was modeled by assigning an electrical resistivity and solving a coupled electrical-thermal problem, where the heat generation rate \( \dot{q}_{joule} \) is given by:
$$
\dot{q}_{joule} = I^2 R_{effective}
$$
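For context, the sketch below applies this relation to a single aluminum busbar segment. The resistivity is a standard handbook value and the segment geometry is a hypothetical assumption; only the 140 A current is taken from Table 1.

```python
# Illustrative Joule heating estimate for one aluminum busbar segment.
# The cross-section and length are hypothetical assumptions; only the
# 140 A current comes from Table 1.

rho_al = 2.65e-8                  # electrical resistivity of aluminum (ohm.m), ~20 degC
length = 0.10                     # assumed busbar segment length (m)
width, thickness = 0.020, 0.003   # assumed cross-section (m)

area = width * thickness
resistance = rho_al * length / area      # ohmic resistance of the segment (ohm)
current = 140.0                          # busbar current (A, Table 1)
q_joule = current**2 * resistance        # I^2 * R heat generation (W)

print(f"Segment resistance: {resistance * 1e6:.1f} micro-ohm")
print(f"Joule heating: {q_joule:.2f} W per segment")
```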
The cooling plate was modeled with a fluid domain. A mass flow inlet boundary condition was set at the coolant inlet with a flow rate of 5 L/min and a fixed temperature of 20°C for the baseline case. The external surfaces of the battery energy storage system pack were subjected to a convective heat transfer boundary condition with the ambient air, using a heat transfer coefficient range of 2-5 W/(m²·K) to account for natural convection. Radiation effects were considered negligible for this analysis due to moderate temperature differences. The steady-state energy equation was solved using a finite volume method in a commercial CFD software suite until residuals converged below \(10^{-6}\).
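The finite volume approach itself can be illustrated with a minimal one-dimensional analogue: a heat-generating slab with a fixed coolant-side face temperature and convection to ambient on the opposite face. The sketch below assembles and solves the discrete equations directly; all numbers are illustrative and are not inputs to the pack model.

```python
import numpy as np

# Minimal 1D steady finite-volume sketch, analogous in spirit to the
# commercial solve described above: a heat-generating slab with a fixed
# coolant-side temperature on one face and convection to ambient on the
# other. All numbers are illustrative, not pack-model inputs.

n, L = 50, 0.10                        # control volumes, slab thickness (m)
dx = L / n
k = 1.2                                # through-plane conductivity (W/(m.K))
q_gen = 4.0e3                          # volumetric heat generation (W/m^3)
T_cool, T_amb, h = 20.0, 25.0, 5.0     # coolant face temp, ambient temp, HTC

A = np.zeros((n, n))
b = np.full(n, -q_gen * dx)            # source term moved to the right-hand side

for i in range(n):
    if i > 0:                          # conduction to left neighbour
        A[i, i - 1] += k / dx
        A[i, i] -= k / dx
    if i < n - 1:                      # conduction to right neighbour
        A[i, i + 1] += k / dx
        A[i, i] -= k / dx

# Left face: fixed coolant temperature through half a cell of conduction
A[0, 0] -= 2 * k / dx
b[0] -= 2 * k / dx * T_cool
# Right face: convection in series with half a cell of conduction
U_r = 1.0 / (dx / (2 * k) + 1.0 / h)
A[-1, -1] -= U_r
b[-1] -= U_r * T_amb

T = np.linalg.solve(A, b)
print(f"Peak temperature: {T.max():.2f} degC at x = {(np.argmax(T) + 0.5) * dx:.3f} m")
```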
The initial simulation under baseline conditions (25°C ambient) predicted the temperature distribution within the pack. The results indicated that the hottest regions were located on the longer, thinner copper terminals at the rear of the pack due to their higher electrical resistance. The volume-weighted average temperature of the cells was calculated, but for direct comparison with the battery management system (BMS) measurements, the surface temperatures of the busbars—where the BMS temperature sensors are typically attached—were extracted. The simulation predicted busbar surface temperatures ranging from 31.5°C to 35.6°C.
To validate the simulation model, an identical battery energy storage system pack was instrumented and tested under the same operational conditions in a climate-controlled laboratory. The ambient temperature was maintained at 25°C. Sixteen temperature sensors (PT1000 RTDs or NTC thermistors) were attached to the surfaces of various busbars throughout the pack, providing a representative sample. The pack underwent a full charge-discharge cycle at 0.5C rate, with a 30-minute rest period after charging. The coolant was supplied at a constant 20°C. The temperature data from the final 30 minutes of the discharge phase, when heat generation is relatively stable, was averaged for each sensor location. These experimental values were then compared to the corresponding nodal temperatures from the simulation. The comparison is presented in Table 3.
| Sensor Location (Representative) | Experimental Average Temperature (°C) | Simulation Prediction (°C) | Absolute Error (°C) |
|---|---|---|---|
| T1 (Positive Terminal) | 33.17 | 32.01 | 1.16 |
| T2 | 32.64 | 33.78 | 1.14 |
| T3 | 32.96 | 33.74 | 0.78 |
| T4 | 34.76 | 35.64 | 0.88 |
| T5 | 34.70 | 35.62 | 0.92 |
| T6 | 32.79 | 33.89 | 1.10 |
| T7 | 32.93 | 33.79 | 0.86 |
| T8 | 33.96 | 34.43 | 0.47 |
| T9 | 33.61 | 34.50 | 0.89 |
| T10 | 33.65 | 34.18 | 0.53 |
| T11 | 33.06 | 34.04 | 0.98 |
| T12 | 34.64 | 35.61 | 0.97 |
| T13 | 34.35 | 35.59 | 1.24 |
| T14 | 32.97 | 34.14 | 1.17 |
| T15 | 33.36 | 34.20 | 0.84 |
| T16 (Negative Terminal) | 32.25 | 31.53 | 0.72 |
The absolute errors were all within 1.24°C, with most under 1°C. Discrepancies at the main terminals (T1, T16) were attributed to simplifications in modeling the contact resistance at bolted joints and the presence of insulating sleeves. The close agreement confirmed that the simulation model, with its chosen parameters, accurately represented the thermal behavior of the physical battery energy storage system pack. This validated model thus serves as a reliable digital twin for investigating performance under untested, extreme conditions.
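The error statistics can be computed directly from Table 3; the short script below evaluates the mean, maximum, and root-mean-square deviations between the experimental and simulated busbar temperatures.

```python
# Error statistics for the model validation in Table 3.

experimental = [33.17, 32.64, 32.96, 34.76, 34.70, 32.79, 32.93, 33.96,
                33.61, 33.65, 33.06, 34.64, 34.35, 32.97, 33.36, 32.25]
simulated    = [32.01, 33.78, 33.74, 35.64, 35.62, 33.89, 33.79, 34.43,
                34.50, 34.18, 34.04, 35.61, 35.59, 34.14, 34.20, 31.53]

errors = [abs(e - s) for e, s in zip(experimental, simulated)]
mae = sum(errors) / len(errors)
rmse = (sum((e - s) ** 2 for e, s in zip(experimental, simulated)) / len(errors)) ** 0.5

print(f"Mean absolute error: {mae:.2f} degC")           # ~0.92 degC
print(f"Max absolute error:  {max(errors):.2f} degC")   # 1.24 degC (T13)
print(f"RMSE:                {rmse:.2f} degC")
```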
With confidence in the model, we proceeded to simulate two extreme environmental scenarios relevant to the global deployment of battery energy storage systems. Scenario 1 represents a harsh summer condition, such as in a desert or tropical region, where the ambient temperature inside a passively ventilated enclosure could reach 45°C. In this simulation, we kept all internal parameters identical to the baseline—the cells were discharging at 0.5C with 12 W heat generation each, and the coolant inlet temperature was still controlled to 20°C. The ambient temperature for all external convective boundaries was set to 45°C. Scenario 2 represents a severe winter condition, with an ambient temperature of -20°C. In this case, the battery energy storage system pack was assumed to be in a standby or storage mode; therefore, the cell heat generation was set to zero (\( \dot{q}_{gen} = 0 \)). The coolant system was considered inactive for heating, so the only heat exchange was between the pack and the cold environment. The governing equation for the low-temperature case simplifies as the system tends toward thermal equilibrium with the ambient:
$$
\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot (k \nabla T)
$$
with the initial condition being a uniform pack temperature equal to the pre-exposure ambient and boundary conditions representing heat loss to the -20°C environment. The transient solution indicates the minimum temperatures the cells would reach over the prolonged exposure period considered.
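A lumped-capacitance estimate offers a rough check on the cool-down time scale behind this result. In the sketch below, the pack thermal mass and the enclosure loss coefficient are assumptions for illustration only; the three-dimensional model above resolves the internal gradients that a single-node estimate cannot.

```python
import math

# Lumped-capacitance sketch of how quickly an idle pack cools toward a
# -20 degC ambient with no internal heat generation:
#   T(t) = T_amb + (T_0 - T_amb) * exp(-t / tau),  tau = m*cp / (U*A).
# The thermal mass and loss coefficient below are assumed values for
# illustration, not parameters of the validated model.

T_amb = -20.0                 # cold ambient (degC)
T_0 = 25.0                    # assumed initial pack temperature (degC)
m_cp = 48 * 5.6 * 1000.0      # assumed thermal mass: 48 cells x ~5.6 kg x ~1000 J/(kg.K)
UA = 2.0                      # assumed overall enclosure loss coefficient (W/K)

tau = m_cp / UA               # cool-down time constant (s)
t_to_0C = tau * math.log((T_0 - T_amb) / (0.0 - T_amb))   # time for lumped pack to hit 0 degC

print(f"Time constant: {tau / 3600:.1f} h")
print(f"Estimated time for the lumped pack to reach 0 degC: {t_to_0C / 3600:.1f} h")
```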
The results of these extreme condition simulations are summarized in Table 4. For the high-temperature scenario, the focus is on the maximum cell temperatures to assess the risk of entering a hazardous thermal state. For the low-temperature scenario, the minimum cell temperatures are critical to evaluate the risk of operation below the recommended lower limit, which for LFP chemistry is typically around 0°C.
| Environmental Scenario | Cell Operational State | Average Cell Temperature (°C) | Maximum Cell Temperature (°C) | Minimum Cell Temperature (°C) | Temperature Non-uniformity (ΔTmax-min, °C) |
|---|---|---|---|---|---|
| High-Temperature (45°C Ambient) | Discharging at 0.5C | 39.2 | 41.2 | 35.7 | 5.5 |
| Low-Temperature (-20°C Ambient) | Standby (No Load) | 7.8 | 11.2 | 3.7 | 7.5 |
The results indicate that the thermal management system of this particular battery energy storage system pack is effective in maintaining cell temperatures within a generally acceptable operating window even under these extremes. In the 45°C environment, the average cell temperature is 39.2°C, with a peak of 41.2°C. While this is within the common upper operating limit of 45-50°C for LFP cells, it is approaching the range where accelerated aging occurs. The temperature rise above ambient, \( \Delta T = T_{cell,avg} - T_{ambient} \), is approximately -5.8°C; it is negative because the active cooling (20°C coolant) is effectively removing heat. The cooling performance could also be quantified by a coefficient of performance (COP) for the thermal system, though it is not calculated here. In the -20°C environment, without any internal heat generation or active heating, the cells cool down significantly but do not reach sub-zero temperatures internally. The average cell temperature is 7.8°C, with the coldest spot at 3.7°C. This is above the freezing point of the electrolyte and avoids the severe performance penalties and lithium plating risks associated with below-zero operation. The temperature non-uniformity is higher in the low-temperature case (7.5°C vs. 5.5°C) because, without internal heat generation, the temperature distribution is governed solely by the geometry and insulation, leading to larger gradients from the pack’s core to its edges.
The analysis reveals that the design possesses inherent thermal resilience. However, for long-term deployment of a battery energy storage system in consistently extreme climates, further optimizations could be considered. For high-temperature regions, enhancing the cooling capacity could lower the peak cell temperature, thereby extending cycle life. This could involve increasing the coolant flow rate, lowering the coolant inlet temperature (if chiller capacity permits), or improving the thermal interface between the cell and cooling plate. The effectiveness of the cooling plate can be evaluated by its thermal resistance, \( R_{th} \), defined as:
$$
R_{th} = \frac{T_{cell,avg} - T_{coolant,in}}{Q_{total}}
$$
where \( Q_{total} \) is the total heat load from the pack. Reducing \( R_{th} \) through design changes would directly lower \( T_{cell,avg} \). For low-temperature regions, the primary concern is to ensure cells remain above a minimum temperature for safe charging. The simulation shows that in standby, the pack’s internal heat retention (due to insulation from structural materials and air gaps) keeps cells above 0°C even in -20°C ambient. However, if the battery energy storage system needs to operate (charge/discharge) at such low ambient temperatures, an active heating strategy would be mandatory. Pre-heating using the coolant loop with an integrated heater before charging is a common solution. The required heating power \( P_{heat} \) to maintain a target cell temperature \( T_{target} \) can be estimated from a steady-state heat loss calculation:
$$
P_{heat} = U A (T_{target} - T_{ambient})
$$
where \( U \) is the overall heat transfer coefficient of the pack enclosure, and \( A \) is the effective surface area.
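The two expressions above can be evaluated roughly with the figures already reported. In the sketch below, the thermal resistance uses the 45°C-scenario result from Table 4 and neglects busbar heating and heat gained from the hot ambient, while the overall heat transfer coefficient and enclosure area for the heating-power estimate are hypothetical assumptions.

```python
# Rough estimates of the two quantities defined above, using values from
# Tables 1 and 4 where available. Busbar heating and heat gained from the
# hot ambient are neglected in R_th; U and A for the heating-power estimate
# are hypothetical assumptions.

# Cooling-plate thermal resistance in the 45 degC scenario
T_cell_avg = 39.2        # average cell temperature (degC, Table 4)
T_coolant_in = 20.0      # coolant inlet temperature (degC, Table 1)
Q_total = 48 * 12.0      # cell heat load only (W)
R_th = (T_cell_avg - T_coolant_in) / Q_total
print(f"Approximate pack-level thermal resistance: {R_th * 1000:.0f} mK/W")

# Heating power to hold a target temperature in a -20 degC ambient
U = 3.0                  # assumed overall heat transfer coefficient (W/(m^2.K))
A = 2.5                  # assumed effective enclosure surface area (m^2)
T_target, T_ambient = 10.0, -20.0
P_heat = U * A * (T_target - T_ambient)
print(f"Steady-state heating power estimate: {P_heat:.0f} W")
```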
This study underscores the utility of high-fidelity thermal simulation as a powerful tool in the design and validation cycle of a battery energy storage system. By first calibrating the model against empirical data, we create a reliable predictive tool. This tool can then be used to explore “what-if” scenarios—like extreme weather conditions—that are difficult, expensive, or time-consuming to test physically. This approach is invaluable for ensuring the reliability and safety of battery energy storage systems destined for diverse global markets. Future work could involve transient analysis simulating daily temperature cycles, integrating battery electro-thermal-aging models to quantify lifetime impact, and exploring the system-level thermal interactions between multiple packs in a full containerized battery energy storage system.
In conclusion, the thermal management system of the investigated liquid-cooled battery pack, a fundamental building block of a larger battery energy storage system, demonstrates satisfactory adaptability to both high-temperature (45°C) and low-temperature (-20°C) extreme environments. The validated numerical model predicts that during operation in high heat, cell temperatures remain below critical limits, primarily due to the efficacy of the active liquid cooling. In extreme cold with no load, the pack’s inherent thermal inertia and insulation prevent the cells from falling below 0°C, though active heating would be required for operation. These findings provide confidence in the deployment of such battery energy storage system technology across a wide climatic range. Moreover, the methodology presented—combining targeted experimentation with comprehensive simulation—offers a robust framework for the thermal design and evaluation of future battery energy storage systems, enabling the development of safer, longer-lasting, and more geographically versatile energy storage solutions.
