The global transition towards renewable energy sources has elevated the importance of large-scale energy storage solutions. Among these, Lithium-ion Battery Energy Storage Systems (BESS) are paramount due to their high energy density, rapid response capabilities, and modular scalability. However, a critical challenge impeding their economic viability and widespread adoption is the discrepancy between actual and theoretical energy efficiency. Operational energy losses in practical BESS installations can range from 10% to 20%, significantly impacting the levelized cost of storage. This efficiency gap is predominantly influenced by a complex interplay of operational parameters, including the battery’s State-of-Charge (SOC) voltage window, thermal management strategy, and ambient environmental conditions. While existing models offer insights, they often rely on steady-state laboratory conditions, failing to capture the dynamic, synergistic effects present in real-world grid applications. Therefore, this study aims to investigate and optimize the energy efficiency of a commercial, liquid-cooled battery energy storage system cabinet by comprehensively analyzing the coupled effects of ambient temperature, charge-discharge voltage limits, and active thermal management protocols. We establish a multi-parameter optimization framework to derive a synergistic control strategy that balances high energy efficiency with long-term cycle life, providing actionable engineering guidance for the design and operation of high-performance, grid-scale battery energy storage systems.

1. Introduction and Problem Statement
The integration of intermittent renewable sources like solar and wind into the power grid necessitates robust energy storage for load shifting, frequency regulation, and capacity firming. The battery energy storage system, particularly those based on Lithium Iron Phosphate (LFP) chemistry, has become the technology of choice for many stationary storage applications. The overall performance and economic return of a battery energy storage system are fundamentally governed by two key metrics: Round-Trip Efficiency (RTE) and cycle life. RTE, defined as the ratio of energy discharged to energy charged over a complete cycle, directly affects operational revenue. Cycle life determines the system’s longevity and capital amortization.
Extensive research has identified several dominant loss mechanisms within a battery energy storage system. Internally, energy losses manifest as ohmic losses (I²R), polarization losses (activation and concentration overpotentials), and coulombic inefficiencies from parasitic side reactions. Externally, losses occur in power conversion systems (PCS) and auxiliary loads like thermal management. Our focus is on the battery pack itself, where operational parameters can dramatically influence these internal loss pathways. For instance, operating at extreme states of charge increases polarization. Similarly, temperature profoundly affects ionic conductivity, charge transfer kinetics, and the rate of degradation reactions. An optimal thermal management strategy for a battery energy storage system must therefore maintain an ideal temperature window that minimizes instantaneous losses while mitigating long-term degradation.
Previous studies often isolate these variables. For example, research shows narrowing the voltage window improves longevity but reduces usable energy, while low temperatures increase internal resistance. However, a holistic model that dynamically co-optimizes voltage limits with thermal setpoints under varying ambient conditions is lacking. This study bridges that gap by conducting controlled experiments on a full-scale battery energy storage system cabinet to quantify the individual and combined effects of these factors on system-level RTE.
2. Methodology and Experimental Framework
We conducted experiments on a 372.736 kWh, 1500V DC outdoor liquid-cooled battery energy storage system cabinet. The cabinet integrates eight battery modules (1P52S configuration using 280Ah LFP cells) and a high-voltage box, forming a total string configuration of 1P416S. The nominal cell voltage is 3.2V, with an absolute operating range of 2.6V to 3.6V. The cabinet was connected to a 187 kW Power Conversion System (PCS) for testing. A key component of this battery energy storage system is its active liquid cooling/heating thermal management system, whose logic is central to our investigation.
2.1. Energy Efficiency and Internal Resistance Test Protocols
The core test followed a standardized procedure to measure initial charge/discharge energy and calculate RTE. The battery energy storage system cabinet was placed inside a 30m³ environmental chamber to control ambient temperature (Tamb).
Procedure for Round-Trip Efficiency (RTE):
1. Condition the cabinet at the target Tamb ± 2°C for 5 hours.
2. Perform a full charge at constant power (Prc = 186.368 kW) until any cell reaches the upper voltage limit (U1). Rest for 10 min.
3. Perform a full discharge at constant power (Prd = 186.368 kW) until any cell reaches the lower voltage limit (U2). Rest for 10 min. (Steps 2-3 condition the pack to a known state.)
4. Repeat the full charge (Step 2). Record total charge energy (Ech).
5. Repeat the full discharge (Step 3). Record total discharge energy (Edis).
The DC-DC round-trip efficiency (η) of the battery energy storage system cabinet is calculated as:
$$ \eta = \frac{E_{dis}}{E_{ch}} \times 100\% $$
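As a minimal illustration, the RTE calculation from the recorded energies can be sketched as follows (the function name is ours, not from any BMS API; the example numbers are the T1 = 30°C heating scheme reported in Section 3.3):

```python
def round_trip_efficiency(e_charge_kwh: float, e_discharge_kwh: float) -> float:
    """DC-DC round-trip efficiency in percent: eta = E_dis / E_ch * 100."""
    if e_charge_kwh <= 0:
        raise ValueError("charge energy must be positive")
    return e_discharge_kwh / e_charge_kwh * 100.0

# Charge/discharge energies from the T1 = 30 C heating scheme (Section 3.3):
eta = round_trip_efficiency(390.52, 372.11)
print(f"RTE = {eta:.2f}%")  # 372.11 / 390.52 * 100 ≈ 95.29%
```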
Procedure for DC Internal Resistance (DCIR): DCIR was measured at 50% SOC using a current pulse method. After stabilizing at 50% SOC, a 10-second charge or discharge pulse at 1C rate (280A) was applied. The resistance was calculated from the instantaneous voltage change (ΔV) divided by the current (I).
$$ R_{dc,ch} = \frac{V_{after,pulse} - V_{before,pulse}}{I_{pulse}}, \quad R_{dc,dis} = \frac{V_{before,pulse} - V_{after,pulse}}{I_{pulse}} $$
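The pulse-resistance calculation can be expressed directly; the voltages below are hypothetical cell-level readings chosen for illustration, not measurements from this study:

```python
def dcir_milliohm(v_before: float, v_after: float, i_pulse_a: float, charge: bool) -> float:
    """DC internal resistance from a 10 s current pulse, in milliohms.

    Charge pulse:    R = (V_after - V_before) / I
    Discharge pulse: R = (V_before - V_after) / I
    """
    dv = (v_after - v_before) if charge else (v_before - v_after)
    return dv / i_pulse_a * 1000.0

# Hypothetical cell voltages around a 1C (280 A) charge pulse at 50% SOC:
r_ch = dcir_milliohm(3.291, 3.440, 280.0, charge=True)
print(f"Charge DCIR = {r_ch:.3f} mOhm")
```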
2.2. Investigated Parameters and Thermal Management Logic
Three primary factors were varied systematically:
- Ambient Temperature (Tamb): Tested at 5°C, 15°C, 25°C, 35°C, 45°C, 50°C, 55°C, and 60°C, with the thermal management system disabled to observe the cells' native thermal response.
- Charge-Discharge Voltage Window (U1, U2): Four windows were tested: 2.6V-3.6V, 2.7V-3.6V, 2.8V-3.6V, and 2.85V-3.6V. The upper limit was fixed at 3.6V, while the lower limit was varied.
- Thermal Management Strategy: The logic for the liquid cooling system is detailed in Table 1. Crucially, the heating mode outlet water temperature (T1) was varied across five schemes (32°C, 30°C, 28°C, 26°C, 24°C) to assess its impact on efficiency and maximum cell temperature.
Table 1. Control logic of the liquid cooling/heating thermal management system.

| Mode | Activation Logic | Deactivation Logic | BMS Command |
|---|---|---|---|
| Cooling | Max cell temp (Tmax) ≥ 25°C AND avg. cell temp (Tavg) ≥ 24°C | Tmax < 22°C AND Tavg < 21°C | Set chiller outlet to 30°C. |
| Heating | Min cell temp (Tmin) < 14°C | Tmin ≥ 21°C | Set heater outlet to T1 (varied). |
| Heating | 14°C ≤ Tmin < 17°C | Tmin ≥ 21°C | Set heater outlet to T2 (24°C). |
| Heating | 17°C ≤ Tmin < 20°C | Tmin ≥ 21°C | Set heater outlet to T3 (22°C). |
| Self-Circulation | Cooling/heating off AND ΔTcell (max−min) ≥ 5°C | — | Pump only. |
| Standby | None of the above conditions met | — | System off. |
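The mode-selection logic of Table 1 can be sketched as a decision function. This is our reading of the table, with the previous on/off state carried in explicitly so the activation/deactivation hysteresis bands are respected; real BMS firmware would add fault handling and ramp limits:

```python
def select_mode(t_max, t_avg, t_min, cooling_on, heating_on, t1=30.0):
    """Return (mode, outlet_setpoint_c) per the Table 1 logic.

    cooling_on / heating_on are the previous cycle's states, needed
    because activation and deactivation thresholds differ (hysteresis).
    """
    # Cooling: on at Tmax >= 25 AND Tavg >= 24; off only below 22 / 21.
    if cooling_on:
        cooling_on = not (t_max < 22.0 and t_avg < 21.0)
    else:
        cooling_on = t_max >= 25.0 and t_avg >= 24.0
    if cooling_on:
        return "cooling", 30.0

    # Heating: on below 20 C with a banded setpoint; off at Tmin >= 21.
    if heating_on:
        heating_on = t_min < 21.0
    else:
        heating_on = t_min < 20.0
    if heating_on:
        if t_min < 14.0:
            return "heating", t1      # T1 (varied across schemes)
        elif t_min < 17.0:
            return "heating", 24.0    # T2
        else:
            return "heating", 22.0    # T3

    # Neither active: circulate coolant if the cell spread is large.
    if t_max - t_min >= 5.0:
        return "self-circulation", None
    return "standby", None
```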
3. Results and Discussion
3.1. Impact of Ambient Temperature on Battery Energy Storage System Efficiency
With the thermal system off, the battery energy storage system cabinet’s intrinsic efficiency was measured across a 5°C to 60°C ambient range. The results, plotted alongside DC internal resistance, reveal a strong non-linear relationship.
Table 2. RTE and DCIR of the cabinet across ambient temperatures (thermal management disabled).

| Ambient Temp. Tamb (°C) | RTE η (%) | Charge DCIR (mΩ) | Discharge DCIR (mΩ) |
|---|---|---|---|
| 5 | 93.15 | 0.685 | 0.702 |
| 15 | 94.82 | 0.598 | 0.611 |
| 25 | 95.52 | 0.532 | 0.543 |
| 35 | 95.88 | 0.485 | 0.495 |
| 45 | 96.05 | 0.451 | 0.461 |
| 50 | 96.10 | 0.440 | 0.449 |
| 55 | 95.92 | 0.445 | 0.456 |
| 60 | 95.65 | 0.455 | 0.467 |
The data shows that RTE increases with temperature up to an optimum point near 50°C, after which it declines. Concurrently, DCIR decreases to a minimum around 50°C before increasing again. This phenomenon is explained by competing electrochemical processes. In the range of 5°C to 50°C, increased temperature enhances ionic conductivity in the electrolyte and accelerates charge-transfer kinetics at the electrodes, reducing polarization overpotentials and ohmic losses. The relationship can be modeled by a polynomial regression:
$$ \eta(T) = 0.86129 + 0.00779T - 2.91487 \times 10^{-4}T^2 + 8.03825 \times 10^{-6}T^3 - 1.63113 \times 10^{-7}T^4 + 2.12115 \times 10^{-9}T^5 - 1.25571 \times 10^{-11}T^6 $$
with a high coefficient of determination (R² = 0.996). Above 50°C, the increase in DCIR and drop in efficiency are attributed to accelerated degradation mechanisms. Elevated temperatures promote the decomposition of the Solid Electrolyte Interphase (SEI), catalyze parasitic reactions between the electrolyte and electrodes, and cause electrolyte loss. These processes increase interfacial impedance and reduce coulombic efficiency, manifesting as a rise in DCIR and a net decrease in energy efficiency. This underscores a critical constraint for any battery energy storage system: while moderate heating improves performance, sustained operation above a cell-specific threshold (≈50°C for this LFP chemistry) is detrimental.
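The fitted model η(T) can be evaluated directly with the coefficients reported above; a sketch (note that, like any regression, the fit tracks the measured points in Table 2 only approximately, especially near the edges of the tested range):

```python
import numpy as np

# eta(T) polynomial fit from Section 3.1: T in deg C, eta as a fraction.
# Coefficients listed in ascending order of power, as reported in the text.
COEFFS_ASC = [0.86129, 0.00779, -2.91487e-4, 8.03825e-6,
              -1.63113e-7, 2.12115e-9, -1.25571e-11]

def eta_model(t_c: float) -> float:
    """Evaluate the 6th-order RTE fit at ambient temperature t_c."""
    # np.polyval expects coefficients in descending order of power.
    return float(np.polyval(COEFFS_ASC[::-1], t_c))

print(f"Predicted RTE at 25 C: {eta_model(25.0) * 100:.2f}%")
```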
3.2. Impact of Voltage Window on Battery Energy Storage System Performance
Fixing the upper voltage limit at 3.6V (to avoid Li-plating risks), we investigated the effect of the lower discharge cutoff voltage (U2) on both RTE and total discharge energy. The results present a fundamental trade-off.
Table 3. Effect of the lower cutoff voltage on RTE and discharge energy (upper limit fixed at 3.6V).

| Voltage Window (V) | RTE η (%) | Discharge Energy Edis (kWh) |
|---|---|---|
| 2.85 – 3.60 | 95.85 | 372.80 |
| 2.80 – 3.60 | 95.24 | 373.64 |
| 2.70 – 3.60 | 94.31 | 375.76 |
| 2.60 – 3.60 | 93.36 | 376.40 |
The data reveals an inverse relationship: a wider voltage window (lower U2) increases the usable discharge energy but reduces the round-trip efficiency. The drop in efficiency is primarily due to increased polarization losses at low voltages. As the cell discharges deeply, the lithium concentration in the LiFePO4 cathode increases, leading to stronger Li⁺-Li⁺ repulsion forces within the solid phase. This significantly hampers solid-state diffusion, increasing concentration polarization and the effective internal resistance. The energy lost as heat during this high-polarization discharge phase is substantial. Therefore, selecting the operational voltage window for a battery energy storage system is not merely about accessing the full capacity; it is a critical optimization between energy throughput (kWh delivered) and efficiency (percentage of energy retained). For this specific system, targeting an RTE > 95% and discharge energy > 372.7 kWh, the 2.8V-3.6V window represents the optimal compromise.
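The window selection described above can be made explicit as a constrained choice over the measured data. A sketch, using the Section 3.2 measurements and the stated targets (RTE > 95%, Edis > 372.7 kWh); among feasible windows we maximize discharge energy, which reproduces the 2.8V choice:

```python
# Measured (lower cutoff U2 in V, RTE %, discharge energy kWh) from Section 3.2.
WINDOWS = [
    (2.85, 95.85, 372.80),
    (2.80, 95.24, 373.64),
    (2.70, 94.31, 375.76),
    (2.60, 93.36, 376.40),
]

def pick_window(min_rte=95.0, min_edis=372.7):
    """Among windows meeting both targets, maximize discharge energy."""
    feasible = [w for w in WINDOWS if w[1] > min_rte and w[2] > min_edis]
    if not feasible:
        raise ValueError("no voltage window satisfies both targets")
    return max(feasible, key=lambda w: w[2])

u2, rte, edis = pick_window()
print(f"Selected window: {u2:.2f} V - 3.60 V (RTE {rte}%, {edis} kWh)")
```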
3.3. Synergistic Optimization via Thermal Management
The active thermal management system is the actuator that regulates the cell temperature, linking ambient conditions to the desired electrochemical state. We evaluated five different heating setpoints (T1) while operating the battery energy storage system cabinet within the optimal 2.8V-3.6V window. The results, shown in Table 4, demonstrate the nuanced role of thermal control.
Table 4. Effect of the heating setpoint T1 on energy, RTE, and maximum cell temperature (2.8V-3.6V window).

| Heating Scheme (T1 °C) | Charge Energy (kWh) | Discharge Energy (kWh) | RTE η (%) | Max Cell Temp. Tmax (°C) |
|---|---|---|---|---|
| 32 | 389.95 | 371.65 | 95.31 | 43.2 |
| 30 | 390.52 | 372.11 | 95.29 | 39.8 |
| 28 | 390.98 | 372.23 | 95.21 | 37.1 |
| 26 | 391.25 | 371.89 | 95.04 | 34.5 |
| 24 | 391.60 | 371.45 | 94.86 | 31.9 |
A higher heating setpoint T1 warms the cells more aggressively at the beginning of the cycle. According to the Arrhenius equation, this elevates the reaction rate constant, reducing initial internal resistance and improving efficiency slightly, as seen by the higher RTE for T1=30°C and 32°C. However, this initial boost comes with a significant risk: the maximum cell temperature (Tmax) during operation also rises. For T1=32°C, Tmax exceeded 40°C. Literature consistently indicates that while LFP cycle life improves with temperature up to approximately 40°C, operation above this threshold accelerates degradation due to the mechanisms described earlier (SEI growth, electrolyte oxidation).
Thus, the thermal management strategy must solve a dual-objective optimization: maximize instantaneous efficiency (favors warmer cells) and maximize cycle life (favors Tmax ≤ 40°C). From our data, the heating setpoint T1 = 30°C achieves this balance. It provides sufficient pre-heating to maintain high RTE (95.29%) while ensuring Tmax (39.8°C) remains below the critical 40°C degradation threshold. This finding is crucial for the energy management system (EMS) software controlling a battery energy storage system; the thermal setpoints must be dynamically adjustable, not just for safety, but as a core parameter for lifetime-economic optimization.
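The dual-objective selection described above reduces, over the Table 4 data, to maximizing RTE subject to the Tmax ≤ 40°C cycle-life constraint; a sketch:

```python
# (T1 setpoint C, RTE %, max cell temp C) from Table 4.
SCHEMES = [(32, 95.31, 43.2), (30, 95.29, 39.8), (28, 95.21, 37.1),
           (26, 95.04, 34.5), (24, 94.86, 31.9)]

def pick_heating_setpoint(tmax_limit=40.0):
    """Maximize RTE subject to the Tmax degradation constraint."""
    feasible = [s for s in SCHEMES if s[2] <= tmax_limit]
    if not feasible:
        raise ValueError("no scheme keeps Tmax within the limit")
    return max(feasible, key=lambda s: s[1])

t1, rte, tmax = pick_heating_setpoint()
print(f"T1 = {t1} C: RTE {rte}%, Tmax {tmax} C")  # T1 = 30 wins: 32 C violates Tmax
```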
4. Integrated Multi-Parameter Optimization Strategy
Based on the experimental findings, we propose a synergistic control strategy for maximizing the performance of a grid-connected battery energy storage system. This strategy dynamically coordinates electrical and thermal parameters:
- Voltage Window Management: Operate the LFP cells within a 2.8V to 3.6V window under normal conditions. This provides an optimal balance of high round-trip efficiency (>95%) and sufficient discharge energy. The window can be temporarily narrowed for system balancing or if cell degradation mandates it.
- Ambient Temperature Awareness & Thermal Control: The battery energy storage system controller should use the established polynomial model η(T) to predict efficiency based on ambient forecasts. The active thermal management system should be engaged proactively.
- When Tamb is low (<15°C), the heating mode with T1 = 30°C should be used to raise cell temperatures into the high-efficiency zone.
- When Tamb is moderate (15-30°C), self-circulation or minimal cooling is likely sufficient to maintain cells near 25-35°C.
- When Tamb is high (>30°C), aggressive cooling is required to prevent cells from exceeding 40°C, even if this incurs some auxiliary power consumption. The cooling setpoint should be tuned to maintain Tmax ≤ 40°C.
- Lifetime-Efficiency Co-optimization: The primary control law for the thermal system should be to maintain the maximum cell temperature between 25°C and 40°C, with 30-35°C as a sweet spot. The secondary objective is temperature uniformity (min ΔT). This approach directly links the thermal management strategy to the long-term financial model of the battery energy storage system asset.
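The ambient-aware rules above can be sketched as a simple supervisory policy. The band boundaries follow the bullets; the return structure and the cooling setpoint handling are our illustrative choices, not a specified EMS interface:

```python
def ambient_thermal_policy(t_amb_c: float) -> dict:
    """Ambient-aware thermal mode per the strategy above (boundaries from the text)."""
    if t_amb_c < 15.0:
        # Pre-heat cells into the high-efficiency zone.
        return {"mode": "heat", "setpoint_c": 30.0}
    if t_amb_c <= 30.0:
        # Self-circulation is typically sufficient in the moderate band.
        return {"mode": "self-circulate", "setpoint_c": None}
    # Aggressive cooling; setpoint tuned in practice to keep Tmax <= 40 C.
    return {"mode": "cool", "setpoint_c": 30.0}

for t in (5.0, 22.0, 38.0):
    print(t, ambient_thermal_policy(t))
```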
The net effect of implementing this multi-parameter strategy (ambient-aware operation, a 2.8V-3.6V voltage window, and a heating setpoint of T1 = 30°C that keeps Tmax below 40°C) is a demonstrated system-level RTE of 95.29% under test conditions, a significant improvement over broader voltage windows or non-optimized thermal control.
5. Conclusion and Future Outlook
This study provides a comprehensive, experimental analysis of the key factors governing the energy efficiency of a commercial liquid-cooled battery energy storage system. We have quantitatively demonstrated that:
- Ambient temperature has a non-linear, concave relationship with RTE, with an optimum near 50°C for the LFP cells tested, beyond which degradation-induced losses dominate.
- The selection of the operational voltage window involves a critical trade-off between energy throughput and efficiency. A narrower window (e.g., 2.8V-3.6V) optimizes for high RTE and is preferable for daily cycling applications where efficiency revenue is significant.
- The thermal management strategy is not merely a safety subsystem but a primary efficiency and lifetime control lever. An optimal heating setpoint (T1 = 30°C for this system) can pre-condition cells to reduce losses while strictly enforcing an upper temperature limit (40°C) to preserve cycle life.
The proposed integrated optimization framework synthesizes these insights into actionable control parameters for battery energy storage system operators and designers. Future work should focus on implementing this strategy in a real-time, adaptive battery management system (BMS) and energy management system (EMS) that considers aging effects. As the battery energy storage system degrades, the optimal voltage window and thermal setpoints may shift. Developing machine learning models to dynamically update these parameters based on real-time impedance measurements and historical data will be the next frontier in maximizing the lifetime value of grid-scale battery energy storage systems.
