Abstract
The state of charge (SOC) of an electrochemical energy storage battery is the ratio of its remaining usable capacity to its total capacity, expressed as a percentage, and accurate SOC estimation is essential for the safe operation of energy storage stations. An energy storage system comprises many individual battery cells that inevitably differ from one another, so traditional methods developed for single cells often struggle to estimate the SOC of every cell in a battery pack accurately. This paper proposes a memory feature fusion network for SOC estimation that jointly considers individual cell memory and cluster features. Experiments on a real-world energy storage dataset show that the proposed method significantly improves the accuracy of SOC estimation for individual cells within an energy storage cluster.

1. Introduction
Energy storage batteries play a vital role in modern power systems, enabling the efficient utilization of renewable energy sources and enhancing the stability of the grid. However, accurately estimating the SOC of energy storage batteries, especially in large-scale battery clusters, remains a challenging problem. This paper addresses this challenge by proposing a novel method that integrates individual cell memory and cluster features for fine-grained SOC estimation.
2. Literature Review
Previous research on SOC estimation has primarily focused on individual battery cells. Methods such as the coulomb counting method, Kalman filter-based methods, and data-driven methods like support vector regression (SVR), multilayer perceptron (MLP), and long short-term memory (LSTM) networks have been widely used. However, these methods often struggle to accurately estimate the SOC of all cells in a battery pack due to variations among individual cells and the collective charging and discharging processes.
3. Proposed Method
To address the limitations of previous methods, this paper proposes a memory feature fusion and SOC estimation method for energy storage battery clusters. The overall network model consists of five main components: an individual cell shared encoder, a cluster encoder, a memory module, a feature fusion layer, and an SOC estimator.
3.1 Network Architecture
Table 1: Network Architecture Overview
| Component | Layer | Structure/Parameters | Output Shape |
| --- | --- | --- | --- |
| Data Input Module | Individual Input Layer | Unfold Layer [16, 224, 32*3] | – |
| | Cluster Expert Knowledge Layer | Expert Knowledge Layer [16, 32, 11] | [16, 32*11] |
| Individual Cell Shared Encoder | Residual MLP Layer 1 | Dropout = 0.1, Hidden Layer Size = 96 | [16, 224, 96] |
| | Fully Connected Layer | Mapping Nodes 96*32 | [16, 224, 32] |
| Cluster Encoder | Residual MLP Layer 1 | Dropout = 0.1, Hidden Layer Size = 96 | [16, 96] |
| | Residual MLP Layer 2 | Dropout = 0.1, Hidden Layer Size = 96 | [16, 96] |
| | Fully Connected Layer | Mapping Nodes 96*32 | [16, 32] |
| Memory Module | Memory Read & Update Layer | Item Length = 32 | [16, 224, 32] |
| Feature Fusion Layer | Cluster Attention | Mapping Nodes 32*16 | [16, 224, 16] |
| SOC Estimator | Fully Connected Layer | Mapping Nodes 32*16 | [16, 224, 16] |
| | Activation Function | ReLU | [16, 224, 16] |
| | Fully Connected Layer | Mapping Nodes 32*1 | [16, 224, 1] |
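To make the data flow in Table 1 concrete, the following PyTorch-style skeleton sketches one possible forward pass. The module internals (number of memory items, the simplified fusion step, the non-residual encoders) are assumptions for illustration; only the tensor shapes mirror the table (batch 16, 224 cells, feature width 32).

```python
import torch
import torch.nn as nn

class CMCFSketch(nn.Module):
    """Shape-level sketch of the pipeline in Table 1 (not the authors' code).

    The residual connections of the MLP layers and the exact fusion step
    are simplified; submodule internals are placeholders.
    """
    def __init__(self, n_mem=64, feat_in=32 * 3, cluster_in=32 * 11,
                 hidden=96, d_model=32, d_fused=16):
        super().__init__()
        # Individual cell shared encoder (applied to every cell).
        self.cell_encoder = nn.Sequential(
            nn.Linear(feat_in, hidden), nn.ReLU(), nn.Dropout(0.1),
            nn.Linear(hidden, d_model))
        # Cluster encoder for the hand-crafted pack-level features.
        self.cluster_encoder = nn.Sequential(
            nn.Linear(cluster_in, hidden), nn.ReLU(), nn.Dropout(0.1),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(0.1),
            nn.Linear(hidden, d_model))
        # Learnable memory bank; the number of items (n_mem) is assumed.
        self.memory = nn.Parameter(torch.randn(n_mem, d_model))
        # Simplified fusion and SOC head (see Sections 3.3-3.5 for details).
        self.fuse = nn.Linear(d_model, d_fused)
        self.head = nn.Sequential(nn.Linear(d_fused, d_fused), nn.ReLU(),
                                  nn.Linear(d_fused, 1))

    def forward(self, x_cell, x_cluster):
        z_cell = self.cell_encoder(x_cell)           # [16, 224, 32]
        z_cluster = self.cluster_encoder(x_cluster)  # [16, 32]
        # Memory read: attention weights over the memory items.
        w = torch.softmax(z_cell @ self.memory.t(), dim=-1)
        z_mem = w @ self.memory                      # [16, 224, 32]
        # Placeholder fusion (stands in for the cluster attention of Section 3.4).
        fused = self.fuse(z_cell + z_mem + z_cluster.unsqueeze(1))  # [16, 224, 16]
        return self.head(fused).squeeze(-1)          # [16, 224]

model = CMCFSketch()
soc = model(torch.randn(16, 224, 32 * 3), torch.randn(16, 32 * 11))
print(soc.shape)  # torch.Size([16, 224])
```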
3.2 Feature Engineering
The input features for the network include both individual cell features and cluster features. Individual cell features are obtained from voltage, temperature, and shared current data points. Cluster features are constructed using expert knowledge, including statistics such as mean, variance, skewness, peak-to-peak value, and kurtosis based on battery pack operation measurement data.
Table 2: Cluster Hand-crafted Features Table
| Constructed Feature | Calculation Formula |
| --- | --- |
| Mean | $\frac{1}{N}\sum_{i=1}^{N} x_i$ |
| Variance | $\frac{1}{N}\sum_{i=1}^{N} (x_i - \mathrm{mean})^2$ |
| Skewness | $\frac{1}{N}\sum_{i=1}^{N} \left(\frac{x_i - \mathrm{mean}}{\mathrm{std}}\right)^3$ |
| Peak-to-Peak Value | $\max(x_i) - \min(x_i)$ |
| Kurtosis | $\frac{1}{N}\sum_{i=1}^{N} \left(\frac{x_i - \mathrm{mean}}{\mathrm{std}}\right)^4 - 3$ |
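As a minimal illustration, the cluster statistics of Table 2 could be computed as follows; the use of NumPy and the assumption that `x` holds one measurement channel across the pack at a single time step are ours, not the paper's.

```python
import numpy as np

def cluster_features(x):
    """Compute the hand-crafted cluster statistics of Table 2.

    x: array of shape (N,) holding one measurement (e.g. the 224 cell
       voltages of the pack) at a single time step.
    Returns mean, variance, skewness, peak-to-peak value and kurtosis.
    """
    mean = x.mean()
    var = x.var()                                    # (1/N) * sum((x - mean)^2)
    std = x.std()
    skew = np.mean(((x - mean) / std) ** 3)
    ptp = x.max() - x.min()
    kurt = np.mean(((x - mean) / std) ** 4) - 3.0    # excess kurtosis
    return np.array([mean, var, skew, ptp, kurt])

# Example: statistics over 224 cell voltages at one sampling instant.
voltages = np.random.normal(3.3, 0.01, size=224)
print(cluster_features(voltages))
```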
3.3 Memory Module
The memory module is introduced to remember key individual cell features, enabling the network to capture both typical battery operation patterns and cell-to-cell differences. Memory features are obtained by querying the memory module with the encoded individual cell features, and the memory items themselves are updated during training.
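A hedged sketch of such a memory read is given below, assuming cosine-similarity addressing over a bank of learnable items; the paper does not spell out the exact addressing scheme, and the number of memory items here is illustrative.

```python
import torch
import torch.nn.functional as F

def memory_read(query, memory):
    """Read memory features for each cell (an assumed addressing scheme).

    query:  [B, C, D] encoded individual-cell features
    memory: [M, D]    learnable memory items (Item Length D = 32 in Table 1)
    """
    # Cosine similarity between each query and every memory item,
    # turned into soft addressing weights.
    sim = F.normalize(query, dim=-1) @ F.normalize(memory, dim=-1).t()  # [B, C, M]
    w = torch.softmax(sim, dim=-1)
    # Retrieved memory features are weighted sums of the memory items.
    return w @ memory                                                   # [B, C, D]

# The memory bank is registered as a learnable parameter so that its items
# are updated during training; 64 items is an illustrative choice.
memory = torch.nn.Parameter(torch.randn(64, 32))
mem_features = memory_read(torch.randn(16, 224, 32), memory)
print(mem_features.shape)  # torch.Size([16, 224, 32])
```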
3.4 Feature Fusion
The cluster attention mechanism is proposed for feature fusion. It combines cluster features, memory features, and individual cell features to obtain final fused features for each cell. The cluster attention mechanism borrows from the self-attention structure, using linear mappings and attention weights to fuse the features.
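The sketch below illustrates one way such a cluster attention could be realized, with the cluster features acting as the query over the per-cell individual and memory features; the query/key/value roles and dimensions are our assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class ClusterAttention(nn.Module):
    """Sketch of a cluster-attention fusion (one reading of Section 3.4)."""
    def __init__(self, d_model=32, d_fused=16):
        super().__init__()
        # Linear mappings, as in a standard self-attention block.
        self.q = nn.Linear(d_model, d_fused)
        self.k = nn.Linear(d_model, d_fused)
        self.v = nn.Linear(d_model, d_fused)

    def forward(self, cell, mem, cluster):
        # cell, mem: [B, C, D]; cluster: [B, D]
        q = self.q(cluster).unsqueeze(1)             # [B, 1, d]
        src = torch.stack([cell, mem], dim=2)        # [B, C, 2, D]
        k, v = self.k(src), self.v(src)              # [B, C, 2, d]
        # Attention of the cluster query over the two per-cell feature sources.
        scores = (k * q.unsqueeze(2)).sum(-1) / k.shape[-1] ** 0.5  # [B, C, 2]
        w = torch.softmax(scores, dim=-1).unsqueeze(-1)
        # Fused per-cell features as a weighted sum of the sources.
        return (w * v).sum(dim=2)                    # [B, C, d]

fuse = ClusterAttention()
out = fuse(torch.randn(16, 224, 32), torch.randn(16, 224, 32), torch.randn(16, 32))
print(out.shape)  # torch.Size([16, 224, 16])
```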
3.5 Model Training
The network is trained end-to-end using the mean squared error (MSE) loss between the estimated SOC and the true SOC values. Additionally, losses for memory feature compactness and separation are included to ensure the diversity and typicality of memory features.
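A sketch of such a combined objective is shown below. The compactness and separation terms follow a common memory-network recipe (pull each query toward its nearest memory item, keep the second-nearest item at a margin); the paper's exact formulation and loss weights may differ.

```python
import torch
import torch.nn.functional as F

def cmcf_loss(soc_pred, soc_true, queries, memory,
              lambda_c=0.1, lambda_s=0.1, margin=1.0):
    """Sketch of the combined objective of Section 3.5 (weights are assumed).

    soc_pred, soc_true: [B, C]    estimated and reference SOC per cell
    queries:            [B, C, D] encoded cell features used to read memory
    memory:             [M, D]    memory items
    """
    # Main regression objective.
    mse = F.mse_loss(soc_pred, soc_true)

    # Distance from every query to every memory item.
    q = queries.reshape(-1, queries.shape[-1])   # [B*C, D]
    d = torch.cdist(q, memory)                   # [B*C, M]
    d_sorted, _ = d.sort(dim=1)
    # Compactness: each query should sit close to its nearest item.
    compact = d_sorted[:, 0].pow(2).mean()
    # Separation: the second-nearest item should stay at least `margin` away.
    separate = F.relu(margin + d_sorted[:, 0] - d_sorted[:, 1]).mean()

    return mse + lambda_c * compact + lambda_s * separate
```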
4. Experimental Results
4.1 Dataset and Evaluation Criteria
The proposed method is validated using a real-world lithium iron phosphate energy storage battery dataset. The dataset consists of data from a battery cluster with 224 cells in series, each with voltage, temperature, and shared current data points, collected at a one-minute interval. Five rounds of charging data are used for training, and one round for testing.
The evaluation criteria include mean squared error (MSE), mean absolute percentage error (MAPE), maximum error (Max), and fast charging cell error (Fast), which represent the overall, relative, worst-case, and fast-charging cell performance of the method, respectively.
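For reference, the four criteria could be computed as follows; the interpretation of "Fast" as the MSE restricted to the fast-charging cells is our assumption.

```python
import numpy as np

def evaluation_metrics(soc_pred, soc_true, fast_cells):
    """Evaluation criteria of Section 4.1 (our interpretation).

    soc_pred, soc_true: arrays of shape [T, C] (time steps x cells),
                        with soc_true assumed nonzero for MAPE.
    fast_cells: indices of the fast-charging cells.
    """
    err = soc_pred - soc_true
    mse = np.mean(err ** 2)
    mape = np.mean(np.abs(err / soc_true)) * 100.0   # in percent
    max_err = np.max(np.abs(err))
    fast = np.mean(err[:, fast_cells] ** 2)          # MSE over fast-charging cells
    return {"MSE": mse, "MAPE": mape, "Max": max_err, "Fast": fast}
```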
Table 3: Comparison of CMCF Method with Other Methods
| Method | MSE (×10⁻⁴) | MAPE (%) | Max (×10⁻²) | Fast (×10⁻⁴) |
| --- | --- | --- | --- | --- |
| SVR | High Value | High Value | High Value | High Value |
| MLP | High Value | High Value | High Value | High Value |
| LSTM | High Value | High Value | High Value | High Value |
| CMCF (Ours) | 1.52 | 1.72 | 3.70 | 1.87 |
4.2 Ablation Study
To further validate the effectiveness of the proposed method, an ablation study is conducted. The results show that each component of the method contributes to improving the SOC estimation accuracy. In particular, the cluster features play a crucial role, as removing them leads to a significant decrease in performance.
Table 4: Ablation Experiment Comparison Results
| Method | MSE (×10⁻⁴) | MAPE (%) | Max (×10⁻²) | Fast (×10⁻⁴) |
| --- | --- | --- | --- | --- |
| w/o CF | 2.99 | 7.81 | 9.41 | 3.88 |
| w/o Memory | 2.36 | 2.16 | 4.86 | 2.97 |
| w/o Cluster Attention | 3.36 | 2.62 | 3.88 | 3.82 |
| w/o CF & Memory | 5.94 | 3.13 | 2.96 | 1.38 |
| CMCF (Full) | 1.52 | 1.72 | 3.70 | 1.87 |
5. Conclusion
This paper presented a memory feature fusion and SOC estimation method (CMCF) for energy storage battery clusters. By integrating individual cell memory and cluster features, the method achieves fine-grained SOC estimation for every cell in the cluster, as verified on a real-world energy storage dataset.