State of Health Estimation of Batteries Using a Time-Informed Dynamic Sequence-Inverted Transformer
- URL: http://arxiv.org/abs/2507.18320v1
- Date: Thu, 24 Jul 2025 11:43:46 GMT
- Title: State of Health Estimation of Batteries Using a Time-Informed Dynamic Sequence-Inverted Transformer
- Authors: Janak M. Patel, Milad Ramezankhani, Anirudh Deodhar, Dagnachew Birru
- Abstract summary: Batteries play a central role in the efficiency and safety of energy storage systems, yet they inevitably degrade over time due to repeated charge-discharge cycles. Accurate estimation of a battery's State of Health (SoH) is therefore essential for ensuring operational reliability and safety. We propose a novel architecture: the Time-Informed Dynamic Sequence-Inverted Transformer (TIDSIT). Experimental results on the NASA battery degradation dataset show that TIDSIT significantly outperforms existing models, achieving over a 50% reduction in prediction error and maintaining an SoH prediction error below 0.58%.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The rapid adoption of battery-powered vehicles and energy storage systems over the past decade has made battery health monitoring increasingly critical. Batteries play a central role in the efficiency and safety of these systems, yet they inevitably degrade over time due to repeated charge-discharge cycles. This degradation leads to reduced energy efficiency and potential overheating, posing significant safety concerns. Accurate estimation of a battery's State of Health (SoH) is therefore essential for ensuring operational reliability and safety. Several machine learning architectures, such as LSTMs, transformers, and encoder-based models, have been proposed to estimate SoH from discharge cycle data. However, these models struggle with the irregularities inherent in real-world measurements: discharge readings are often recorded at non-uniform intervals, and the lengths of discharge cycles vary significantly. To address this, most existing approaches extract features from the sequences rather than processing them in full, which introduces information loss and compromises accuracy. To overcome these challenges, we propose a novel architecture: the Time-Informed Dynamic Sequence-Inverted Transformer (TIDSIT). TIDSIT incorporates continuous time embeddings to effectively represent irregularly sampled data and utilizes padded sequences with temporal attention mechanisms to manage variable-length inputs without discarding sequence information. Experimental results on the NASA battery degradation dataset show that TIDSIT significantly outperforms existing models, achieving over a 50% reduction in prediction error and maintaining an SoH prediction error below 0.58%. Furthermore, the architecture is generalizable and holds promise for broader applications in health monitoring tasks involving irregular time-series data.
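The abstract names two concrete mechanisms: continuous time embeddings for irregularly sampled readings, and temporal attention over padded sequences for variable-length cycles. The sketch below illustrates both ideas in generic PyTorch; the Time2Vec-style formulation and all names (`ContinuousTimeEmbedding`, `temporal_attention`) are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class ContinuousTimeEmbedding(nn.Module):
    """Time2Vec-style embedding: one linear term plus learned sinusoidal
    terms, so irregular timestamps map to smooth vector representations."""
    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(1, d_model)   # learned frequencies and phases

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, seq_len) raw, possibly non-uniform timestamps
        z = self.proj(t.unsqueeze(-1))      # (batch, seq_len, d_model)
        return torch.cat([z[..., :1], torch.sin(z[..., 1:])], dim=-1)

def temporal_attention(q, k, v, pad_mask):
    """Scaled dot-product attention that ignores padded steps, so
    variable-length discharge cycles share one batch without truncation."""
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    scores = scores.masked_fill(pad_mask.unsqueeze(-2), float("-inf"))
    return scores.softmax(dim=-1) @ v

# usage on one padded cycle: timestamps 0.0, 1.3, 4.7 plus one pad step
t = torch.tensor([[0.0, 1.3, 4.7, 0.0]])
x = ContinuousTimeEmbedding(8)(t)                  # (1, 4, 8)
pad = torch.tensor([[False, False, False, True]])  # True marks padding
out = temporal_attention(x, x, x, pad)
```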
Related papers
- BatteryBERT for Realistic Battery Fault Detection Using Point-Masked Signal Modeling [1.7397173676239939]
We propose a novel framework that adapts BERT-style pretraining for battery fault detection. We extend the standard BERT architecture with a customized time-series-to-token representation module and a point-level Masked Signal Modeling (point-MSM) pretraining task. This approach enables self-supervised learning on sequential current, voltage, and other charge-discharge cycle data.
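As a rough illustration of what a point-level masking objective looks like, the snippet below hides individual time steps of a signal and returns the values an encoder would be trained to reconstruct; the mask ratio and the zero stand-in for a mask token are assumptions, since the abstract does not specify them.

```python
import numpy as np

def point_msm_batch(signal, mask_ratio=0.15, seed=0):
    """Point-level masked signal modeling, sketched: hide individual time
    points of a (seq_len, channels) current/voltage record and keep their
    true values as reconstruction targets for a BERT-style encoder."""
    rng = np.random.default_rng(seed)
    hide = rng.random(signal.shape[0]) < mask_ratio  # per point, not per patch
    corrupted = signal.copy()
    corrupted[hide] = 0.0                            # stand-in for a mask token
    return corrupted, signal[hide], hide             # inputs, targets, positions
```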
arXiv Detail & Related papers (2025-05-31T06:06:08Z)
- Learning to fuse: dynamic integration of multi-source data for accurate battery lifespan prediction [0.0]
This study presents a hybrid learning framework for precise battery lifespan prediction. It integrates dynamic multi-source data fusion with a stacked ensemble (SE) modeling approach. It achieves a mean absolute error (MAE) of 0.0058, a root mean square error (RMSE) of 0.0092, and a coefficient of determination (R²) of 0.9839.
arXiv Detail & Related papers (2025-04-25T10:24:45Z)
- Powerformer: A Transformer with Weighted Causal Attention for Time-series Forecasting [50.298817606660826]
We introduce Powerformer, a novel Transformer variant that replaces noncausal attention weights with causal weights that are reweighted according to a smooth heavy-tailed decay. Our empirical results demonstrate that Powerformer achieves state-of-the-art accuracy on public time-series benchmarks. Our analyses show that the model's locality bias is amplified during training, demonstrating an interplay between time-series data and power-law-based attention.
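The described reweighting can be sketched directly: multiplying attention weights by a power-law decay in lag is equivalent to adding its logarithm to the pre-softmax scores. The decay form (1 + lag)^(-alpha) below is one plausible heavy-tailed choice, not necessarily the paper's exact parameterization.

```python
import torch

def powerlaw_causal_attention(scores, alpha=1.0):
    """Causal attention logits reweighted by a smooth heavy-tailed decay:
    a key lagging the query by L steps is scaled by (1 + L)^(-alpha)."""
    T = scores.shape[-1]
    i = torch.arange(T).unsqueeze(1)               # query positions, (T, 1)
    j = torch.arange(T).unsqueeze(0)               # key positions,   (1, T)
    lag = (i - j).clamp(min=0).float()
    bias = torch.log((1.0 + lag) ** -alpha)        # add decay in log space
    bias = bias.masked_fill(j > i, float("-inf"))  # enforce causality
    return (scores + bias).softmax(dim=-1)
```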
arXiv Detail & Related papers (2025-02-10T04:42:11Z)
- Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching [46.58170057001437]
We introduce the Rough Transformer, a variation of the Transformer model that operates on continuous-time representations of input sequences. We find that, on a variety of time-series-related tasks, Rough Transformers consistently outperform their vanilla attention counterparts.
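For intuition, a path signature is a sequence of iterated integrals of the input path, and it is invariant to sampling rate, which is what makes it attractive for irregularly sampled series. The snippet below computes a depth-2 signature of a discrete path; applying it over local windows ("signature patching") is an assumption about how the patching step works.

```python
import numpy as np

def signature_depth2(path):
    """Depth-2 path signature of a (T, d) path: level 1 is the total
    increment; level 2 discretizes the iterated integrals."""
    inc = np.diff(path, axis=0)            # (T-1, d) increments dX
    level1 = inc.sum(axis=0)               # (d,) total displacement
    before = np.cumsum(inc, axis=0) - inc  # X_t - X_0 just before each step
    level2 = before.T @ inc                # (d, d): sum of (X_t - X_0) dX_t
    return np.concatenate([level1, level2.ravel()])
```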
arXiv Detail & Related papers (2024-05-31T14:00:44Z)
- Rough Transformers for Continuous and Efficient Time-Series Modelling [46.58170057001437]
Time-series data in real-world medical settings typically exhibit long-range dependencies and are observed at non-uniform intervals.
We introduce the Rough Transformer, a variation of the Transformer model which operates on continuous-time representations of input sequences.
We find that Rough Transformers consistently outperform their vanilla attention counterparts while obtaining the benefits of Neural ODE-based models.
arXiv Detail & Related papers (2024-03-15T13:29:45Z)
- Remaining useful life prediction of Lithium-ion batteries using spatio-temporal multimodal attention networks [4.249657064343807]
Lithium-ion batteries are widely used in various applications, including electric vehicles and renewable energy storage.
The prediction of the remaining useful life (RUL) of batteries is crucial for ensuring reliable and efficient operation.
This paper proposes a two-stage RUL prediction scheme for Lithium-ion batteries using a spatio-temporal multimodal attention network (ST-MAN).
arXiv Detail & Related papers (2023-10-29T07:32:32Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, though their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock market involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules: Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA).
Our proposed model exhibits reduced computational complexity and performance comparable to or better than existing methods.
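The abstract gives no internals for GSA or CCA, so the sketch below shows only the generic idea behind grouped attention: restricting attention to contiguous groups shrinks the quadratic score matrix by the group count. Treat it as a loose illustration, not the paper's actual modules.

```python
import torch

def grouped_self_attention(x, n_groups):
    """Illustrative grouped attention: attend only within contiguous
    groups, cutting the O(L^2) cost roughly by a factor of n_groups.
    Assumes seq_len is divisible by n_groups."""
    B, L, D = x.shape
    g = x.view(B, n_groups, L // n_groups, D)      # (B, G, L/G, D)
    scores = g @ g.transpose(-2, -1) / D ** 0.5    # (B, G, L/G, L/G)
    return (scores.softmax(dim=-1) @ g).reshape(B, L, D)
```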
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- Dynaformer: A Deep Learning Model for Ageing-aware Battery Discharge Prediction [2.670887944566458]
We introduce a novel Transformer-based deep learning architecture which is able to simultaneously infer the ageing state and predict discharge behavior from a limited number of voltage/current samples.
Our experiments show that the trained model is effective for input current profiles of different complexities and is robust to a wide range of degradation levels.
arXiv Detail & Related papers (2022-06-01T15:31:06Z)
- Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting [86.33543833145457]
We propose Non-stationary Transformers as a generic framework with two interdependent modules: Series Stationarization and De-stationary Attention.
Our framework consistently boosts mainstream Transformers by a large margin, reducing MSE by 49.43% on Transformer, 47.34% on Informer, and 46.89% on Reformer.
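Series Stationarization is, at its core, per-instance normalization with the statistics saved for de-normalizing the forecast; the companion De-stationary Attention module, which re-injects those statistics into attention, is omitted from this minimal sketch.

```python
import torch

def stationarize(x, eps=1e-5):
    """Normalize each input series by its own mean/std before the
    Transformer and return the statistics needed to restore the scale."""
    mu = x.mean(dim=1, keepdim=True)          # x: (batch, seq_len, vars)
    sigma = x.std(dim=1, keepdim=True) + eps
    return (x - mu) / sigma, mu, sigma

# forecast = model(x_norm) * sigma + mu      # de-stationarize the output
```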
arXiv Detail & Related papers (2022-05-28T12:27:27Z)
- Lithium-ion Battery State of Health Estimation based on Cycle Synchronization using Dynamic Time Warping [13.19976118887128]
State of health (SOH) estimation plays an essential role in battery-powered applications to avoid unexpected breakdowns due to battery capacity fading.
This paper proposes an innovative cycle synchronization method that changes the existing coordinate system using dynamic time warping.
By exploiting the time information of the time series, the proposed method embeds the time index and the original measurements into a novel indicator to reflect the battery degradation status.
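The alignment tool named in the title is standard dynamic time warping; the textbook recurrence below computes the minimal cumulative alignment cost between two cycles of different lengths. The paper's novel degradation indicator built on top of the alignment is not reproduced here.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping between two 1-D discharge-cycle
    series: minimum cumulative cost over all monotone alignments."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[-1, -1]
```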
arXiv Detail & Related papers (2021-09-28T02:53:54Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
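The forced, ensembled variant the paper proposes builds on plain dynamic mode decomposition, which fits a low-rank linear operator to snapshot pairs; the sketch below shows only that DMD core, without the stochastic forcing or ensembling.

```python
import numpy as np

def exact_dmd(X, Y, rank):
    """Plain dynamic mode decomposition: fit a rank-r linear operator A
    with Y ≈ A X, where column k of Y is the snapshot following column k
    of X. Returns the eigenvalues and DMD modes of A."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ (Y @ Vh.conj().T / s)   # A projected on POD modes
    eigvals, W = np.linalg.eig(A_tilde)
    modes = (Y @ Vh.conj().T / s) @ W              # exact DMD modes
    return eigvals, modes
```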
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Universal Battery Performance and Degradation Model for Electric Aircraft [52.77024349608834]
Design, analysis, and operation of electric vertical takeoff and landing aircraft (eVTOLs) require fast and accurate prediction of Li-ion battery performance.
We generate a battery performance and thermal behavior dataset specific to eVTOL duty cycles.
We use this dataset to develop a battery performance and degradation model (Cellfit) which employs physics-informed machine learning.
arXiv Detail & Related papers (2020-07-06T16:10:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.