DeepKoopFormer: A Koopman Enhanced Transformer Based Architecture for Time Series Forecasting
- URL: http://arxiv.org/abs/2508.02616v1
- Date: Mon, 04 Aug 2025 17:05:55 GMT
- Title: DeepKoopFormer: A Koopman Enhanced Transformer Based Architecture for Time Series Forecasting
- Authors: Ali Forootani, Mohammad Khosravi, Masoud Barati
- Abstract summary: Time series forecasting plays a vital role across scientific, industrial, and environmental domains. DeepKoopFormer is a principled forecasting framework that combines the representational power of Transformers with the theoretical rigor of Koopman operator theory. Our model features a modular encoder-propagator-decoder structure, where temporal dynamics are learned via a spectrally constrained, linear Koopman operator in a latent space.
- Score: 0.9217021281095907
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Time series forecasting plays a vital role across scientific, industrial, and environmental domains, especially when dealing with high-dimensional and nonlinear systems. While Transformer-based models have recently achieved state-of-the-art performance in long-range forecasting, they often suffer from interpretability issues and instability in the presence of noise or dynamical uncertainty. In this work, we propose DeepKoopFormer, a principled forecasting framework that combines the representational power of Transformers with the theoretical rigor of Koopman operator theory. Our model features a modular encoder-propagator-decoder structure, where temporal dynamics are learned via a spectrally constrained, linear Koopman operator in a latent space. We impose structural guarantees, such as a bounded spectral radius, Lyapunov-based energy regularization, and orthogonal parameterization, to ensure stability and interpretability. Comprehensive evaluations are conducted on synthetic dynamical systems, a real-world climate dataset (wind speed and surface pressure), financial time series (cryptocurrency), and an electricity generation dataset, using a Python package prepared for this purpose. Across all experiments, DeepKoopFormer consistently outperforms standard LSTM and baseline Transformer models in terms of accuracy, robustness to noise, and long-term forecasting stability. These results establish DeepKoopFormer as a flexible, interpretable, and robust framework for forecasting in high-dimensional and dynamical settings.
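The stability mechanism described in the abstract can be illustrated with a minimal numerical sketch: if the latent propagator is parameterized as a scaled orthogonal matrix, its spectral radius is bounded strictly below one, so linear latent rollouts cannot diverge over long horizons. The QR-based construction and function names below are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def spectrally_constrained_koopman(raw: np.ndarray, gamma: float = 0.99) -> np.ndarray:
    """Build a latent propagator K = gamma * Q, where Q is the orthogonal
    factor of raw's QR decomposition. Q has all eigenvalues on the unit
    circle, so the spectral radius of K equals gamma < 1, guaranteeing
    that the linear latent dynamics z_{t+1} = K z_t stay bounded.
    (Hypothetical construction for illustration only.)"""
    q, _ = np.linalg.qr(raw)
    return gamma * q

def rollout(K: np.ndarray, z0: np.ndarray, steps: int) -> np.ndarray:
    """Iterate the linear latent dynamics for `steps` steps."""
    zs = [z0]
    for _ in range(steps):
        zs.append(K @ zs[-1])
    return np.stack(zs)

rng = np.random.default_rng(0)
K = spectrally_constrained_koopman(rng.standard_normal((8, 8)))
rho = max(abs(np.linalg.eigvals(K)))   # spectral radius, exactly gamma
traj = rollout(K, rng.standard_normal(8), 200)
print(round(float(rho), 2), bool(np.isfinite(traj).all()))  # → 0.99 True
```

Because the spectral radius is pinned at gamma regardless of the raw parameter values, training can adjust the orthogonal factor freely without ever producing an unstable propagator.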
Related papers
- Transformer with Koopman-Enhanced Graph Convolutional Network for Spatiotemporal Dynamics Forecasting [12.301897782320967]
TK-GCN is a two-stage framework that integrates geometry-aware spatial encoding with long-range temporal modeling. We show that TK-GCN consistently delivers superior predictive accuracy across a range of forecast horizons.
arXiv Detail & Related papers (2025-07-05T01:26:03Z)
- Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [55.09326865401653]
We introduce FNF as the backbone and DBD as architecture to provide excellent learning capabilities and optimal learning pathways for spatial-temporal modeling. We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z)
- Synthetic Time Series Forecasting with Transformer Architectures: Extensive Simulation Benchmarks [1.03590082373586]
Time series forecasting plays a critical role in domains such as energy, finance, and healthcare. Autoformer, Informer, and PatchTST are each evaluated through three architectural variants. A Koopman-enhanced Transformer framework, Deep Koopformer, integrates operator-theoretic latent state modeling.
arXiv Detail & Related papers (2025-05-26T14:34:05Z)
- PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
Self-attention mechanism in Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating PRE with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
arXiv Detail & Related papers (2024-08-20T01:56:07Z)
- Temporally Consistent Koopman Autoencoders for Forecasting Dynamical Systems [38.36312939874359]
We introduce the temporally consistent Koopman autoencoder (tcKAE). tcKAE generates accurate long-term predictions even with limited and noisy training data. We empirically demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors [85.22004745984253]
Real-world time series are characterized by intrinsic non-stationarity that poses a principal challenge for deep forecasting models.
We tackle non-stationary time series with modern Koopman theory that fundamentally considers the underlying time-variant dynamics.
We propose Koopa as a novel Koopman forecaster composed of stackable blocks that learn hierarchical dynamics.
arXiv Detail & Related papers (2023-05-30T07:40:27Z)
- An Interpretable Approach to Load Profile Forecasting in Power Grids using Galerkin-Approximated Koopman Pseudospectra [0.3160121582090025]
This paper presents an interpretable machine learning approach that characterizes dynamics within an operator-theoretic framework for electricity forecasting in power grids. We represent the dynamics of load data using the Koopman operator, which provides a linear, infinite-dimensional representation of the nonlinear dynamics. Our approach captures temporal coherent patterns due to seasonal changes and finer time scales, such as time of day and day of the week.
arXiv Detail & Related papers (2023-04-16T16:56:52Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, though their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System [46.39662315849883]
We introduce CLMFormer, a novel framework that mitigates redundancy through curriculum learning and a memory-driven decoder. CLMFormer consistently improves Transformer-based models by up to 30%, demonstrating its effectiveness in long-horizon forecasting.
arXiv Detail & Related papers (2022-07-16T04:05:15Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
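The forward-backward consistency idea mentioned in the entry above can be sketched numerically: a forward latent operator C and a backward operator D are trained jointly, and a penalty encourages D to invert C, tying the two directions of the dynamics together. The penalty below is a hypothetical simplification for illustration; the paper's actual loss and operator structure may differ.

```python
import numpy as np

def consistency_penalty(C: np.ndarray, D: np.ndarray) -> float:
    """Hypothetical consistency penalty: how far D C is from the identity,
    measured by a normalized squared Frobenius norm. Zero iff D inverts C."""
    k = C.shape[0]
    return float(np.linalg.norm(D @ C - np.eye(k), "fro") ** 2 / k)

rng = np.random.default_rng(1)
C = np.eye(4) + 0.01 * rng.standard_normal((4, 4))  # near-identity forward operator
pen_exact = consistency_penalty(C, np.linalg.inv(C))  # exact inverse
pen_random = consistency_penalty(C, rng.standard_normal((4, 4)))  # unrelated operator
print(pen_exact < 1e-12)  # → True: exact inverse incurs (numerically) zero penalty
```

Minimizing such a term during training pushes the learned backward dynamics toward the inverse of the forward dynamics, which is the sense in which the two are kept consistent.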
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.