KODA: A Data-Driven Recursive Model for Time Series Forecasting and Data Assimilation using Koopman Operators
- URL: http://arxiv.org/abs/2409.19518v1
- Date: Sun, 29 Sep 2024 02:25:48 GMT
- Title: KODA: A Data-Driven Recursive Model for Time Series Forecasting and Data Assimilation using Koopman Operators
- Authors: Ashutosh Singh, Ashish Singh, Tales Imbiriba, Deniz Erdogmus, Ricardo Borsoi
- Abstract summary: We propose a Koopman operator-based approach that integrates forecasting and data assimilation in nonlinear dynamical systems.
In particular, we use a Fourier-domain filter to disentangle the data into a physical component, whose dynamics can be accurately represented by a Koopman operator, and a residual component capturing local, time-varying behavior.
We show that KODA outperforms existing state-of-the-art methods on multiple time series benchmarks.
- Score: 14.429071321401953
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Approaches based on Koopman operators have shown great promise in forecasting time series data generated by complex nonlinear dynamical systems (NLDS). Although such approaches can capture the latent state representation of an NLDS, they still face difficulty in long-term forecasting when applied to real-world data. Specifically, many real-world NLDS exhibit time-varying behavior, leading to nonstationarity that is hard to capture with such models. Furthermore, they lack a systematic data-driven approach to perform data assimilation, that is, exploiting noisy measurements on the fly in the forecasting task. To alleviate these issues, we propose a Koopman operator-based approach (named KODA - Koopman Operator with Data Assimilation) that integrates forecasting and data assimilation in NLDS. In particular, we use a Fourier-domain filter to disentangle the data into a physical component, whose dynamics can be accurately represented by a Koopman operator, and residual dynamics that represent the local or time-varying behavior and are captured by a flexible and learnable recursive model. We carefully design an architecture and training criterion that ensure this decomposition leads to stable and long-term forecasts. Moreover, we introduce a course-correction strategy to perform data assimilation with new measurements at inference time. The proposed approach is completely data-driven and can be learned end-to-end. Through extensive experimental comparisons, we show that KODA outperforms existing state-of-the-art methods on multiple time series benchmarks, such as electricity, temperature, weather, Lorenz 63, and the Duffing oscillator, demonstrating its superior performance and efficacy on three tasks: a) forecasting, b) data assimilation, and c) state prediction.
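The Fourier-domain disentanglement at the heart of KODA can be illustrated with a minimal sketch. The signal, the fixed frequency cutoff, and the `keep=8` choice below are illustrative assumptions; the paper's filter and recursive residual model are learned end-to-end.

```python
import numpy as np

def spectral_split(x, keep=8):
    """Keep the `keep` strongest frequency bins as the 'physical'
    component; everything else becomes the residual component.
    A hand-rolled stand-in for KODA's learned Fourier-domain filter."""
    X = np.fft.rfft(x)
    mask = np.zeros_like(X)
    strongest = np.argsort(np.abs(X))[-keep:]
    mask[strongest] = X[strongest]
    physical = np.fft.irfft(mask, n=len(x))
    return physical, x - physical

# Toy signal: periodic "physical" dynamics plus a slow drift.
t = np.linspace(0, 20, 1000)
x = np.sin(t) + 0.3 * np.sin(3 * t) + 0.05 * t
phys, resid = spectral_split(x)
assert np.allclose(phys + resid, x)  # exact decomposition
```

In KODA the physical part would then be propagated by the Koopman operator and the residual by the recursive model, with their sum giving the forecast.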
Related papers
- Koopman AutoEncoder via Singular Value Decomposition for Data-Driven Long-Term Prediction [31.853422606200382]
Controlling eigenvalues is challenging due to high computational complexity and difficulties in managing them during the training process.
We propose leveraging the singular value decomposition (SVD) of the Koopman matrix to adjust the singular values for better long-term prediction.
Experimental results demonstrate that, during training, the loss term for singular values effectively brings the eigenvalues close to the unit circle, and the proposed approach outperforms existing baseline methods for long-term prediction tasks.
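The stability intuition behind controlling the Koopman matrix's spectrum can be sketched with a simple post-hoc proxy: clipping singular values at 1 guarantees repeated application of the operator cannot diverge. The matrix and the hard clipping rule below are illustrative assumptions; the paper instead shapes the singular values with a loss term during training.

```python
import numpy as np

def stabilize_koopman(K, max_sv=1.0):
    """Clip the singular values of a learned Koopman matrix at 1 so
    that repeated application (long-horizon rollout) cannot blow up.
    A crude proxy for SVD-based singular-value regularization."""
    U, s, Vt = np.linalg.svd(K)
    return U @ np.diag(np.minimum(s, max_sv)) @ Vt

K = np.array([[1.2, 0.1],   # toy operator with a singular value > 1
              [0.0, 0.9]])
K_stable = stabilize_koopman(K)
assert np.linalg.norm(K_stable, 2) <= 1.0 + 1e-9  # spectral norm clipped
```

With the spectral norm at most 1, the rollout `K_stable @ K_stable @ ... @ x0` stays bounded for any initial latent state.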
arXiv Detail & Related papers (2024-08-21T03:15:37Z) - Physics-guided Active Sample Reweighting for Urban Flow Prediction [75.24539704456791]
Urban flow prediction is a spatio-temporal modeling task that estimates the throughput of transportation services such as buses, taxis, and ride-hailing vehicles.
Some recent prediction solutions bring remedies through the notion of physics-guided machine learning (PGML).
We develop a physics-guided network (PN) and propose a data-aware framework, Physics-guided Active Sample Reweighting (P-GASR).
arXiv Detail & Related papers (2024-07-18T15:44:23Z) - Temporally-Consistent Koopman Autoencoders for Forecasting Dynamical Systems [42.6886113798806]
We introduce the Temporally-Consistent Koopman Autoencoder (tcKAE)
tcKAE generates accurate long-term predictions even with constrained and noisy training data.
We demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Time-series Generation by Contrastive Imitation [87.51882102248395]
We study a generative framework that seeks to combine the strengths of both: Motivated by a moment-matching objective to mitigate compounding error, we optimize a local (but forward-looking) transition policy.
At inference, the learned policy serves as the generator for iterative sampling, and the learned energy serves as a trajectory-level measure for evaluating sample quality.
arXiv Detail & Related papers (2023-11-02T16:45:25Z) - Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z) - Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call Koopman Invertible Autoencoders (KIA)
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z) - Explainable Parallel RCNN with Novel Feature Representation for Time Series Forecasting [0.0]
Time series forecasting is a fundamental challenge in data science.
We develop a parallel deep learning framework composed of RNN and CNN.
Extensive experiments on three datasets reveal the effectiveness of our method.
arXiv Detail & Related papers (2023-05-08T17:20:13Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
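The core mechanism here, fitting a linear one-step operator to observed snapshots, is the least-squares heart of dynamic mode decomposition (DMD). The 2x2 system below is a toy illustration, not grid load data; the forced variant in the paper would additionally append known inputs as extra columns.

```python
import numpy as np

# Fit a linear one-step map x_{k+1} ~= A x_k from snapshot pairs,
# the least-squares core of dynamic mode decomposition (DMD).
A_true = np.array([[0.9, -0.2],
                   [0.2,  0.9]])          # damped rotation (near-periodic)
X = [np.array([1.0, 0.0])]
for _ in range(50):
    X.append(A_true @ X[-1])              # roll out the true dynamics
X = np.array(X).T                         # states as columns

A_hat = X[:, 1:] @ np.linalg.pinv(X[:, :-1])  # least-squares operator fit
assert np.allclose(A_hat, A_true)         # exact on noise-free data
```

The eigenvalues of the fitted operator then directly expose frequencies and decay rates, which is the interpretability the summary refers to.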
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Progressive Growing of Neural ODEs [7.558546277131641]
We propose a progressive learning paradigm of NODEs for long-term time series forecasting.
Specifically, following the principle of curriculum learning, we gradually increase the complexity of data and network capacity as training progresses.
Our experiments with both synthetic data and real traffic data (PeMS Bay Area traffic data) show that our training methodology consistently improves the performance of vanilla NODEs by over 64%.
arXiv Detail & Related papers (2020-03-08T01:15:01Z)
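The curriculum idea described above can be sketched as a schedule over the training forecast horizon: start on short rollouts and lengthen them as training progresses. The specific schedule values below are arbitrary illustrations, not the paper's actual settings.

```python
def horizon_schedule(epoch, start=5, growth=2, max_h=80):
    """Curriculum over forecast horizon: double the rollout length
    every 10 epochs, capped at max_h (all values illustrative)."""
    return min(start * growth ** (epoch // 10), max_h)

assert horizon_schedule(0) == 5     # early training: short rollouts
assert horizon_schedule(10) == 10   # horizon doubles after 10 epochs
assert horizon_schedule(100) == 80  # late training: capped at max_h
```

Training loss at each epoch would then be computed only over the first `horizon_schedule(epoch)` forecast steps, mirroring curriculum learning's easy-to-hard progression.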
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.