Koopman AutoEncoder via Singular Value Decomposition for Data-Driven Long-Term Prediction
- URL: http://arxiv.org/abs/2408.11303v1
- Date: Wed, 21 Aug 2024 03:15:37 GMT
- Title: Koopman AutoEncoder via Singular Value Decomposition for Data-Driven Long-Term Prediction
- Authors: Jinho Choi, Sivaram Krishnan, Jihong Park
- Abstract summary: Controlling the eigenvalues of the Koopman operator is challenging due to high computational complexity and the difficulty of managing them during training.
We propose leveraging the singular value decomposition (SVD) of the Koopman matrix to adjust the singular values for better long-term prediction.
Experimental results demonstrate that, during training, the loss term for singular values effectively brings the eigenvalues close to the unit circle, and the proposed approach outperforms existing baseline methods for long-term prediction tasks.
- Score: 31.853422606200382
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The Koopman autoencoder, a data-driven technique, has gained traction for modeling nonlinear dynamics using deep learning methods in recent years. Given the linear characteristics inherent to the Koopman operator, controlling its eigenvalues offers an opportunity to enhance long-term prediction performance, a critical task for forecasting future trends in time-series datasets with long-term behaviors. However, controlling eigenvalues is challenging due to high computational complexity and difficulties in managing them during the training process. To tackle this issue, we propose leveraging the singular value decomposition (SVD) of the Koopman matrix to adjust the singular values for better long-term prediction. Experimental results demonstrate that, during training, the loss term for singular values effectively brings the eigenvalues close to the unit circle, and the proposed approach outperforms existing baseline methods for long-term prediction tasks.
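Below is a minimal PyTorch sketch of the idea described in the abstract: a Koopman autoencoder whose training loss penalizes the distance of the Koopman matrix's singular values from 1. The network sizes, the one-step latent prediction loss, and the loss weighting are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class KoopmanAE(nn.Module):
    """Koopman autoencoder with a penalty pushing the singular values of the
    learned Koopman matrix toward 1 (sketch of the SVD-based regularization)."""
    def __init__(self, state_dim=2, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                     nn.Linear(64, state_dim))
        # Linear Koopman operator acting on the latent (observable) space.
        self.K = nn.Parameter(torch.eye(latent_dim) + 0.01 * torch.randn(latent_dim, latent_dim))

    def forward(self, x_t):
        z_t = self.encoder(x_t)
        z_next = z_t @ self.K.T                        # one-step advance in latent space
        return self.decoder(z_t), self.decoder(z_next)

def training_step(model, x_t, x_next, lam=1.0):
    x_rec, x_pred = model(x_t)
    recon = nn.functional.mse_loss(x_rec, x_t)         # autoencoding loss
    pred = nn.functional.mse_loss(x_pred, x_next)      # one-step prediction loss
    sing = torch.linalg.svdvals(model.K)               # singular values of K
    sv_loss = ((sing - 1.0) ** 2).mean()               # keep them close to 1 (unit circle)
    return recon + pred + lam * sv_loss

# toy usage with random snapshot pairs (x_t, x_{t+1})
model = KoopmanAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_t, x_next = torch.randn(32, 2), torch.randn(32, 2)
loss = training_step(model, x_t, x_next)
loss.backward()
opt.step()
print(float(loss))
```

Constraining singular values rather than eigenvalues keeps the penalty differentiable and cheap (singular values of a real matrix are real and well-conditioned under perturbation), which is the motivation stated in the abstract.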
Related papers
- KODA: A Data-Driven Recursive Model for Time Series Forecasting and Data Assimilation using Koopman Operators [14.429071321401953]
We propose a Koopman operator-based approach that integrates forecasting and data assimilation in nonlinear dynamical systems.
In particular, we use a Fourier domain filter to disentangle the data into a physical component whose dynamics can be accurately represented by a Koopman operator.
We show that KODA outperforms existing state-of-the-art methods on multiple time series benchmarks.
arXiv Detail & Related papers (2024-09-29T02:25:48Z)
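KODA's summary mentions a Fourier-domain filter that separates the data into a "physical" component (to be modeled by a Koopman operator) and a residual. The snippet below is only one plausible reading of that step, a low-pass FFT split; the cutoff fraction is an assumed knob, not KODA's actual filter.

```python
import numpy as np

def fourier_disentangle(x, keep_fraction=0.1):
    """Split a 1-D series into a smooth 'physical' part (low frequencies)
    and a residual, via an FFT mask."""
    X = np.fft.rfft(x)
    cutoff = max(1, int(keep_fraction * len(X)))
    mask = np.zeros_like(X)
    mask[:cutoff] = 1.0                       # keep only the lowest frequencies
    physical = np.fft.irfft(X * mask, n=len(x))
    residual = x - physical
    return physical, residual

t = np.linspace(0, 10, 1000)
series = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.randn(t.size)
phys, res = fourier_disentangle(series)
print(phys.shape, res.shape)
```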
- Analysis of Truncated Singular Value Decomposition for Koopman Operator-Based Lane Change Model [0.0]
Singular Value Decomposition (SVD) is employed to approximate Koopman operators from extensive datasets efficiently.
This study evaluates different basis functions used in EDMD and ranks for truncated SVD for representing lane change behavior models.
The findings, however, suggest that truncated SVD does not necessarily deliver substantial reductions in training time and can incur significant information loss.
arXiv Detail & Related papers (2024-09-27T09:45:21Z)
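Approximating a Koopman matrix from snapshot data with a truncated SVD, as in EDMD, amounts to applying a rank-r pseudo-inverse of the snapshot matrix. A compact NumPy sketch, with an identity dictionary and the rank r treated as assumptions:

```python
import numpy as np

def edmd_truncated_svd(X, Y, r):
    """Estimate K such that Y ≈ K X in a least-squares sense, using a rank-r
    truncated SVD of X. Columns of X, Y are paired snapshots of the (lifted)
    observables at times t and t+1."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    Ur, sr, Vr = U[:, :r], s[:r], Vh[:r, :].conj().T
    # K = Y * pinv_r(X) = Y V_r diag(1/s_r) U_r^*
    return Y @ Vr @ np.diag(1.0 / sr) @ Ur.conj().T

# toy linear system x_{t+1} = A x_t, identity dictionary assumed
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.2], [-0.1, 0.95]])
x = [rng.standard_normal(2)]
for _ in range(200):
    x.append(A @ x[-1])
data = np.array(x).T                  # shape (2, 201)
K = edmd_truncated_svd(data[:, :-1], data[:, 1:], r=2)
print(np.round(K, 3))                 # should be close to A
```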
- Temporally-Consistent Koopman Autoencoders for Forecasting Dynamical Systems [42.6886113798806]
We introduce the Temporally-Consistent Koopman Autoencoder (tcKAE).
tcKAE generates accurate long-term predictions even with constrained and noisy training data.
We demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z)
- Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call Koopman Invertible Autoencoders (KIA).
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
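One way to realize the idea above, using the auxiliary self-supervised error as an extra input when estimating nonconformity scores, is normalized split conformal prediction with a learned scale model. This is a hedged sketch, not the paper's exact recipe; the regressors and the synthetic auxiliary signal are placeholders.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

def split_conformal_with_aux(X_tr, y_tr, aux_tr, X_cal, y_cal, aux_cal, alpha=0.1):
    """Normalized split conformal intervals. aux_* stands in for the per-sample
    error of a self-supervised auxiliary model, used as an extra feature of the
    scale model that normalizes the nonconformity scores."""
    mean_model = GradientBoostingRegressor().fit(X_tr, y_tr)
    resid = np.abs(y_tr - mean_model.predict(X_tr))
    scale_model = GradientBoostingRegressor().fit(np.column_stack([X_tr, aux_tr]), resid)

    # calibration: quantile of normalized nonconformity scores
    sigma_cal = np.maximum(scale_model.predict(np.column_stack([X_cal, aux_cal])), 1e-6)
    scores = np.abs(y_cal - mean_model.predict(X_cal)) / sigma_cal
    q = np.quantile(scores, np.ceil((1 - alpha) * (len(scores) + 1)) / len(scores))

    def predict_interval(X_new, aux_new):
        mu = mean_model.predict(X_new)
        sigma = np.maximum(scale_model.predict(np.column_stack([X_new, aux_new])), 1e-6)
        return mu - q * sigma, mu + q * sigma
    return predict_interval

# toy data; the "auxiliary error" is random here, purely to exercise the interface
X = rng.normal(size=(600, 3)); y = X[:, 0] + 0.5 * rng.normal(size=600)
aux = np.abs(rng.normal(size=600))
interval = split_conformal_with_aux(X[:300], y[:300], aux[:300], X[300:500], y[300:500], aux[300:500])
lo, hi = interval(X[500:], aux[500:])
print(np.mean((y[500:] >= lo) & (y[500:] <= hi)))   # empirical coverage near 1 - alpha
```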
- Modeling Nonlinear Dynamics in Continuous Time with Inductive Biases on Decay Rates and/or Frequencies [37.795752939016225]
We propose a neural network-based model for nonlinear dynamics in continuous time that can impose inductive biases on decay rates and frequencies.
We use neural networks, trained by minimizing multi-step forecasting and backcasting errors on irregularly sampled time-series data, to find an appropriate Koopman space.
arXiv Detail & Related papers (2022-12-26T08:08:43Z)
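Imposing inductive biases on decay rates and frequencies can be illustrated by parameterizing linear latent dynamics through complex eigenvalues lambda = exp((-gamma + i*omega) * dt), with gamma constrained non-negative and bounded. The sketch below follows that reading; the parameterization and bounds are assumptions, not the paper's exact construction.

```python
import torch
import torch.nn as nn

class BiasedLinearDynamics(nn.Module):
    """Continuous-time linear latent dynamics with explicit, constrained
    decay rates and frequencies (an illustrative parameterization)."""
    def __init__(self, n_modes=4, max_decay=1.0):
        super().__init__()
        self.raw_decay = nn.Parameter(torch.zeros(n_modes))    # mapped to gamma >= 0
        self.omega = nn.Parameter(torch.rand(n_modes))         # frequencies (rad / unit time)
        self.max_decay = max_decay

    def eigenvalues(self, dt):
        gamma = self.max_decay * torch.sigmoid(self.raw_decay)   # bounded decay rates
        return torch.exp((-gamma + 1j * self.omega) * dt)        # |lambda| <= 1 by construction

    def advance(self, z, dt):
        """Advance complex modal coordinates z by a (possibly irregular) step dt."""
        return z * self.eigenvalues(dt)

dyn = BiasedLinearDynamics()
z0 = torch.randn(4, dtype=torch.cfloat)
z1 = dyn.advance(z0, dt=0.37)          # irregular sampling is handled naturally
print(z1.abs() <= z0.abs() + 1e-6)     # decay-only bias: magnitudes never grow
```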
- DeepVol: Volatility Forecasting from High-Frequency Data with Dilated Causal Convolutions [53.37679435230207]
We propose DeepVol, a model based on Dilated Causal Convolutions that uses high-frequency data to forecast day-ahead volatility.
Our empirical results suggest that the proposed deep learning-based approach effectively learns global features from high-frequency data.
arXiv Detail & Related papers (2022-09-23T16:13:47Z)
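Dilated causal convolutions let a model aggregate long high-frequency windows without looking into the future. The generic building block, left-only padding plus exponentially growing dilation, is sketched below; channel sizes, depth, and the 390-step intraday layout are assumptions rather than DeepVol's actual configuration.

```python
import torch
import torch.nn as nn

class DilatedCausalConv(nn.Module):
    """Stack of 1-D convolutions whose left padding keeps them causal and whose
    dilation doubles per layer, so the receptive field grows exponentially."""
    def __init__(self, in_ch=1, hidden=16, layers=4, kernel=3):
        super().__init__()
        blocks = []
        for i in range(layers):
            d = 2 ** i
            blocks += [nn.ConstantPad1d(((kernel - 1) * d, 0), 0.0),   # pad only the past
                       nn.Conv1d(in_ch if i == 0 else hidden, hidden, kernel, dilation=d),
                       nn.ReLU()]
        self.net = nn.Sequential(*blocks)
        self.head = nn.Linear(hidden, 1)     # e.g. day-ahead volatility from the last step

    def forward(self, x):                    # x: (batch, channels, time)
        h = self.net(x)
        return self.head(h[:, :, -1])        # use the most recent causal representation

model = DilatedCausalConv()
intraday = torch.randn(8, 1, 390)            # e.g. 390 one-minute returns per day (assumed)
print(model(intraday).shape)                 # -> torch.Size([8, 1])
```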
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
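Modeling observed dynamics as a forced linear system can be sketched as a least-squares fit of x_{t+1} ≈ A x_t + B u_t from snapshot data, in the spirit of DMD with control. The example below is a bare-bones illustration and omits the paper's ensemble and stochastic-forcing machinery; the toy system is invented.

```python
import numpy as np

def fit_forced_linear(X, X_next, U):
    """Least-squares estimate of A, B in x_{t+1} = A x_t + B u_t.
    X, X_next: (n, T) state snapshots; U: (m, T) forcing inputs."""
    Omega = np.vstack([X, U])                  # stack state and input
    G = X_next @ np.linalg.pinv(Omega)         # [A B] = X' pinv([X; U])
    n = X.shape[0]
    return G[:, :n], G[:, n:]

rng = np.random.default_rng(0)
A_true = np.array([[0.95, 0.1], [-0.1, 0.95]])
B_true = np.array([[0.5], [0.0]])
x = np.zeros((2, 300)); u = rng.standard_normal((1, 300))
for t in range(299):
    x[:, t + 1] = A_true @ x[:, t] + (B_true @ u[:, t:t + 1]).ravel()
A_hat, B_hat = fit_forced_linear(x[:, :-1], x[:, 1:], u[:, :-1])
print(np.round(A_hat, 2), np.round(B_hat, 2))
```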
- Evaluating Prediction-Time Batch Normalization for Robustness under Covariate Shift [81.74795324629712]
We evaluate a method we call prediction-time batch normalization, which significantly improves model accuracy and calibration under covariate shift.
We show that prediction-time batch normalization provides complementary benefits to existing state-of-the-art approaches for improving robustness.
The method has mixed results when used alongside pre-training, and does not seem to perform as well under more natural types of dataset shift.
arXiv Detail & Related papers (2020-06-19T05:08:43Z)
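Prediction-time batch normalization recomputes normalization statistics from the current test batch instead of using the running averages collected during training. In PyTorch this can be sketched by switching only the BatchNorm layers to batch statistics at inference; the toy model and shifted batch below are placeholders.

```python
import torch
import torch.nn as nn

def predict_with_batch_stats(model, x):
    """Run inference while BatchNorm layers use statistics of the current batch
    (prediction-time BN) rather than the stored running statistics."""
    model.eval()
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.train()                        # use batch statistics
            m.track_running_stats = False    # and do not overwrite the stored averages
    with torch.no_grad():
        return model(x)

# toy usage: a model evaluated on a covariate-shifted batch (illustrative shift)
net = nn.Sequential(nn.Linear(10, 32), nn.BatchNorm1d(32), nn.ReLU(), nn.Linear(32, 2))
shifted_batch = torch.randn(64, 10) * 3.0 + 1.0
logits = predict_with_batch_stats(net, shifted_batch)
print(logits.shape)
```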
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
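The forward-and-backward idea shared by this entry and KIA above can be sketched with two latent operators, one advancing and one rewinding the dynamics, trained with prediction losses in both directions plus a penalty encouraging the two operators to invert each other. The sizes and the exact consistency term below are illustrative assumptions, not the papers' formulations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ForwardBackwardKoopmanAE(nn.Module):
    """Autoencoder with separate forward (C) and backward (D) latent operators,
    regularized so that D approximately inverts C."""
    def __init__(self, state_dim=3, latent_dim=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(state_dim, 32), nn.Tanh(), nn.Linear(32, latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim, 32), nn.Tanh(), nn.Linear(32, state_dim))
        self.C = nn.Parameter(torch.eye(latent_dim))   # forward operator
        self.D = nn.Parameter(torch.eye(latent_dim))   # backward operator

def losses(model, x_prev, x_t, x_next, kappa=0.1):
    z = model.enc(x_t)
    fwd = F.mse_loss(model.dec(z @ model.C.T), x_next)   # predict one step forward
    bwd = F.mse_loss(model.dec(z @ model.D.T), x_prev)   # predict one step backward
    rec = F.mse_loss(model.dec(z), x_t)                  # reconstruct the current state
    eye = torch.eye(model.C.shape[0])
    consist = F.mse_loss(model.D @ model.C, eye) + F.mse_loss(model.C @ model.D, eye)
    return rec + fwd + bwd + kappa * consist

model = ForwardBackwardKoopmanAE()
x_prev, x_t, x_next = (torch.randn(16, 3) for _ in range(3))
print(float(losses(model, x_prev, x_t, x_next)))
```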