Forecasting Sequential Data using Consistent Koopman Autoencoders
- URL: http://arxiv.org/abs/2003.02236v2
- Date: Tue, 30 Jun 2020 06:22:57 GMT
- Title: Forecasting Sequential Data using Consistent Koopman Autoencoders
- Authors: Omri Azencot and N. Benjamin Erichson and Vanessa Lin and Michael W.
Mahoney
- Abstract summary: A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
- Score: 52.209416711500005
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recurrent neural networks are widely used on time series data, yet such
models often ignore the underlying physical structures in such sequences. A new
class of physics-based methods related to Koopman theory has been introduced,
offering an alternative for processing nonlinear dynamical systems. In this
work, we propose a novel Consistent Koopman Autoencoder model which, unlike the
majority of existing work, leverages the forward and backward dynamics. Key to
our approach is a new analysis which explores the interplay between consistent
dynamics and their associated Koopman operators. Our network is directly
related to the derived analysis, and its computational requirements are
comparable to other baselines. We evaluate our method on a wide range of
high-dimensional and short-term dependent problems, and it achieves accurate
estimates for significant prediction horizons, while also being robust to
noise.
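To make the core idea concrete, the following is a minimal PyTorch-style sketch of a Koopman autoencoder with paired forward and backward operators and a consistency penalty. It is a sketch under stated assumptions, not the authors' implementation: the layer sizes, the loss weights, and the simplified penalty that pushes the two operators toward being inverses of each other are illustrative choices.

```python
import torch
import torch.nn as nn

class ConsistentKoopmanAE(nn.Module):
    """Minimal sketch: an encoder maps an observation x_t to a latent code y_t,
    a linear operator C advances the code one step forward in time, a second
    linear operator D takes it one step backward, and a decoder maps codes
    back to observations."""

    def __init__(self, obs_dim: int, latent_dim: int, hidden: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.Tanh(), nn.Linear(hidden, latent_dim)
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(), nn.Linear(hidden, obs_dim)
        )
        self.C = nn.Linear(latent_dim, latent_dim, bias=False)  # forward Koopman operator
        self.D = nn.Linear(latent_dim, latent_dim, bias=False)  # backward Koopman operator

    def loss(self, x_prev, x_curr, x_next, lam: float = 0.01):
        y = self.encoder(x_curr)
        recon = (self.decoder(y) - x_curr).pow(2).mean()         # autoencoding term
        fwd = (self.decoder(self.C(y)) - x_next).pow(2).mean()   # one-step forward prediction
        bwd = (self.decoder(self.D(y)) - x_prev).pow(2).mean()   # one-step backward prediction
        # Simplified consistency penalty: applying C and then D (or D and then C)
        # should approximately return the original latent code.
        eye = torch.eye(self.C.out_features, device=x_curr.device)
        cons = (self.D.weight @ self.C.weight - eye).pow(2).sum() \
             + (self.C.weight @ self.D.weight - eye).pow(2).sum()
        return recon + fwd + bwd + lam * cons
```

A training step would sample triples (x_{t-1}, x_t, x_{t+1}) from each trajectory and minimize model.loss(x_prev, x_curr, x_next); at inference time, repeated application of C in latent space followed by the decoder yields multi-step forecasts.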
Related papers
- Temporally-Consistent Koopman Autoencoders for Forecasting Dynamical Systems [42.6886113798806]
We introduce the Temporally-Consistent Koopman Autoencoder (tcKAE).
tcKAE generates accurate long-term predictions even with constrained and noisy training data.
We demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
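(A minimal sketch of such a linear latent prior appears after this list.)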
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call Koopman Invertible Autoencoders (KIA).
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z)
- Neural Koopman prior for data assimilation [7.875955593012905]
We use a neural network architecture to embed dynamical systems in latent spaces.
We introduce methods that make it possible to train such a model for long-term continuous reconstruction.
The potential for self-supervised learning is also demonstrated, as we show the promising use of trained dynamical models as priors for variational data assimilation techniques.
arXiv Detail & Related papers (2023-09-11T09:04:36Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors [85.22004745984253]
Real-world time series are characterized by intrinsic non-stationarity that poses a principal challenge for deep forecasting models.
We tackle non-stationary time series with modern Koopman theory that fundamentally considers the underlying time-variant dynamics.
We propose Koopa as a novel Koopman forecaster composed of stackable blocks that learn hierarchical dynamics.
arXiv Detail & Related papers (2023-05-30T07:40:27Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Amortized Bayesian Inference for Models of Cognition [0.1529342790344802]
Recent advances in simulation-based inference using specialized neural network architectures circumvent many previous problems of approximate Bayesian computation.
We provide a general introduction to amortized Bayesian parameter estimation and model comparison.
arXiv Detail & Related papers (2020-05-08T08:12:15Z)
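The Koopman VAE entry above notes that the latent conditional prior dynamics are represented by a linear map; a minimal sketch of that idea follows. The class name, tensor shapes, and Gaussian transition density are illustrative assumptions, not the KoVAE implementation.

```python
import torch
import torch.nn as nn

class LinearLatentPrior(nn.Module):
    """Sketch of a Koopman-style linear prior over latent trajectories:
    p(z_{t+1} | z_t) = N(A z_t, diag(sigma^2))."""

    def __init__(self, latent_dim: int):
        super().__init__()
        self.A = nn.Linear(latent_dim, latent_dim, bias=False)   # linear (Koopman-style) transition map
        self.log_sigma = nn.Parameter(torch.zeros(latent_dim))   # learned per-dimension noise scale

    def log_prob(self, z):
        # z: (batch, time, latent_dim) latent samples from the encoder
        mean = self.A(z[:, :-1])                                  # predicted next latent states
        dist = torch.distributions.Normal(mean, self.log_sigma.exp())
        return dist.log_prob(z[:, 1:]).sum(dim=(-1, -2))          # log-density of all transitions
```

In a VAE objective, a term like this would replace the usual factorized standard-normal prior, scoring how well consecutive posterior samples follow the learned linear dynamics.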
This list is automatically generated from the titles and abstracts of the papers listed on this site.