Eigenvalue initialisation and regularisation for Koopman autoencoders
- URL: http://arxiv.org/abs/2212.12086v2
- Date: Mon, 26 Dec 2022 02:59:07 GMT
- Title: Eigenvalue initialisation and regularisation for Koopman autoencoders
- Authors: Jack W. Miller, Charles O'Neill, Navid C. Constantinou, and Omri
Azencot
- Abstract summary: We study the Koopman autoencoder model which includes an encoder, a Koopman operator layer, and a decoder.
We propose the "eigeninit" initialisation scheme that samples initial Koopman operators from specific eigenvalue distributions.
In addition, we suggest the "eigenloss" penalty scheme that penalises the eigenvalues of the Koopman operator during training.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Regularising the parameter matrices of neural networks is ubiquitous in
training deep models. Typical regularisation approaches suggest initialising
weights with small random values and penalising weights to promote sparsity.
However, these widely used techniques may be less effective in certain
scenarios. Here, we study the Koopman autoencoder model which includes an
encoder, a Koopman operator layer, and a decoder. These models have been
designed and dedicated to tackle physics-related problems with interpretable
dynamics and an ability to incorporate physics-related constraints. However,
the majority of existing work employs standard regularisation practices. In our
work, we take a step toward augmenting Koopman autoencoders with initialisation
and penalty schemes tailored for physics-related settings. Specifically, we
propose the "eigeninit" initialisation scheme that samples initial Koopman
operators from specific eigenvalue distributions. In addition, we suggest the
"eigenloss" penalty scheme that penalises the eigenvalues of the Koopman
operator during training. We demonstrate the utility of these schemes on two
synthetic data sets: a driven pendulum and flow past a cylinder; and two
real-world problems: ocean surface temperatures and cyclone wind fields. We
find on these datasets that eigenloss and eigeninit improve the convergence
rate by up to a factor of 5, and that they reduce the cumulative long-term
prediction error by up to a factor of 3. Such a finding points to the utility
of incorporating similar schemes as an inductive bias in other physics-related
deep learning approaches.
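The paper's exact eigenvalue distributions and penalty forms are not reproduced in this summary; as a minimal NumPy sketch of the general idea (the [0.9, 1.0] radius range and squared-deviation penalty below are illustrative assumptions, not the paper's choices), an "eigeninit" draws eigenvalues near the unit circle and an "eigenloss" penalises moduli that drift away from it:

```python
import numpy as np

def eigeninit(n, rng=None):
    """Build a real n x n Koopman matrix whose eigenvalues are sampled
    near the unit circle (assumed form: radii uniform in [0.9, 1.0],
    angles uniform in [0, pi]), using 2x2 rotation-scaling blocks for
    each complex-conjugate eigenvalue pair."""
    rng = np.random.default_rng(rng)
    K = np.zeros((n, n))
    i = 0
    while i < n - 1:
        r = rng.uniform(0.9, 1.0)          # eigenvalue modulus
        theta = rng.uniform(0.0, np.pi)    # eigenvalue angle
        c, s = r * np.cos(theta), r * np.sin(theta)
        # eigenvalues of this block are c +/- i*s, with modulus r
        K[i:i + 2, i:i + 2] = [[c, -s], [s, c]]
        i += 2
    if i < n:  # odd dimension: one real eigenvalue
        K[i, i] = rng.uniform(0.9, 1.0)
    return K

def eigenloss(K, weight=1.0):
    """One plausible penalty: squared deviation of each eigenvalue
    modulus from the unit circle."""
    moduli = np.abs(np.linalg.eigvals(K))
    return weight * np.sum((moduli - 1.0) ** 2)

K = eigeninit(4, rng=0)
loss = eigenloss(K)  # small by construction, since moduli start in [0.9, 1.0]
```

In training, such a penalty would be added to the reconstruction and prediction losses, biasing the learned operator toward stable, near-conservative dynamics.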
Related papers
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
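The KoVAE architecture itself is not detailed in this blurb; as a generic illustration of a linear latent prior (all names below are hypothetical), the prior dynamics amount to repeatedly applying one matrix to the latent state:

```python
import numpy as np

def rollout(A, z0, steps):
    """Advance a latent state under linear prior dynamics
    z_{t+1} = A @ z_t, returning the full trajectory."""
    zs = [z0]
    for _ in range(steps):
        zs.append(A @ zs[-1])
    return np.stack(zs)

A = np.array([[0.0, -1.0],
              [1.0, 0.0]])        # 90-degree rotation as a toy linear map
traj = rollout(A, np.array([1.0, 0.0]), 4)
# four 90-degree rotations return the latent state to its start
```

The appeal of such a prior is that the entire trajectory is governed by one matrix, whose eigenvalues directly describe the long-term behaviour of the latent dynamics.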
arXiv Detail & Related papers (2023-10-04T07:14:43Z) - Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors [85.22004745984253]
Real-world time series are characterized by intrinsic non-stationarity that poses a principal challenge for deep forecasting models.
We tackle non-stationary time series with modern Koopman theory that fundamentally considers the underlying time-variant dynamics.
We propose Koopa as a novel Koopman forecaster composed of stackable blocks that learn hierarchical dynamics.
arXiv Detail & Related papers (2023-05-30T07:40:27Z) - Disentanglement via Latent Quantization [60.37109712033694]
In this work, we construct an inductive bias towards encoding to and decoding from an organized latent space.
We demonstrate the broad applicability of this approach by adding it to both basic data-reconstructing (vanilla autoencoder) and latent-reconstructing (InfoGAN) generative models.
arXiv Detail & Related papers (2023-05-28T06:30:29Z) - Physics-Informed Koopman Network [14.203407036091555]
We propose a novel architecture inspired by physics-informed neural networks to represent Koopman operators.
We demonstrate that it not only reduces the need of large training data-sets, but also maintains high effectiveness in approximating Koopman eigenfunctions.
arXiv Detail & Related papers (2022-11-17T08:57:57Z) - Bias-Variance Tradeoffs in Single-Sample Binary Gradient Estimators [100.58924375509659]
The straight-through (ST) estimator gained popularity due to its simplicity and efficiency.
Several techniques were proposed to improve over ST while keeping the same low computational complexity.
We conduct a theoretical analysis of the bias and variance of these methods to understand tradeoffs and verify originally claimed properties.
arXiv Detail & Related papers (2021-10-07T15:16:07Z) - Deep Identification of Nonlinear Systems in Koopman Form [0.0]
The present paper treats the identification of nonlinear dynamical systems using Koopman-based deep state-space encoders.
An input-affine formulation is considered for the lifted model structure and we address both full and partial state availability.
arXiv Detail & Related papers (2021-10-06T08:50:56Z) - Stochastic Adversarial Koopman Model for Dynamical Systems [0.4061135251278187]
This paper extends a recently developed adversarial Koopman model to stochastic space, where the Koopman operator applies to the probability distribution of the latent encoding of an encoder.
The efficacy of the Koopman model is demonstrated on different test problems in chaos, fluid dynamics, combustion, and reaction-diffusion models.
arXiv Detail & Related papers (2021-09-10T20:17:44Z) - Autoencoding Variational Autoencoder [56.05008520271406]
We study the implications of this behaviour on the learned representations and also the consequences of fixing it by introducing a notion of self consistency.
We show that encoders trained with our self-consistency approach lead to representations that are robust (insensitive) to perturbations in the input introduced by adversarial attacks.
arXiv Detail & Related papers (2020-12-07T14:16:14Z) - Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.