Autoencoding for the 'Good Dictionary' of eigen pairs of the Koopman
Operator
- URL: http://arxiv.org/abs/2306.05224v1
- Date: Thu, 8 Jun 2023 14:21:01 GMT
- Title: Autoencoding for the 'Good Dictionary' of eigen pairs of the Koopman
Operator
- Authors: Neranjaka Jayarathne and Erik M. Bollt
- Abstract summary: This paper proposes using deep autoencoders, a type of deep learning technique, to perform non-linear geometric transformations on raw data before computing Koopman eigen vectors.
To handle high-dimensional time series data, Takens's time delay embedding is presented as a pre-processing technique.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reduced order modelling relies on representing complex dynamical systems
using simplified modes, which can be achieved through Koopman operator
analysis. However, computing Koopman eigen pairs for high-dimensional
observable data can be inefficient. This paper proposes using deep
autoencoders, a type of deep learning technique, to perform non-linear
geometric transformations on raw data before computing Koopman eigen vectors.
The encoded data produced by the deep autoencoder is diffeomorphic to a
manifold of the dynamical system, and has a significantly lower dimension than
the raw data. To handle high-dimensional time series data, Takens's time delay
embedding is presented as a pre-processing technique. The paper concludes by
presenting examples of these techniques in action.
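To make the proposed pipeline concrete, here is a minimal sketch, not the authors' implementation: it delay-embeds a scalar series with Takens's method, trains a small PyTorch autoencoder whose architecture, latent dimension, and training loop are illustrative assumptions, and then estimates Koopman eigen pairs from the encoded snapshots with a plain EDMD-style least-squares fit.

```python
# Minimal sketch (not the paper's code): Takens delay embedding, a small
# autoencoder for nonlinear dimension reduction, then an EDMD-style
# least-squares estimate of Koopman eigen pairs on the encoded snapshots.
import numpy as np
import torch
import torch.nn as nn

def takens_embedding(x, dim, tau):
    """Stack delayed copies of the scalar series x: row t is
    [x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau}]."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

x = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.01 * np.random.randn(4000)
H = takens_embedding(x, dim=10, tau=5)            # (3955, 10) snapshot matrix

# Hypothetical autoencoder: 10-dimensional delay vectors -> 2-dim latent.
enc = nn.Sequential(nn.Linear(10, 32), nn.Tanh(), nn.Linear(32, 2))
dec = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 10))
opt = torch.optim.Adam([*enc.parameters(), *dec.parameters()], lr=1e-3)
T = torch.tensor(H, dtype=torch.float32)
for _ in range(2000):                             # plain reconstruction loss
    opt.zero_grad()
    loss = nn.functional.mse_loss(dec(enc(T)), T)
    loss.backward()
    opt.step()

with torch.no_grad():
    Z = enc(T).numpy()                            # encoded, lower-dim snapshots

# EDMD on the latent snapshots: fit K with Z[t+1] ~= Z[t] @ K, eigendecompose.
Z0, Z1 = Z[:-1], Z[1:]
K = np.linalg.lstsq(Z0, Z1, rcond=None)[0]        # (2, 2) latent Koopman matrix
eigvals, eigvecs = np.linalg.eig(K.T)             # approximate eigen pairs
print("Approximate Koopman eigenvalues:", eigvals)
```

Here the eigenvalues of the fitted latent transition matrix stand in for the Koopman eigenvalues; the paper's actual eigen-pair computation on the encoded manifold may differ.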
Related papers
- The Persian Rug: solving toy models of superposition using large-scale symmetries [0.0]
We present a complete mechanistic description of the algorithm learned by a minimal non-linear sparse data autoencoder in the limit of large input dimension.
Our work contributes to neural network interpretability by introducing techniques for understanding the structure of autoencoders.
arXiv Detail & Related papers (2024-10-15T22:52:45Z)
- Compression of the Koopman matrix for nonlinear physical models via hierarchical clustering [0.0]
The linear characteristics of the Koopman operator are promising for understanding nonlinear dynamics.
In this work, we propose a method to compress the Koopman matrix using hierarchical clustering.
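A rough sketch of the idea, under the assumption (not stated in the summary) that compression means grouping similar rows of the Koopman matrix hierarchically and replacing each group by its centroid; the actual algorithm in the paper may differ:

```python
# Hypothetical sketch: compress a Koopman matrix by hierarchically
# clustering its rows and replacing each cluster by its mean row.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
K = rng.standard_normal((100, 100))      # stand-in for a Koopman matrix

labels = fcluster(linkage(K, method="ward"), t=20, criterion="maxclust")
K_small = np.stack([K[labels == c].mean(axis=0) for c in np.unique(labels)])
print(K_small.shape)                     # (<=20, 100): compressed row set
```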
arXiv Detail & Related papers (2024-03-27T01:18:00Z)
- HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization [57.798070356553936]
HETFORMER is a Transformer-based pre-trained model with multi-granularity sparse attentions for extractive summarization.
Experiments on both single- and multi-document summarization tasks show that HETFORMER achieves state-of-the-art performance in ROUGE F1.
arXiv Detail & Related papers (2021-10-12T22:42:31Z)
- Deep Identification of Nonlinear Systems in Koopman Form [0.0]
The present paper treats the identification of nonlinear dynamical systems using Koopman-based deep state-space encoders.
An input-affine formulation is considered for the lifted model structure, and both full and partial state availability are addressed.
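"Input-affine" suggests a lifted model in which the lifted state evolves linearly and the input enters through its own affine term. A hypothetical PyTorch module along those lines (the names `phi`, `A`, `B` and all layer sizes are illustrative, not taken from the paper):

```python
import torch.nn as nn

class KoopmanInputAffine(nn.Module):
    """Hypothetical lifted model: z = phi(x), z_next = A z + B u."""
    def __init__(self, nx, nu, nz):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(nx, 64), nn.Tanh(),
                                 nn.Linear(64, nz))      # deep state encoder
        self.A = nn.Linear(nz, nz, bias=False)           # linear lifted dynamics
        self.B = nn.Linear(nu, nz, bias=False)           # affine input term

    def forward(self, x, u):
        z = self.phi(x)
        return self.A(z) + self.B(u)                     # next lifted state
```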
arXiv Detail & Related papers (2021-10-06T08:50:56Z)
- Autoencoding Variational Autoencoder [56.05008520271406]
We study the implications of this behaviour on the learned representations and also the consequences of fixing it by introducing a notion of self consistency.
We show that encoders trained with our self-consistency approach lead to representations that are robust (insensitive) to perturbations in the input introduced by adversarial attacks.
arXiv Detail & Related papers (2020-12-07T14:16:14Z)
- Category-Learning with Context-Augmented Autoencoder [63.05016513788047]
Finding an interpretable non-redundant representation of real-world data is one of the key problems in Machine Learning.
We propose a novel method of using data augmentations when training autoencoders.
We train a Variational Autoencoder so that the outcome of a transformation is predictable by an auxiliary network.
arXiv Detail & Related papers (2020-10-10T14:04:44Z)
- Multivariate Temporal Autoencoder for Predictive Reconstruction of Deep Sequences [0.0]
Time series prediction and modelling have proven to be a challenging endeavor on real-world datasets.
Two key issues are the multi-dimensionality of data and the interaction of independent dimensions forming a latent output signal.
This paper proposes a multi-branch deep neural network approach to tackling the aforementioned problems by modelling a latent state vector representation of data windows.
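One plausible reading of the multi-branch design (an assumption, not the paper's stated architecture): give each input dimension its own branch and merge the branch features into a single latent state vector per data window.

```python
# Hypothetical multi-branch encoder: one branch per input dimension,
# merged into a latent state vector for a window of multivariate data.
import torch
import torch.nn as nn

class MultiBranchEncoder(nn.Module):
    def __init__(self, n_dims, window, nz):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Sequential(nn.Linear(window, 16), nn.ReLU())
             for _ in range(n_dims)])
        self.merge = nn.Linear(16 * n_dims, nz)

    def forward(self, x):                 # x: (batch, n_dims, window)
        h = torch.cat([b(x[:, i]) for i, b in enumerate(self.branches)], dim=-1)
        return self.merge(h)              # latent state vector
```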
arXiv Detail & Related papers (2020-10-07T21:25:35Z)
- Cross-Thought for Sentence Encoder Pre-training [89.32270059777025]
Cross-Thought is a novel approach to pre-training a sequence encoder.
We train a Transformer-based sequence encoder over a large set of short sequences.
Experiments on question answering and textual entailment tasks demonstrate that our pre-trained encoder can outperform state-of-the-art encoders.
arXiv Detail & Related papers (2020-10-07T21:02:41Z)
- Representation Learning for Sequence Data with Deep Autoencoding Predictive Components [96.42805872177067]
We propose a self-supervised representation learning method for sequence data, based on the intuition that useful representations of sequence data should exhibit a simple structure in the latent space.
We encourage this latent structure by maximizing an estimate of predictive information of latent feature sequences, which is the mutual information between past and future windows at each time step.
We demonstrate that our method recovers the latent space of noisy dynamical systems, extracts predictive features for forecasting tasks, and improves automatic speech recognition when used to pretrain the encoder on large amounts of unlabeled data.
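Under a joint Gaussian assumption, that mutual information has a closed form in log-determinants of window covariances. The estimator below is one plausible sketch (the windowing, regularization, and Gaussian assumption are illustrative choices, not necessarily the paper's exact objective):

```python
# Hedged sketch: Gaussian estimate of predictive information between
# past and future windows of a latent sequence z of shape (T, d).
import torch

def gaussian_predictive_info(z, window):
    T, d = z.shape
    # Each row stacks a past window and the immediately following future window.
    pairs = torch.stack([z[t - window : t + window].reshape(-1)
                         for t in range(window, T - window + 1)])
    past, future = pairs[:, : window * d], pairs[:, window * d :]

    def cov(a):
        a = a - a.mean(dim=0, keepdim=True)
        return (a.T @ a) / (a.shape[0] - 1) + 1e-4 * torch.eye(a.shape[1])

    # I(past; future) = 0.5 * (logdet C_p + logdet C_f - logdet C_joint)
    return 0.5 * (torch.logdet(cov(past)) + torch.logdet(cov(future))
                  - torch.logdet(cov(pairs)))
```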
arXiv Detail & Related papers (2020-10-07T03:34:01Z)
- Gradient Origin Networks [8.952627620898074]
This paper proposes a new type of generative model that is able to quickly learn a latent representation without an encoder.
Experiments show that the proposed method converges faster, with significantly lower reconstruction error than autoencoders, while requiring half the parameters.
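The encoder-free step can be pictured as follows: initialize the latent at the origin, take one gradient of the reconstruction loss with respect to it, and use the negative gradient as the latent code for a second decoding pass. The helper below is a hypothetical sketch of that idea, not the authors' code:

```python
# Hedged sketch of the gradient-origin idea: the latent is the negative
# gradient of the reconstruction loss at a zero latent, so no encoder is used.
import torch
import torch.nn.functional as F

def gradient_origin_latent(decoder, x, nz):
    z0 = torch.zeros(x.shape[0], nz, requires_grad=True)
    loss = F.mse_loss(decoder(z0), x)
    (grad,) = torch.autograd.grad(loss, z0)
    return -grad        # decode this in a second pass: x_hat = decoder(-grad)
```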
arXiv Detail & Related papers (2020-07-06T15:00:11Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
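One way to read "leverages the forward and backward dynamics": learn separate forward and backward latent operators and penalize their deviation from being mutual inverses. The penalty form below is an assumption, sketched for illustration:

```python
# Hedged sketch of a forward/backward consistency penalty: C advances the
# latent state, D reverses it, and the loss pushes them toward C = D^{-1}.
import torch

def consistency_loss(C, D):
    I = torch.eye(C.shape[0])
    return (torch.linalg.matrix_norm(D @ C - I) ** 2
            + torch.linalg.matrix_norm(C @ D - I) ** 2)
```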
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.