Deep Identification of Nonlinear Systems in Koopman Form
- URL: http://arxiv.org/abs/2110.02583v1
- Date: Wed, 6 Oct 2021 08:50:56 GMT
- Title: Deep Identification of Nonlinear Systems in Koopman Form
- Authors: Lucian Cristian Iacob, Gerben Izaak Beintema, Maarten Schoukens and Roland Tóth
- Abstract summary: The present paper treats the identification of nonlinear dynamical systems using Koopman-based deep state-space encoders.
An input-affine formulation is considered for the lifted model structure and we address both full and partial state availability.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The present paper treats the identification of nonlinear dynamical systems
using Koopman-based deep state-space encoders. Through this method, the usual
drawback of needing to choose a dictionary of lifting functions a priori is
circumvented. The encoder represents the lifting function to the space where
the dynamics are linearly propagated using the Koopman operator. An
input-affine formulation is considered for the lifted model structure and we
address both full and partial state availability. The approach is implemented
using the deepSI toolbox in Python. To lower the computational need of the
simulation error-based training, the data is split into subsections where
multi-step prediction errors are calculated independently. This formulation
allows for efficient batch optimization of the network parameters and, at the
same time, excellent long term prediction capabilities of the obtained models.
The performance of the approach is illustrated by nonlinear benchmark examples.
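To make the model structure concrete, below is a minimal PyTorch-style sketch (not the deepSI implementation) of a Koopman deep state-space encoder with input-affine lifted dynamics, trained on subsection-wise multi-step prediction errors. The layer sizes, the past I/O window length, the constant input matrix B, and the plain MSE loss are illustrative assumptions rather than the paper's exact choices.

```python
# Minimal sketch of a Koopman-style deep state-space encoder (assumed
# architecture, not the deepSI API): an encoder lifts past data to z_k,
# the lifted dynamics are linear in z and affine in u, and the loss is a
# multi-step prediction error computed independently per subsection.
import torch
import torch.nn as nn

class KoopmanEncoderModel(nn.Module):
    def __init__(self, n_y, n_u, n_z, past_window=10, hidden=64):
        super().__init__()
        # Encoder: maps a window of past inputs/outputs to the lifted state z_k
        # (partial state availability case; with full state access the encoder
        #  input would simply be the measured state x_k).
        self.encoder = nn.Sequential(
            nn.Linear(past_window * (n_y + n_u), hidden), nn.Tanh(),
            nn.Linear(hidden, n_z),
        )
        # Lifted, input-affine dynamics: z_{k+1} = A z_k + B u_k, y_k = C z_k.
        self.A = nn.Linear(n_z, n_z, bias=False)
        self.B = nn.Linear(n_u, n_z, bias=False)
        self.C = nn.Linear(n_z, n_y, bias=False)

    def forward(self, past_io, u_future):
        # past_io:  (batch, past_window * (n_y + n_u))  encoder input
        # u_future: (batch, T, n_u)                      inputs over the subsection
        z = self.encoder(past_io)
        y_pred = []
        for t in range(u_future.shape[1]):
            y_pred.append(self.C(z))
            z = self.A(z) + self.B(u_future[:, t])
        return torch.stack(y_pred, dim=1)              # (batch, T, n_y)

def multistep_loss(model, past_io, u_future, y_future):
    # Multi-step (simulation-like) prediction error over one batch of
    # subsections; each subsection is propagated independently from its
    # own encoded initial lifted state.
    y_pred = model(past_io, u_future)
    return torch.mean((y_pred - y_future) ** 2)

# One illustrative optimization step over a batch of subsections:
# model = KoopmanEncoderModel(n_y=1, n_u=1, n_z=20)
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# loss = multistep_loss(model, past_io_batch, u_batch, y_batch)
# opt.zero_grad(); loss.backward(); opt.step()
```

In practice, a long measured record would be split into many (possibly overlapping) subsections of length T, so that the multi-step errors of the subsections can be evaluated independently and batched in a single optimizer step, which is what enables the efficient batch optimization mentioned in the abstract.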
Related papers
- Koopman-based Deep Learning for Nonlinear System Estimation [1.3791394805787949]
We present a novel data-driven linear estimator based on Koopman operator theory to extract meaningful finite-dimensional representations of complex non-linear systems.
Our estimator is also adaptive to a diffeomorphic transformation of the estimated nonlinear system, which enables it to compute optimal state estimates without re-learning.
arXiv Detail & Related papers (2024-05-01T16:49:54Z) - Compression of the Koopman matrix for nonlinear physical models via hierarchical clustering [0.0]
The linear characteristics of the Koopman operator offer a promising route to understanding nonlinear dynamics.
In this work, we propose a method to compress the Koopman matrix using hierarchical clustering.
arXiv Detail & Related papers (2024-03-27T01:18:00Z) - Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
arXiv Detail & Related papers (2023-07-06T15:19:53Z) - Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL) a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
arXiv Detail & Related papers (2023-06-14T11:41:42Z) - Autoencoding for the 'Good Dictionary' of eigen pairs of the Koopman
Operator [0.0]
This paper proposes using deep autoencoders, a type of deep learning technique, to perform non-linear geometric transformations on raw data before computing Koopman eigenvectors.
To handle high-dimensional time series data, Takens's time delay embedding is presented as a pre-processing technique.
arXiv Detail & Related papers (2023-06-08T14:21:01Z) - Low-rank extended Kalman filtering for online learning of neural
networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
arXiv Detail & Related papers (2023-05-31T03:48:49Z) - Efficient Semi-Implicit Variational Inference [65.07058307271329]
We propose an efficient and scalable semi-implicit variational inference (SIVI) method.
Our method optimizes a surrogate evidence lower bound, enabling efficient gradient-based inference.
arXiv Detail & Related papers (2021-01-15T11:39:09Z) - Autoencoding Variational Autoencoder [56.05008520271406]
We study the implications of this behaviour on the learned representations and also the consequences of fixing it by introducing a notion of self consistency.
We show that encoders trained with our self-consistency approach lead to representations that are robust (insensitive) to perturbations in the input introduced by adversarial attacks.
arXiv Detail & Related papers (2020-12-07T14:16:14Z) - Derivative-Based Koopman Operators for Real-Time Control of Robotic
Systems [14.211417879279075]
This paper presents a generalizable methodology for data-driven identification of nonlinear dynamics that bounds the model error.
We construct a Koopman operator-based linear representation and utilize Taylor series accuracy analysis to derive an error bound.
When combined with control, the Koopman representation of the nonlinear system has marginally better performance than competing nonlinear modeling methods.
arXiv Detail & Related papers (2020-10-12T15:15:13Z) - Relative gradient optimization of the Jacobian term in unsupervised deep
learning [9.385902422987677]
Learning expressive probabilistic models correctly describing the data is a ubiquitous problem in machine learning.
Deep density models have been widely used for this task, but their maximum likelihood based training requires estimating the log-determinant of the Jacobian.
We propose a new approach for exact training of such neural networks.
arXiv Detail & Related papers (2020-06-26T16:41:08Z) - Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.
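For contrast with the learned-encoder approach of the main paper, the following is a minimal numpy sketch of the classical fixed-dictionary (EDMD-style) route that several of the listed Koopman works build on: the lifting functions are chosen a priori (here, hypothetical monomials for a two-dimensional state) and the finite-dimensional Koopman matrix is fitted by least squares. This a priori dictionary choice is precisely the step that the encoder-based method above circumvents; the dictionary, system, and data below are illustrative assumptions.

```python
# Sketch of extended dynamic mode decomposition (EDMD) with a fixed,
# a priori chosen lifting dictionary; purely illustrative.
import numpy as np

def dictionary(x):
    # Assumed lifting functions: monomials up to degree 2 for a 2-D state.
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2], axis=-1)

def edmd_koopman_matrix(X, X_next):
    # X, X_next: (N, 2) snapshot pairs sampled from the nonlinear system.
    Psi, Psi_next = dictionary(X), dictionary(X_next)
    # Least-squares fit of the lifted linear map: Psi_next ~ Psi @ K_ls
    K_ls, *_ = np.linalg.lstsq(Psi, Psi_next, rcond=None)
    # Acting on lifted column vectors: psi(x_next) ~ K @ psi(x)
    return K_ls.T

# Example usage with toy snapshot data (purely illustrative):
X = np.random.randn(500, 2)
X_next = X + 0.01 * np.stack([X[:, 1], -np.sin(X[:, 0])], axis=-1)  # toy pendulum step
K = edmd_koopman_matrix(X, X_next)
```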