Learning the Koopman Eigendecomposition: A Diffeomorphic Approach
- URL: http://arxiv.org/abs/2110.07786v1
- Date: Fri, 15 Oct 2021 00:47:21 GMT
- Title: Learning the Koopman Eigendecomposition: A Diffeomorphic Approach
- Authors: Petar Bevanda, Johannes Kirmayr, Stefan Sosnowski, Sandra Hirche
- Abstract summary: We present a novel data-driven approach for learning linear representations of a class of stable nonlinear systems using Koopman eigenfunctions.
To the best of our knowledge, this is the first work to close the gap between the operator, system and learning theories.
- Score: 7.309026600178573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel data-driven approach for learning linear representations
of a class of stable nonlinear systems using Koopman eigenfunctions. By
learning the conjugacy map between a nonlinear system and its Jacobian
linearization through a Normalizing Flow one can guarantee the learned function
is a diffeomorphism. Using this diffeomorphism, we construct eigenfunctions of
the nonlinear system via the spectral equivalence of conjugate systems -
allowing the construction of linear predictors for nonlinear systems. The
universality of the diffeomorphism learner leads to the universal approximation
of the nonlinear system's Koopman eigenfunctions. The developed method is also
safe as it guarantees the model is asymptotically stable regardless of the
representation accuracy. To the best of our knowledge, this is the first work to close
the gap between the operator, system and learning theories. The efficacy of our
approach is shown through simulation examples.
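As a concrete illustration of the construction above (a minimal sketch, not the paper's implementation): for a classic polynomial toy system the conjugacy to the Jacobian linearization is known in closed form, so the analytic map h below stands in for the learned normalizing flow, and Koopman eigenfunctions of the nonlinear system are obtained by composing the linearization's eigenfunctions with h. All variable names and parameter values are illustrative assumptions.

```python
# Sketch: Koopman eigenfunctions of a nonlinear system from a conjugacy
# (diffeomorphism) to its Jacobian linearization.  Here h is known analytically
# for a toy system; in the paper it would be a learned normalizing flow.
import numpy as np
from scipy.integrate import solve_ivp

mu, lam = -0.05, -1.0                      # stable toy system: x1' = mu*x1, x2' = lam*(x2 - x1^2)
c = lam / (lam - 2.0 * mu)                 # coefficient that makes h a conjugacy (lam != 2*mu)

def f(t, x):                               # nonlinear vector field
    return [mu * x[0], lam * (x[1] - x[0] ** 2)]

def h(x):                                  # diffeomorphism onto the linearization y' = diag(mu, lam) y
    return np.array([x[0], x[1] - c * x[0] ** 2])

# Eigenfunctions of the linearization are the coordinates y1, y2 (eigenvalues mu, lam);
# composing them with h gives eigenfunctions of the nonlinear system.
phi = h

x0 = np.array([1.0, 0.5])
T = 2.0
xT = solve_ivp(f, (0.0, T), x0, rtol=1e-10, atol=1e-12).y[:, -1]

# Koopman eigenfunction property: phi_i(x(T)) = exp(lambda_i * T) * phi_i(x(0))
print(phi(xT))
print(np.exp(np.array([mu, lam]) * T) * phi(x0))   # matches up to integration tolerance
```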
Related papers
- Kernel Operator-Theoretic Bayesian Filter for Nonlinear Dynamical Systems [25.922732994397485]
We propose a machine-learning alternative based on a functional Bayesian perspective for operator-theoretic modeling.
This formulation is carried out directly in an infinite-dimensional space of linear operators, or a Hilbert space with the universal approximation property.
We demonstrate that this practical approach can obtain accurate results and outperform finite-dimensional Koopman decomposition.
arXiv Detail & Related papers (2024-10-31T20:31:31Z) - Koopman-based Deep Learning for Nonlinear System Estimation [1.3791394805787949]
We present a novel data-driven linear estimator based on Koopman operator theory to extract meaningful finite-dimensional representations of complex non-linear systems.
Our estimator is also adaptive to a diffeomorphic transformation of the estimated nonlinear system, which enables it to compute optimal state estimates without re-learning.
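A generic note on why such a lifted linear model helps with estimation (a hypothetical sketch, not the estimator proposed in that paper): once the dynamics are expressed as z_{k+1} = A z_k with measurements y_k = C z_k in lifted coordinates z = psi(x), standard linear tools such as a Kalman filter apply directly. The matrices A, C, Q, R below are assumed placeholders.

```python
# One Kalman predict/update step in lifted (approximately linear) Koopman coordinates.
import numpy as np

def kalman_step(z_est, P, y, A, C, Q, R):
    """Propagate the estimate through z+ = A z and correct it with y = C z + noise."""
    z_pred = A @ z_est                       # predict through the lifted linear model
    P_pred = A @ P @ A.T + Q
    S = C @ P_pred @ C.T + R                 # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    z_new = z_pred + K @ (y - C @ z_pred)    # measurement update
    P_new = (np.eye(len(z_new)) - K @ C) @ P_pred
    return z_new, P_new
```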
arXiv Detail & Related papers (2024-05-01T16:49:54Z) - Deep Learning for Structure-Preserving Universal Stable Koopman-Inspired Embeddings for Nonlinear Canonical Hamiltonian Dynamics [9.599029891108229]
We focus on the identification of global linearized embeddings for canonical nonlinear Hamiltonian systems through a symplectic transformation.
To overcome the shortcomings of Koopman operators for systems with continuous spectra, we apply the lifting principle and learn global cubicized embeddings.
We demonstrate the capabilities of deep learning in acquiring compact symplectic coordinate transformation and the corresponding simple dynamical models.
arXiv Detail & Related papers (2023-08-26T09:58:09Z) - Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects such as identifiability and the properties of statistical estimation remain obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
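A minimal sketch of the estimation setting described above (illustrative only, not the paper's estimator or its identifiability condition): for x' = A x sampled every dt seconds without noise, consecutive samples satisfy x_{k+1} = expm(A*dt) x_k, so A can be recovered from a least-squares one-step transition matrix via a matrix logarithm; identifiability caveats such as eigenvalue aliasing beyond pi/dt are exactly what such conditions address.

```python
import numpy as np
from scipy.linalg import expm, logm

A_true = np.array([[-0.3, 1.0], [-1.0, -0.3]])   # assumed toy system
dt, n = 0.1, 200
X = np.zeros((2, n))
X[:, 0] = [1.0, 0.0]
M_true = expm(A_true * dt)                       # exact one-step transition matrix
for k in range(n - 1):
    X[:, k + 1] = M_true @ X[:, k]               # equally-spaced, error-free samples

M_hat = X[:, 1:] @ np.linalg.pinv(X[:, :-1])     # least-squares transition estimate
A_hat = logm(M_hat).real / dt                    # continuous-time generator
print(np.max(np.abs(A_hat - A_true)))            # ~ numerical precision for noise-free data
```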
arXiv Detail & Related papers (2022-10-12T06:46:38Z) - Exploring Linear Feature Disentanglement For Neural Networks [63.20827189693117]
Non-linear activation functions, e.g., Sigmoid, ReLU, and Tanh, have achieved great success in neural networks (NNs).
Because of the complex non-linear characteristics of samples, the objective of these activation functions is to project samples from their original feature space into a linearly separable feature space.
This phenomenon ignites our interest in exploring whether all features need to be transformed by all non-linear functions in current typical NNs.
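A tiny sanity check of the linear-separability claim (purely illustrative, not taken from that paper): the XOR points are not linearly separable in the input space, but after a fixed ReLU hidden layer, using the standard textbook XOR construction, a single linear functional separates the two classes.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
labels = np.array([0, 1, 1, 0])                 # XOR: not linearly separable in input space

W = np.array([[1.0, 1.0], [1.0, 1.0]])          # fixed hidden layer (textbook XOR weights)
b = np.array([0.0, -1.0])
H = np.maximum(X @ W + b, 0.0)                  # ReLU hidden representations

scores = H @ np.array([1.0, -2.0])              # a single linear functional in hidden space
print(H)                                        # (0,0), (1,0), (1,0), (2,1)
print((scores > 0.5).astype(int) == labels)     # all True: now linearly separable
```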
arXiv Detail & Related papers (2022-03-22T13:09:17Z) - KoopmanizingFlows: Diffeomorphically Learning Stable Koopman Operators [7.447933533434023]
We propose a novel framework for constructing linear time-invariant (LTI) models for a class of stable nonlinear dynamics.
We learn the Koopman operator features without assuming a predefined library of functions or knowing the spectrum.
We demonstrate the superior efficacy of the proposed method in comparison to a state-of-the-art method on the well-known LASA handwriting dataset.
arXiv Detail & Related papers (2021-12-08T02:40:40Z) - Learning Stable Koopman Embeddings [9.239657838690228]
We present a new data-driven method for learning stable models of nonlinear systems.
We prove that every discrete-time nonlinear contracting model can be learnt in our framework.
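The stability-by-construction idea can be illustrated with a generic parameterization (a sketch under assumed notation, not the parameterization used in that paper): rescaling an unconstrained matrix so its spectral norm stays strictly below one bounds the spectral radius, so the lifted dynamics z_{k+1} = A z_k are asymptotically stable no matter what the learner outputs.

```python
import numpy as np

def stable_by_construction(V, margin=1e-3):
    """Map an arbitrary square matrix V to A with spectral norm <= 1 - margin."""
    sigma_max = np.linalg.norm(V, 2)             # largest singular value
    return V * (1.0 - margin) / max(sigma_max, 1.0 - margin)

rng = np.random.default_rng(0)
A = stable_by_construction(rng.standard_normal((5, 5)))
print(np.max(np.abs(np.linalg.eigvals(A))))      # spectral radius < 1 for any input V
```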
arXiv Detail & Related papers (2021-10-13T05:44:13Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns the basis functions of such a representation using supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - Estimating Koopman operators for nonlinear dynamical systems: a nonparametric approach [77.77696851397539]
The Koopman operator is a mathematical tool that allows for a linear description of non-linear systems.
In this paper we capture their core essence as a dual version of the same framework, incorporating them into the Kernel framework.
We establish a strong link between kernel methods and Koopman operators, leading to the estimation of the latter through Kernel functions.
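A bare-bones kernel-EDMD-style sketch of that kernel-Koopman link (generic and illustrative, not necessarily the estimator of that paper): Gram matrices over snapshot pairs yield a finite matrix whose leading eigenvalues approximate Koopman eigenvalues; a degree-2 polynomial kernel and a linear toy map make the check easy, since the induced feature space is invariant.

```python
import numpy as np

def poly2(A, B):
    """Degree-2 polynomial kernel k(a, b) = (1 + a.b)^2, evaluated pairwise."""
    return (1.0 + A @ B.T) ** 2

def kernel_koopman_matrix(X, Y):
    """X, Y: (n, d) snapshot pairs with Y = F(X)."""
    G = poly2(X, X)                              # G_ij = k(x_i, x_j)
    A = poly2(Y, X)                              # A_ij = k(y_i, x_j)
    return np.linalg.pinv(G, rcond=1e-10) @ A

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
M = np.array([[0.9, 0.1], [0.0, 0.8]])
Y = X @ M.T                                      # toy linear dynamics x+ = M x
K = kernel_koopman_matrix(X, Y)
lams = np.sort(np.abs(np.linalg.eigvals(K)))[::-1]
print(lams[:6])   # ~ 1, 0.9, 0.81, 0.8, 0.72, 0.64 (map eigenvalues and their products)
```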
arXiv Detail & Related papers (2021-03-25T11:08:26Z) - Non-parametric Models for Non-negative Functions [48.7576911714538]
We provide the first model for non-negative functions that retains the favorable properties of linear models.
We prove that it admits a representer theorem and provide an efficient dual formulation for convex problems.
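In the spirit of the summary above (a hypothetical mini-example with a made-up feature map and data, not the paper's exact construction): writing f(x) = phi(x)^T M phi(x) with M positive semidefinite keeps the model linear in its parameter M while making it non-negative by construction.

```python
import numpy as np

def features(x, centers, gamma=2.0):
    """Gaussian feature map phi(x) against fixed centers (illustrative choice)."""
    return np.exp(-gamma * (x[:, None] - centers[None, :]) ** 2)

rng = np.random.default_rng(0)
centers = np.linspace(-1.0, 1.0, 8)
B = rng.standard_normal((8, 8))
M = B @ B.T                                      # positive semidefinite by construction

x = np.linspace(-2.0, 2.0, 5)
Phi = features(x, centers)                       # shape (5, 8)
f_vals = np.einsum('ij,jk,ik->i', Phi, M, Phi)   # f(x) = phi(x)^T M phi(x)
print(f_vals.min())                              # >= 0 up to floating-point rounding
```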
arXiv Detail & Related papers (2020-07-08T07:17:28Z) - Learning Bijective Feature Maps for Linear ICA [73.85904548374575]
We show that existing probabilistic deep generative models (DGMs), which are tailor-made for image data, underperform on non-linear ICA tasks.
To address this, we propose a DGM which combines bijective feature maps with a linear ICA model to learn interpretable latent structures for high-dimensional data.
We create models that converge quickly, are easy to train, and achieve better unsupervised latent factor discovery than flow-based models, linear ICA, and Variational Autoencoders on images.
arXiv Detail & Related papers (2020-02-18T17:58:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.