Learning Dynamical Systems via Koopman Operator Regression in
Reproducing Kernel Hilbert Spaces
- URL: http://arxiv.org/abs/2205.14027v1
- Date: Fri, 27 May 2022 14:57:48 GMT
- Title: Learning Dynamical Systems via Koopman Operator Regression in
Reproducing Kernel Hilbert Spaces
- Authors: Vladimir Kostic, Pietro Novelli, Andreas Maurer, Carlo Ciliberto,
Lorenzo Rosasco, Massimiliano Pontil
- Abstract summary: We formalize a framework to learn the Koopman operator from finite data trajectories of the dynamical system.
We link the risk with the estimation of the spectral decomposition of the Koopman operator.
Our results suggest RRR might be beneficial over other widely used estimators.
- Score: 52.35063796758121
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study a class of dynamical systems modelled as Markov chains that admit an
invariant distribution via the corresponding transfer, or Koopman, operator.
While data-driven algorithms to reconstruct such operators are well known,
their relationship with statistical learning is largely unexplored. We
formalize a framework to learn the Koopman operator from finite data
trajectories of the dynamical system. We consider the restriction of this
operator to a reproducing kernel Hilbert space and introduce a notion of risk,
from which different estimators naturally arise. We link the risk with the
estimation of the spectral decomposition of the Koopman operator. These
observations motivate a reduced-rank operator regression (RRR) estimator. We
derive learning bounds for the proposed estimator, holding both in i.i.d. and
non-i.i.d. settings, the latter in terms of mixing coefficients. Our results
suggest RRR might be beneficial over other widely used estimators, as confirmed
in numerical experiments on both forecasting and mode decomposition.
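As a hedged illustration of how such an estimator can look in practice (a minimal sketch, not the authors' implementation: the random Fourier features, the toy rotation dynamics, and all parameter choices below are assumptions), reduced-rank regression in an explicit feature space reduces to an ordinary least-squares fit followed by an Eckart-Young projection of the fitted values:

```python
import numpy as np

def rff_features(X, W, b):
    # Random Fourier features: an explicit finite-dimensional
    # stand-in for the RKHS feature map (assumed, for illustration).
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def reduced_rank_regression(PhiX, PhiY, rank, reg=1e-6):
    d = PhiX.shape[1]
    # Ridge-regularized ordinary least-squares solution.
    A_ols = np.linalg.solve(PhiX.T @ PhiX + reg * np.eye(d), PhiX.T @ PhiY)
    # Project the fitted values onto their top-`rank` right singular
    # directions (Eckart-Young), yielding the rank-constrained minimizer.
    U, s, Vt = np.linalg.svd(PhiX @ A_ols, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]
    return A_ols @ P  # rank <= `rank` estimate of the Koopman matrix

rng = np.random.default_rng(0)
# Toy dynamical system: a noisy rotation in the plane.
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = rng.normal(size=(500, 2))
Y = X @ R.T + 0.01 * rng.normal(size=(500, 2))

D = 100
W = rng.normal(size=(2, D))
b = rng.uniform(0, 2 * np.pi, size=D)
PhiX, PhiY = rff_features(X, W, b), rff_features(Y, W, b)
A = reduced_rank_regression(PhiX, PhiY, rank=5)
# Eigenvalues of A approximate leading Koopman eigenvalues.
eigvals = np.linalg.eigvals(A)
```

The eigenvalues of the rank-constrained matrix `A` then serve as finite-sample approximations of the leading Koopman eigenvalues, which is what connects the regression risk to the spectral decomposition discussed in the abstract.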
Related papers
- Koopman operators with intrinsic observables in rigged reproducing kernel Hilbert spaces [16.00267662259167]
This paper presents a novel approach for estimating the Koopman operator defined on a reproducing kernel Hilbert space (RKHS) and its spectra.
We propose an estimation method, which we call Jet Dynamic Mode Decomposition (JetDMD), that leverages the intrinsic structure of RKHS and the geometric notion known as jets.
This method refines the traditional Extended Dynamic Mode Decomposition (EDMD) in accuracy, especially in the numerical estimation of eigenvalues.
arXiv Detail & Related papers (2024-03-04T22:28:20Z)
- Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z)
- Provable Guarantees for Generative Behavior Cloning: Bridging Low-Level Stability and High-Level Behavior [51.60683890503293]
We propose a theoretical framework for studying behavior cloning of complex expert demonstrations using generative modeling.
We show that pure supervised cloning can generate trajectories matching the per-time step distribution of arbitrary expert trajectories.
arXiv Detail & Related papers (2023-07-27T04:27:26Z)
- Estimating Koopman operators with sketching to provably learn large scale dynamical systems [37.18243295790146]
The theory of Koopman operators makes it possible to deploy non-parametric machine learning algorithms to predict and analyze complex dynamical systems.
We boost the efficiency of different kernel-based Koopman operator estimators using random projections.
We establish non-asymptotic error bounds giving a sharp characterization of the trade-offs between statistical learning rates and computational efficiency.
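For intuition only (a Nyström-style sketch under assumed choices of kernel, landmark count, and toy data, not the paper's exact estimator), random projections replace the full n-by-n Gram matrix with low-dimensional features whose inner products approximate it:

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def nystrom_features(X, landmarks, sigma=1.0, reg=1e-10):
    # Nystrom features Phi with Phi @ Phi.T ~= full Gram matrix.
    Kml = gaussian_kernel(landmarks, landmarks, sigma)
    C = gaussian_kernel(X, landmarks, sigma)
    # Inverse square root of the landmark Gram matrix via eigh.
    w, V = np.linalg.eigh(Kml + reg * np.eye(len(landmarks)))
    inv_sqrt = V @ np.diag(1.0 / np.sqrt(np.maximum(w, reg))) @ V.T
    return C @ inv_sqrt

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
# 30 landmark points sampled uniformly from the data (assumed).
idx = rng.choice(len(X), size=30, replace=False)
Phi = nystrom_features(X, X[idx])
K_approx = Phi @ Phi.T
K_exact = gaussian_kernel(X, X)
err = np.linalg.norm(K_approx - K_exact) / np.linalg.norm(K_exact)
```

A kernel-based Koopman estimator can then be fit in the reduced feature space, trading a controlled approximation error for much lower computational cost.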
arXiv Detail & Related papers (2023-06-07T15:30:03Z)
- Understanding Augmentation-based Self-Supervised Representation Learning via RKHS Approximation and Regression [53.15502562048627]
Recent work has built the connection between self-supervised learning and the approximation of the top eigenspace of a graph Laplacian operator.
This work delves into a statistical analysis of augmentation-based pretraining.
arXiv Detail & Related papers (2023-06-01T15:18:55Z)
- Koopman Kernel Regression [6.116741319526748]
We show that Koopman operator theory offers a beneficial paradigm for characterizing forecasts via linear time-invariant (LTI) ODEs.
We derive a universal Koopman-invariant reproducing kernel Hilbert space (RKHS) that solely spans transformations into LTI dynamical systems.
Our experiments demonstrate superior forecasting performance compared to Koopman operator and sequential data predictors.
arXiv Detail & Related papers (2023-05-25T16:22:22Z)
- Sharp Spectral Rates for Koopman Operator Learning [27.820383937933034]
We present for the first time non-asymptotic learning bounds for the Koopman eigenvalues and eigenfunctions.
Our results shed new light on the emergence of spurious eigenvalues.
arXiv Detail & Related papers (2023-02-03T21:19:56Z)
- Marginalized Operators for Off-policy Reinforcement Learning [53.37381513736073]
Marginalized operators strictly generalize generic multi-step operators, such as Retrace, which arise as special cases.
We show that the estimates for marginalized operators can be computed in a scalable way, which also generalizes prior results on marginalized importance sampling as special cases.
arXiv Detail & Related papers (2022-03-30T09:59:59Z)
- Convergence Rates for Learning Linear Operators from Noisy Data [6.4423565043274795]
We study the inverse problem of learning a linear operator on a space from its noisy pointwise evaluations on random input data.
We establish posterior contraction rates with respect to a family of Bochner norms as the number of data tends to infinity, together with lower bounds on the estimation error.
These convergence rates highlight and quantify the difficulty of learning unbounded linear operators in comparison with the learning of bounded or compact ones.
arXiv Detail & Related papers (2021-08-27T22:09:53Z)
- Robust Compressed Sensing using Generative Models [98.64228459705859]
In this paper we propose an algorithm inspired by the Median-of-Means (MOM) approach.
Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers.
arXiv Detail & Related papers (2020-06-16T19:07:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.