Koopman Kernel Regression
- URL: http://arxiv.org/abs/2305.16215v3
- Date: Tue, 16 Jan 2024 15:02:57 GMT
- Title: Koopman Kernel Regression
- Authors: Petar Bevanda, Max Beier, Armin Lederer, Stefan Sosnowski, Eyke Hüllermeier, Sandra Hirche
- Abstract summary: We show that Koopman operator theory offers a beneficial paradigm for characterizing forecasts via linear time-invariant (LTI) ODEs.
We derive a universal Koopman-invariant reproducing kernel Hilbert space (RKHS) that solely spans transformations into LTI dynamical systems.
Our experiments demonstrate superior forecasting performance compared to Koopman operator and sequential data predictors.
- Score: 6.116741319526748
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many machine learning approaches for decision making, such as reinforcement
learning, rely on simulators or predictive models to forecast the
time-evolution of quantities of interest, e.g., the state of an agent or the
reward of a policy. Forecasts of such complex phenomena are commonly described
by highly nonlinear dynamical systems, making their use in optimization-based
decision-making challenging. Koopman operator theory offers a beneficial
paradigm for addressing this problem by characterizing forecasts via linear
time-invariant (LTI) ODEs, turning multi-step forecasts into sparse matrix
multiplication. Though there exists a variety of learning approaches, they
usually lack crucial learning-theoretic guarantees, making the behavior of the
obtained models with increasing data and dimensionality unclear. We address the
aforementioned by deriving a universal Koopman-invariant reproducing kernel
Hilbert space (RKHS) that solely spans transformations into LTI dynamical
systems. The resulting Koopman Kernel Regression (KKR) framework enables the
use of statistical learning tools from function approximation for novel
convergence results and generalization error bounds under weaker assumptions
than existing work. Our experiments demonstrate superior forecasting
performance compared to Koopman operator and sequential data predictors in
RKHS.
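The core mechanism of the abstract, lifting a nonlinear system into a space where it evolves linearly so that multi-step forecasts reduce to repeated matrix multiplication, can be illustrated with a minimal sketch. This is not the paper's KKR framework; the dictionary of observables and the toy system below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's KKR): lift states into a feature
# space, fit a linear time-invariant operator by least squares, and roll
# out multi-step forecasts with repeated matrix multiplication.

def lift(x):
    # Hypothetical dictionary of observables: [x, x^2, 1].
    return np.array([x, x**2, 1.0])

# Trajectory of a toy nonlinear system x_{t+1} = 0.9 x_t - 0.1 x_t^2.
xs = [1.0]
for _ in range(50):
    xs.append(0.9 * xs[-1] - 0.1 * xs[-1] ** 2)

Z = np.stack([lift(x) for x in xs[:-1]])      # lifted states at time t
Z_next = np.stack([lift(x) for x in xs[1:]])  # lifted states at time t+1

# Least-squares fit of a finite-dimensional linear (Koopman-style) operator A.
A, *_ = np.linalg.lstsq(Z, Z_next, rcond=None)

# Multi-step forecast: repeatedly apply A in the lifted space, then read
# off the first coordinate (the state itself is one of the observables).
z = lift(xs[0])
preds = []
for _ in range(10):
    z = z @ A
    preds.append(z[0])
```

The forecast loop never re-evaluates the nonlinear dynamics: all prediction happens through the single matrix `A`, which is what makes such models convenient inside optimization-based decision making.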
Related papers
- Koopman-Equivariant Gaussian Processes [39.34668284375732]
We propose a family of Gaussian processes (GP) for dynamical systems with linear time-invariant responses.
This linearity allows us to tractably quantify forecasting and representational uncertainty.
Experiments demonstrate on-par and often better forecasting performance compared to kernel-based methods for learning dynamical systems.
arXiv Detail & Related papers (2025-02-10T16:35:08Z)
- Koopman Theory-Inspired Method for Learning Time Advancement Operators in Unstable Flame Front Evolution [0.2812395851874055]
This study introduces Koopman-inspired Fourier Neural Operators (kFNO) and Convolutional Neural Networks (kCNN) to learn solution advancement operators for flame front instabilities.
By transforming data into a high-dimensional latent space, these models achieve more accurate multi-step predictions compared to traditional methods.
arXiv Detail & Related papers (2024-12-11T14:47:19Z)
- Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call Koopman Invertible Autoencoders (KIA).
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important for forecasting nonstationary processes or processes governed by a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate the resulting tessellation and approximate the multiple hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- Estimating Koopman operators with sketching to provably learn large scale dynamical systems [37.18243295790146]
The theory of Koopman operators makes it possible to deploy non-parametric machine learning algorithms to predict and analyze complex dynamical systems.
We boost the efficiency of different kernel-based Koopman operator estimators using random projections.
We establish non-asymptotic error bounds giving a sharp characterization of the trade-offs between statistical learning rates and computational efficiency.
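The sketching idea behind this estimator, replacing an exact kernel computation with a low-dimensional random projection, can be sketched with random Fourier features, a standard random projection for the Gaussian kernel. This is a generic illustration under stated assumptions, not the paper's specific Koopman estimator; the data and dimensions are made up.

```python
import numpy as np

# Minimal sketch: accelerate a kernel ridge regression with random Fourier
# features, a random projection whose inner products approximate the
# Gaussian kernel exp(-||x - y||^2 / 2).
rng = np.random.default_rng(0)

n, d, D = 200, 1, 50                    # samples, input dim, random features
X = rng.uniform(-3, 3, (n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

W = rng.standard_normal((d, D))         # random frequencies
b = rng.uniform(0, 2 * np.pi, D)        # random phases

def features(X):
    # phi(x) @ phi(y) approximates the Gaussian kernel k(x, y).
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

Phi = features(X)
lam = 1e-3
# Ridge regression in the sketched space: O(n D^2) work instead of the
# O(n^3) cost of solving with the full n-by-n kernel matrix.
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ y)

X_test = np.linspace(-3, 3, 100)[:, None]
pred = features(X_test) @ w
```

The statistical-versus-computational trade-off mentioned in the summary shows up here as the choice of `D`: fewer random features mean cheaper solves but a coarser kernel approximation.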
arXiv Detail & Related papers (2023-06-07T15:30:03Z)
- Understanding Augmentation-based Self-Supervised Representation Learning via RKHS Approximation and Regression [53.15502562048627]
Recent work has built the connection between self-supervised learning and the approximation of the top eigenspace of a graph Laplacian operator.
This work delves into a statistical analysis of augmentation-based pretraining.
arXiv Detail & Related papers (2023-06-01T15:18:55Z)
- Sharp Spectral Rates for Koopman Operator Learning [27.820383937933034]
We present for the first time non-asymptotic learning bounds for the Koopman eigenvalues and eigenfunctions.
Our results shed new light on the emergence of spurious eigenvalues.
arXiv Detail & Related papers (2023-02-03T21:19:56Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
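Modeling observed dynamics as a linear system, the core of dynamic mode decomposition (DMD), can be sketched on a toy near-periodic signal. This is plain, unforced DMD with a delay embedding, an illustrative simplification of the stochastically forced ensemble method described above; the signal and embedding dimension are assumptions.

```python
import numpy as np

# Hedged sketch of plain dynamic mode decomposition: fit a best-fit linear
# operator between time-shifted snapshot matrices and forecast by
# iterating it.
t = np.arange(300)
signal = np.sin(0.2 * t) + 0.5 * np.cos(0.05 * t)  # near-periodic signal

# Delay-embed the scalar signal into 4 coordinates: two sinusoids satisfy
# a linear recurrence of order 4, so 4 delays suffice for a linear model.
X = np.stack([signal[i:296 + i] for i in range(4)])

X1, X2 = X[:, :-1], X[:, 1:]
A = X2 @ np.linalg.pinv(X1)  # best-fit linear operator: X2 ≈ A @ X1

# Forecast 20 steps ahead by repeatedly applying A to the last snapshot.
x = X[:, -1]
forecast = []
for _ in range(20):
    x = A @ x
    forecast.append(x[-1])
```

The interpretability and parsimony claimed in the summary come from `A` itself: its eigenvalues directly encode the oscillation frequencies and growth rates of the observed dynamics.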
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- CASTLE: Regularization via Auxiliary Causal Graph Discovery [89.74800176981842]
We introduce Causal Structure Learning (CASTLE) regularization and propose to regularize a neural network by jointly learning the causal relationships between variables.
CASTLE efficiently reconstructs only the features in the causal DAG that have a causal neighbor, whereas reconstruction-based regularizers suboptimally reconstruct all input features.
arXiv Detail & Related papers (2020-09-28T09:49:38Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the mechanisms underlying its success are still unclear.
We show that heavy-tailed fluctuations commonly arise in the optimized parameters due to multiplicative noise.
A detailed analysis of key factors, including step size and data, shows similar behavior across state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.