Kernel Operator-Theoretic Bayesian Filter for Nonlinear Dynamical Systems
- URL: http://arxiv.org/abs/2411.00198v1
- Date: Thu, 31 Oct 2024 20:31:31 GMT
- Title: Kernel Operator-Theoretic Bayesian Filter for Nonlinear Dynamical Systems
- Authors: Kan Li, José C. Príncipe
- Abstract summary: We propose a machine-learning alternative based on a functional Bayesian perspective for operator-theoretic modeling.
This formulation is carried out directly in an infinite-dimensional Hilbert space of linear operators with the universal approximation property.
We demonstrate that this practical approach can obtain accurate results and outperform finite-dimensional Koopman decomposition.
- Score: 25.922732994397485
- License:
- Abstract: Motivated by the surge of interest in Koopman operator theory, we propose a machine-learning alternative based on a functional Bayesian perspective for operator-theoretic modeling of unknown, data-driven, nonlinear dynamical systems. This formulation is carried out directly in an infinite-dimensional Hilbert space of linear operators with the universal approximation property. The theory of reproducing kernel Hilbert spaces (RKHS) allows the lifting of nonlinear dynamics to a potentially infinite-dimensional space via linear embeddings, where a general nonlinear function is represented as a set of linear functions or operators in the functional space. This allows us to apply classical linear Bayesian methods such as the Kalman filter directly in the Hilbert space, yielding nonlinear solutions in the original input space. This kernel perspective on the Koopman operator offers two compelling advantages. First, the Hilbert space can be constructed deterministically, agnostic to the nonlinear dynamics. The Gaussian kernel is universal, uniformly approximating an arbitrary continuous target function over any compact domain. Second, the Bayesian filter is an adaptive, linear minimum-variance algorithm, allowing the system to update the Koopman operator and continuously track changes over an extended period of time, ideally suited for modern data-driven applications such as real-time machine learning using streaming data. In this paper, we present several practical implementations to obtain a finite-dimensional approximation of the functional Bayesian filter (FBF). Due to the rapid decay of the Gaussian kernel, excellent approximation is obtained with a small dimension. We demonstrate that this practical approach can obtain accurate results and outperform finite-dimensional Koopman decomposition.
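The construction described in the abstract can be illustrated with a minimal sketch: lift the state through random Fourier features (a standard finite-dimensional approximation of the Gaussian kernel, consistent with the paper's rapid-decay argument), then track the finite-dimensional Koopman matrix with a recursive minimum-variance (Kalman-style) update. The logistic-map dynamics, the feature dimension, and all parameter values below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy system (not from the paper): a noisy logistic map.
def step(x):
    return 3.7 * x * (1.0 - x)

# Random Fourier features approximating the Gaussian kernel
# k(x, y) = exp(-gamma (x - y)^2) ~= lift(x) @ lift(y).
D = 50                       # small dimension suffices due to rapid kernel decay
gamma = 5.0                  # assumed kernel bandwidth parameter
W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=D)
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def lift(x):
    return np.sqrt(2.0 / D) * np.cos(W * x + b)

# Recursive least squares (a linear minimum-variance update) tracking the
# finite-dimensional Koopman matrix A with lift(x_{t+1}) ~= A @ lift(x_t).
A = np.zeros((D, D))
P = 10.0 * np.eye(D)         # feature-space covariance
lam = 1e-3                   # assumed measurement-noise level

x = 0.3
for _ in range(500):
    x_next = step(x) + 0.001 * rng.normal()
    z, z_next = lift(x), lift(x_next)
    g = P @ z / (lam + z @ P @ z)        # Kalman-style gain
    A += np.outer(z_next - A @ z, g)     # rank-one operator update
    P -= np.outer(g, z @ P)              # covariance downdate
    x = x_next

# Check the one-step prediction in the lifted space at a held-out point.
x_test = 0.5
err = np.linalg.norm(A @ lift(x_test) - lift(step(x_test)))
print(err)
```

Because the update is recursive, the operator estimate keeps adapting as new samples stream in, which is the tracking property the abstract emphasizes.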
Related papers
- Highly Adaptive Ridge [84.38107748875144]
We propose a regression method that achieves a $n^{-2/3}$ dimension-free $L^2$ convergence rate in the class of right-continuous functions with square-integrable sectional derivatives.
HAR is exactly kernel ridge regression with a specific data-adaptive kernel based on a saturated zero-order tensor-product spline basis expansion.
We demonstrate empirical performance better than state-of-the-art algorithms for small datasets in particular.
arXiv Detail & Related papers (2024-10-03T17:06:06Z)
- Uncertainty Modelling and Robust Observer Synthesis using the Koopman Operator [5.317624228510749]
The Koopman operator allows nonlinear systems to be rewritten as infinite-dimensional linear systems.
A finite-dimensional approximation of the Koopman operator can be identified directly from data.
A population of several dozen motor drives is used to experimentally demonstrate the proposed method.
arXiv Detail & Related papers (2024-10-01T20:31:18Z)
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space for modeling functions learned by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- Minimax Optimal Kernel Operator Learning via Multilevel Training [11.36492861074981]
We study the statistical limit of learning a Hilbert-Schmidt operator between two infinite-dimensional Sobolev reproducing kernel Hilbert spaces.
We develop a multilevel kernel operator learning algorithm that is optimal when learning linear operators between infinite-dimensional function spaces.
arXiv Detail & Related papers (2022-09-28T21:31:43Z)
- Experimental Design for Linear Functionals in Reproducing Kernel Hilbert Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z)
- Adjoint-aided inference of Gaussian process driven differential equations [0.8257490175399691]
We show how the adjoint of a linear system can be used to efficiently infer forcing functions modelled as GPs.
We demonstrate the approach on systems of both ordinary and partial differential equations.
arXiv Detail & Related papers (2022-02-09T17:35:14Z)
- Learning the Koopman Eigendecomposition: A Diffeomorphic Approach [7.309026600178573]
We present a novel data-driven approach for learning linear representations of a class of stable nonlinear systems using Koopman eigenfunctions.
To the best of our knowledge, this is the first work to close the gap between operator, system, and learning theories.
arXiv Detail & Related papers (2021-10-15T00:47:21Z)
- Estimating Koopman operators for nonlinear dynamical systems: a nonparametric approach [77.77696851397539]
The Koopman operator is a mathematical tool that allows for a linear description of non-linear systems.
In this paper, we capture their core essence as a dual version of the same framework and incorporate them into the kernel framework.
We establish a strong link between kernel methods and Koopman operators, leading to the estimation of the latter through kernel functions.
arXiv Detail & Related papers (2021-03-25T11:08:26Z)
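The finite-dimensional Koopman approximation that recurs through the entries above (and that the main paper compares against) can be sketched with a minimal dictionary-based EDMD example: regress the lifted next state onto the lifted current state to obtain a linear Koopman matrix. The toy dynamics, the polynomial dictionary, and the sample sizes are illustrative assumptions, not taken from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D nonlinear map (illustrative only).
def f(x):
    return np.sin(x)

# Snapshot pairs (x, f(x)) sampled over the domain.
X = rng.uniform(-2.0, 2.0, 200)
Y = f(X)

# Small polynomial dictionary of observables; kernel methods replace this
# fixed dictionary with implicitly defined features.
def psi(x):
    x = np.atleast_1d(x)
    return np.vstack([np.ones_like(x), x, x**2, x**3])   # shape (4, n)

Px, Py = psi(X), psi(Y)

# Least-squares Koopman matrix K with psi(f(x)) ~= K @ psi(x).
K = Py @ Px.T @ np.linalg.pinv(Px @ Px.T)

# The row for the linear observable recovers f at a test point approximately.
pred = (K @ psi(0.5))[1, 0]
print(pred)
```

The quality of this approximation is limited by the fixed dictionary, which is the gap the kernel and functional-Bayesian constructions above are designed to close.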
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.