Koopman Methods for Estimation of Animal Motions over Unknown, Regularly
Embedded Submanifolds
- URL: http://arxiv.org/abs/2203.05646v1
- Date: Thu, 10 Mar 2022 21:20:19 GMT
- Authors: Nathan Powell, Bowei Liu, and Andrew J. Kurdila
- Abstract summary: This paper introduces a method to estimate forward kinematics from the unknown configuration submanifold $Q$ to an $n$-dimensional Euclidean space $Y := \mathbb{R}^n$ of observations.
We show that the derived rates of convergence can be applied to estimates generated by the extended dynamic mode decomposition (EDMD) method.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces a data-dependent approximation of the forward
kinematics map for certain types of animal motion models. It is assumed that
motions are supported on a low-dimensional, unknown configuration manifold $Q$
that is regularly embedded in high dimensional Euclidean space
$X:=\mathbb{R}^d$. This paper introduces a method to estimate forward
kinematics from the unknown configuration submanifold $Q$ to an $n$-dimensional
Euclidean space $Y:=\mathbb{R}^n$ of observations. A known reproducing kernel
Hilbert space (RKHS) is defined over the ambient space $X$ in terms of a known
kernel function, and computations are performed using the known kernel defined
on the ambient space $X$. Estimates are constructed using a certain
data-dependent approximation of the Koopman operator defined in terms of the
known kernel on $X$. However, the rate of convergence of approximations is
studied in the space of restrictions to the unknown manifold $Q$. Strong rates
of convergence are derived in terms of the fill distance of samples in the
unknown configuration manifold, provided that a novel regularity result holds
for the Koopman operator. Additionally, we show that the derived rates of
convergence can be applied in some cases to estimates generated by the extended
dynamic mode decomposition (EDMD) method. We illustrate characteristics of the
estimates for simulated data as well as samples collected during motion capture
experiments.
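
The abstract's pipeline (a known kernel on the ambient space $X$, a data-dependent Koopman approximation, eigenvalues studied on the unknown submanifold) can be sketched numerically. The following is a minimal kernel-EDMD sketch, not the authors' implementation: the Gaussian kernel, the regularization, and the toy rotation-on-a-circle dynamics are all illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def kernel_edmd(X, Xnext, reg=1e-6, sigma=1.0):
    """Data-dependent Koopman approximation from snapshot pairs (x_i, x_i')
    using only kernel evaluations in the ambient space."""
    G = gaussian_kernel(X, X, sigma)      # Gram matrix on current states
    A = gaussian_kernel(Xnext, X, sigma)  # cross-Gram with advanced states
    # Tikhonov-regularized least squares: U approximates G^{-1} A
    U = np.linalg.solve(G + reg * np.eye(len(X)), A)
    return U

# Toy example in the spirit of the abstract: samples on a circle (a 1-D
# submanifold of the ambient space R^2) advanced by a fixed rotation.
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)]
Xnext = np.c_[np.cos(theta + 0.1), np.sin(theta + 0.1)]
U = kernel_edmd(X, Xnext)
# For a measure-preserving rotation, the resolved Koopman eigenvalues
# should lie on (or just inside) the unit circle.
eigvals = np.linalg.eigvals(U)
```

Note that all computations use the ambient-space kernel only; the circle itself never has to be parameterized, which mirrors the paper's setting of an unknown, regularly embedded configuration manifold.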
Related papers
- Riemannian Langevin Dynamics: Strong Convergence of Geometric Euler-Maruyama Scheme [51.56484100374058]
Low-dimensional structure in real-world data plays an important role in the success of generative models. We prove a convergence theory of numerical schemes for manifold-valued differential equations.
arXiv Detail & Related papers (2026-03-04T01:29:35Z) - Second quantization for classical nonlinear dynamics [0.0]
We propose a framework for representing the evolution of observables of measure-preserving ergodic flows through infinite-dimensional rotation systems on tori.
We show that their Banach algebra spectra, $\sigma(F_w(\mathcal{H}_\tau))$, decompose into a family of tori of potentially infinite dimension.
Our scheme also employs a procedure for representing observables of the original system by reproducing functions on finite-dimensional tori in $\sigma(F_w(\mathcal{H}_\tau))$ of arbitrarily large degree.
arXiv Detail & Related papers (2025-01-13T15:36:53Z) - Tensor network approximation of Koopman operators [0.0]
We propose a framework for approximating the evolution of observables of measure-preserving ergodic systems.
Our approach is based on a spectrally-convergent approximation of the skew-adjoint Koopman generator.
A key feature of this quantum-inspired approximation is that it captures information from a tensor product space of dimension $(2d+1)^n$.
arXiv Detail & Related papers (2024-07-09T21:40:14Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Sampling and estimation on manifolds using the Langevin diffusion [45.57801520690309]
Two estimators of linear functionals of $\mu_\phi$ based on the discretized Markov process are considered.
Error bounds are derived for sampling and estimation using a discretization of an intrinsically defined Langevin diffusion.
arXiv Detail & Related papers (2023-12-22T18:01:11Z) - Data-driven discovery with Limited Data Acquisition for fluid flow
across cylinder [0.0]
We use a variant of Kernelized Extended DMD (KeDMD) based on the Koopman operator to recover the dominant Koopman modes for the standard experiment of fluid flow across a cylinder.
It turns out that the traditional Gaussian radial basis function kernel is not able to generate the desired Koopman modes when KeDMD is executed with limited data acquisition.
The Laplacian kernel successfully generates the desired Koopman modes when only a limited number of data-set snapshots is provided.
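
One way to see why kernel choice matters under limited data is to compare the numerical rank of the two Gram matrices: the Laplacian kernel is less smooth than the Gaussian, so its Gram spectrum decays more slowly and more kernel-DMD modes remain numerically resolvable. The sketch below is an illustrative assumption, not the cited paper's experiment; the sample data and tolerance are arbitrary.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def laplacian_kernel(A, B, sigma=1.0):
    # exp(-||a - b|| / sigma): non-smooth at the diagonal, hence a
    # slower-decaying Gram spectrum than the Gaussian kernel.
    d = np.sqrt(((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
    return np.exp(-d / sigma)

def effective_rank(K, tol=1e-8):
    # Number of singular values above a relative tolerance.
    s = np.linalg.svd(K, compute_uv=False)
    return int((s / s[0] > tol).sum())

rng = np.random.default_rng(0)
X = rng.standard_normal((15, 2))  # "limited data": only a few snapshots

rg = effective_rank(gaussian_kernel(X, X))   # Gaussian Gram rank
rl = effective_rank(laplacian_kernel(X, X))  # Laplacian Gram rank
```

With few snapshots, the Laplacian Gram matrix typically retains at least as many numerically significant directions as the Gaussian one, which is consistent with the abstract's observation.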
arXiv Detail & Related papers (2023-12-19T22:20:07Z) - Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios in which the response variable, denoted by $Y$, resides in a manifold, and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z) - The mpEDMD Algorithm for Data-Driven Computations of Measure-Preserving
Dynamical Systems [0.0]
We introduce measure-preserving extended dynamic mode decomposition (mpEDMD), the first truncation method whose eigendecomposition converges to the spectral quantities of Koopman operators.
mpEDMD is flexible and easy to use with any pre-existing DMD-type method and with different types of data.
arXiv Detail & Related papers (2022-09-06T06:37:54Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) have demonstrated remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Optimal policy evaluation using kernel-based temporal difference methods [78.83926562536791]
We use reproducing kernel Hilbert spaces for estimating the value function of an infinite-horizon discounted Markov reward process.
We derive a non-asymptotic upper bound on the error with explicit dependence on the eigenvalues of the associated kernel operator.
We prove minimax lower bounds over sub-classes of MRPs.
arXiv Detail & Related papers (2021-09-24T14:48:20Z) - Mean-Square Analysis with An Application to Optimal Dimension Dependence
of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
arXiv Detail & Related papers (2021-09-08T18:00:05Z) - Nonparametric approximation of conditional expectation operators [0.3655021726150368]
We investigate the approximation of the $L^2$-operator defined by $[Pf](x) := \mathbb{E}[f(Y) \mid X = x]$ under minimal assumptions.
We prove that $P$ can be arbitrarily well approximated in operator norm by Hilbert-Schmidt operators acting on a reproducing kernel space.
arXiv Detail & Related papers (2020-12-23T19:06:12Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for
Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.