How well behaved is finite dimensional Diffusion Maps?
- URL: http://arxiv.org/abs/2412.03992v1
- Date: Thu, 05 Dec 2024 09:12:25 GMT
- Title: How well behaved is finite dimensional Diffusion Maps?
- Authors: Wenyu Bo, Marina Meilă
- Abstract summary: We derive a series of geometric properties that remain valid after finite-dimensional and almost isometric Diffusion Maps (DM).
We establish rigorous bounds on the embedding errors introduced by the DM algorithm, of order $O\left(\left(\frac{\log n}{n}\right)^{\frac{1}{8d+16}}\right)$.
These results offer a solid theoretical foundation for understanding the performance and reliability of DM in practical applications.
- Abstract: Under a set of assumptions on a family of submanifolds $\subset {\mathbb R}^D$, we derive a series of geometric properties that remain valid after finite-dimensional and almost isometric Diffusion Maps (DM), including almost uniform density, finite polynomial approximation and local reach. Leveraging these properties, we establish rigorous bounds on the embedding errors introduced by the DM algorithm, of order $O\left(\left(\frac{\log n}{n}\right)^{\frac{1}{8d+16}}\right)$. These results offer a solid theoretical foundation for understanding the performance and reliability of DM in practical applications.
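As background for the embedding bounds above, the basic DM construction can be sketched numerically. This is a minimal illustration, assuming the standard $\alpha = 1$ density normalization; the function name and toy data are hypothetical and not taken from the paper:

```python
import numpy as np

def diffusion_maps(X, eps=1.0, d=2, t=1):
    """Minimal Diffusion Maps sketch (alpha = 1 normalization).

    X : (n, D) array of points sampled near a d-dimensional submanifold of R^D.
    Returns the (n, d) embedding given by the top nontrivial eigenvectors,
    scaled by the t-th power of their eigenvalues.
    """
    # Gaussian kernel on pairwise squared distances
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / eps)
    # alpha = 1 normalization removes the sampling-density bias
    q = K.sum(axis=1)
    K = K / np.outer(q, q)
    # row-normalize to a Markov transition matrix P
    P = K / K.sum(axis=1, keepdims=True)
    # eigendecomposition, eigenvalues sorted in descending order
    w, V = np.linalg.eig(P)
    idx = np.argsort(-w.real)
    w, V = w[idx].real, V[:, idx].real
    # drop the trivial constant eigenvector, scale by lambda^t
    return V[:, 1:d + 1] * (w[1:d + 1] ** t)

# toy usage: a noisy circle embedded in R^3
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta), 0.1 * rng.normal(size=200)])
Y = diffusion_maps(X, eps=0.5, d=2)
```

The error bound quoted above concerns how far such a finite-sample embedding can be from an isometry; the sketch here only shows the mechanics of the map itself.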
Related papers
- On lower bounds of the density of planar periodic sets without unit distances [55.2480439325792]
We introduce a novel approach to estimating $m_1(\mathbb{R}^2)$ by reformulating the problem as a Maximal Independent Set (MIS) problem on graphs constructed from the flat torus.
Our experimental results, supported by theoretical justification of the proposed method, demonstrate that for a sufficiently wide range of parameters this approach does not improve the known lower bound.
arXiv Detail & Related papers (2024-11-20T12:07:19Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Diffusion Maps: Using the Semigroup Property for Parameter Tuning [1.8782750537161608]
Diffusion maps (DM) are used to reduce data lying on or close to a low-dimensional manifold embedded in a much larger dimensional space.
We address the problem of setting a diffusion time t when constructing the diffusion kernel matrix by using the semigroup property of the diffusion operator.
Experiments show that this principled approach is effective and robust.
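The semigroup idea can be illustrated numerically: for a well-chosen scale, applying the one-step Markov operator twice should approximate the operator built at twice that scale. A minimal sketch, assuming a Gaussian kernel and a simple relative-norm mismatch as the diagnostic; the grid and error measure are illustrative, not the paper's exact procedure:

```python
import numpy as np

def markov_matrix(X, eps):
    # Gaussian kernel at scale eps, row-normalized to a Markov matrix
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / eps)
    return K / K.sum(axis=1, keepdims=True)

def semigroup_error(X, eps):
    # Semigroup property: the operator at scale 2*eps should be close
    # to the square of the operator at scale eps. The relative norm of
    # the mismatch is one diagnostic for choosing the scale.
    P1 = markov_matrix(X, eps)
    P2 = markov_matrix(X, 2 * eps)
    return np.linalg.norm(P2 - P1 @ P1) / np.linalg.norm(P2)

# scan a small grid of candidate scales on toy data
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
errs = {eps: semigroup_error(X, eps) for eps in (0.1, 0.5, 1.0, 2.0)}
best = min(errs, key=errs.get)
```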
arXiv Detail & Related papers (2022-03-06T03:02:24Z)
- Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations [0.0]
We construct deep operator networks (ONets) between infinite-dimensional spaces that emulate with an exponential rate of convergence the coefficient-to-solution map of elliptic second-order PDEs.
In particular, we consider problems set in $d$-dimensional periodic domains, $d=1,2,\dots$, and with analytic right-hand sides and coefficients.
We prove that the neural networks in the ONet have size $\mathcal{O}\left(\left|\log(\varepsilon)\right|^{\kappa}\right)$ for some $\kappa$.
arXiv Detail & Related papers (2021-12-15T13:56:28Z)
- Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
arXiv Detail & Related papers (2021-09-08T18:00:05Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Multiscale regression on unknown manifolds [13.752772802705978]
We construct low-dimensional coordinates on $\mathcal{M}$ at multiple scales and perform multiscale regression by local fitting.
We analyze the generalization error of our method by proving finite sample bounds in high probability on rich classes of priors.
Our algorithm has quasilinear complexity in the sample size, with constants linear in $D$ and exponential in $d$.
arXiv Detail & Related papers (2021-01-13T15:14:31Z)
- A deep network construction that adapts to intrinsic dimensionality beyond the domain [79.23797234241471]
We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation.
We focus on two intuitive and practically relevant choices for $\phi$: the projection onto a low-dimensional embedded submanifold and a distance to a collection of low-dimensional sets.
arXiv Detail & Related papers (2020-08-06T09:50:29Z)
- General state transitions with exact resource morphisms: a unified resource-theoretic approach [2.28438857884398]
We formulate conditions that guarantee the existence of an $\mathsf{F}$-morphism between two density matrices.
While we allow errors in the transition, the corresponding map is required to be an exact $\mathsf{F}$-morphism.
We show how, when specialized to some situations of physical interest, our general results are able to unify and extend previous analyses.
arXiv Detail & Related papers (2020-05-19T03:20:39Z)
- On Linear Stochastic Approximation: Fine-grained Polyak-Ruppert and Non-Asymptotic Concentration [115.1954841020189]
We study the asymptotic and non-asymptotic properties of stochastic approximation procedures with Polyak-Ruppert averaging.
We prove a central limit theorem (CLT) for the averaged iterates with fixed step size and number of iterations going to infinity.
arXiv Detail & Related papers (2020-04-09T17:54:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.