Blessing of Dimensionality for Approximating Sobolev Classes on Manifolds
- URL: http://arxiv.org/abs/2408.06996v1
- Date: Tue, 13 Aug 2024 15:56:42 GMT
- Title: Blessing of Dimensionality for Approximating Sobolev Classes on Manifolds
- Authors: Hong Ye Tan, Subhadip Mukherjee, Junqi Tang, Carola-Bibiane Schönlieb
- Abstract summary: The manifold hypothesis says that natural high-dimensional data is supported on or around a low-dimensional manifold.
Recent success of statistical and learning-based methods empirically supports this hypothesis.
We provide theoretical statistical complexity results, which directly relate to generalization properties.
- Score: 14.183849746284816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The manifold hypothesis says that natural high-dimensional data is actually supported on or around a low-dimensional manifold. The recent success of statistical and learning-based methods empirically supports this hypothesis, as these methods outperform classical statistical intuition in very high dimensions. A natural step for analysis is thus to assume the manifold hypothesis and derive bounds that are independent of any embedding space. Theoretical implications in this direction have recently been explored in terms of generalization of ReLU networks and convergence of Langevin methods. We complement existing results by providing theoretical statistical complexity results, which directly relate to generalization properties. In particular, we demonstrate that the statistical complexity required to approximate a class of bounded Sobolev functions on a compact manifold is bounded from below, and moreover that this bound depends only on the intrinsic properties of the manifold. These provide complementary bounds for existing approximation results for ReLU networks on manifolds, which give upper bounds on generalization capacity.
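The manifold hypothesis described in the abstract can be illustrated with a minimal numerical sketch: data that nominally lives in a 100-dimensional ambient space but is supported on a 1-dimensional manifold (a circle) embedded by a linear map. All names, dimensions, and the choice of manifold below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical illustration of the manifold hypothesis: points in a
# 100-dimensional ambient space supported on a 1-dimensional manifold
# (the unit circle), embedded via a fixed linear map.
rng = np.random.default_rng(0)

ambient_dim = 100   # dimension of the embedding space (illustrative)
n_samples = 500

# Intrinsic coordinates: the unit circle, parametrized by a single angle.
theta = rng.uniform(0.0, 2.0 * np.pi, size=n_samples)
intrinsic = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # shape (500, 2)

# Embed into the ambient space with a random linear map.
embedding = rng.standard_normal((2, ambient_dim))
data = intrinsic @ embedding                                  # shape (500, 100)

# The ambient dimension is 100, but the data matrix has rank at most 2:
# the statistical structure is entirely intrinsic to the low-dimensional
# manifold, independent of the embedding space -- the setting in which the
# paper's intrinsic lower bounds apply.
rank = np.linalg.matrix_rank(data)
print(data.shape, rank)
```

The point of the sketch is that any complexity measure computed from `data` is governed by the intrinsic geometry of the circle, not by the ambient dimension 100, which is the flavor of embedding-independent bound the abstract describes.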
Related papers
- A Likelihood Based Approach to Distribution Regression Using Conditional Deep Generative Models [6.647819824559201]
We study the large-sample properties of a likelihood-based approach for estimating conditional deep generative models.
Our results lead to the convergence rate of a sieve maximum likelihood estimator for estimating the conditional distribution.
arXiv Detail & Related papers (2024-10-02T20:46:21Z) - Finite-dimensional approximations of push-forwards on locally analytic functionals [5.787117733071417]
Our approach is to consider the push-forward on the space of locally analytic functionals, instead of directly handling the analytic map itself.
We establish a methodology enabling appropriate finite-dimensional approximation of the push-forward from finite discrete data.
arXiv Detail & Related papers (2024-04-16T17:53:59Z) - Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides in a manifold, and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z) - Algebraic and Statistical Properties of the Ordinary Least Squares Interpolator [3.4320157633663064]
We provide results for the minimum $\ell_2$-norm OLS interpolator.
We present statistical results such as an extension of the Gauss-Markov theorem.
We conduct simulations that further explore the properties of the OLS interpolator.
arXiv Detail & Related papers (2023-09-27T16:41:10Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis on approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z) - Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Instance-Dependent Generalization Bounds via Optimal Transport [51.71650746285469]
Existing generalization bounds fail to explain crucial factors that drive the generalization of modern neural networks.
We derive instance-dependent generalization bounds that depend on the local Lipschitz regularity of the learned prediction function in the data space.
We empirically analyze our generalization bounds for neural networks, showing that the bound values are meaningful and capture the effect of popular regularization methods during training.
arXiv Detail & Related papers (2022-11-02T16:39:42Z) - Statistical exploration of the Manifold Hypothesis [10.389701595098922]
The Manifold Hypothesis asserts that nominally high-dimensional data are in fact concentrated near a low-dimensional manifold, embedded in high-dimensional space.
We show that rich and sometimes intricate manifold structure in data can emerge from a generic and remarkably simple statistical model.
We derive procedures to discover and interpret the geometry of high-dimensional data, and explore hypotheses about the data generating mechanism.
arXiv Detail & Related papers (2022-08-24T17:00:16Z) - Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.