Spectral Algorithms on Manifolds through Diffusion
- URL: http://arxiv.org/abs/2403.03669v2
- Date: Thu, 7 Mar 2024 09:20:35 GMT
- Title: Spectral Algorithms on Manifolds through Diffusion
- Authors: Weichun Xia and Lei Shi
- Abstract summary: We study the convergence performance of spectral algorithms in Reproducing Kernel Hilbert Spaces (RKHSs).
We employ integral operator techniques to derive tight convergence upper bounds concerning generalized norms.
Our study confirms that the spectral algorithms are practically significant in the broader context of high-dimensional approximation.
- Score: 1.7227952883644062
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The existing research on spectral algorithms, applied within a Reproducing
Kernel Hilbert Space (RKHS), has primarily focused on general kernel functions,
often neglecting the inherent structure of the input feature space. Our paper
introduces a new perspective, asserting that input data are situated within a
low-dimensional manifold embedded in a higher-dimensional Euclidean space. We
study the convergence performance of spectral algorithms in the RKHSs generated
by heat kernels, known as diffusion spaces. Incorporating the manifold
structure of the input, we employ integral operator techniques to derive tight
convergence upper bounds with respect to generalized norms, which indicate that
the estimators converge to the target function in a strong sense, entailing the
simultaneous convergence of the function itself and its derivatives. These
bounds offer two significant advantages: firstly, they
are exclusively contingent on the intrinsic dimension of the input manifolds,
thereby providing a more focused analysis. Secondly, they enable the efficient
derivation of convergence rates for derivatives of any k-th order, all of which
can be accomplished within the ambit of the same spectral algorithms.
Furthermore, we establish minimax lower bounds to demonstrate the asymptotic
optimality of these conclusions in specific contexts. Our study confirms that
the spectral algorithms are practically significant in the broader context of
high-dimensional approximation.
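As an illustration of the setting the abstract describes, the following is a minimal sketch (not the authors' implementation) of a spectral algorithm fitted to data on a manifold. It uses a Gaussian kernel as an extrinsic stand-in for the intrinsic heat kernel studied in the paper, and the Tikhonov filter, which recovers kernel ridge regression as one member of the spectral-algorithm family; the bandwidth `t` and regularization `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

# Hedged sketch: spectral-algorithm regression with a Gaussian kernel,
# a common extrinsic proxy for the manifold heat kernel. A spectral
# algorithm applies a filter g_lambda to the spectrum of the empirical
# kernel integral operator; g_lambda(s) = 1/(s + lambda) is the Tikhonov
# filter, i.e., kernel ridge regression.

def gaussian_kernel(X, Y, t=0.05):
    """Gaussian (heat-type) kernel matrix exp(-|x - y|^2 / (4t))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (4 * t))

def spectral_regression(X, y, lam=1e-3, t=0.05):
    """Fit an estimator by spectral filtering of the kernel matrix."""
    n = len(X)
    K = gaussian_kernel(X, X, t) / n            # empirical integral operator
    evals, evecs = np.linalg.eigh(K)            # spectral decomposition
    g = 1.0 / (evals + lam)                     # Tikhonov filter g_lambda
    alpha = evecs @ (g * (evecs.T @ y)) / n     # filtered coefficients
    return lambda Xnew: gaussian_kernel(Xnew, X, t) @ alpha

# Toy example: noisy samples of a smooth function on the circle S^1, a
# one-dimensional manifold embedded in two-dimensional Euclidean space.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
y = np.sin(3 * theta) + 0.1 * rng.standard_normal(200)
f_hat = spectral_regression(X, y)
```

Swapping the filter `g` (e.g., for spectral cut-off or gradient-flow filters) yields the other members of the family; the paper's bounds concern such estimators when the kernel is the intrinsic heat kernel of the manifold.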
Related papers
- Diffusion-based Semi-supervised Spectral Algorithm for Regression on Manifolds [2.0649432688817444]
We introduce a novel diffusion-based spectral algorithm to tackle regression analysis on high-dimensional data.
Our method uses the local estimation property of the heat kernel, offering an adaptive, data-driven approach.
Our algorithm performs in an entirely data-driven manner, operating directly within the intrinsic manifold structure of the data.
arXiv Detail & Related papers (2024-10-18T15:29:04Z) - Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z) - The Galerkin method beats Graph-Based Approaches for Spectral Algorithms [3.5897534810405403]
We depart from the graph-based approaches favored in the machine learning community and prove the statistical and computational superiority of the Galerkin method.
We introduce implementation tricks to deal with differential operators in large dimensions with structured kernels.
We extend the core principles of our approach to non-linear spaces of functions, such as those parameterized by deep neural networks.
arXiv Detail & Related papers (2023-06-01T14:38:54Z) - Neural Set Function Extensions: Learning with Discrete Functions in High Dimensions [63.21838830509772]
We develop a framework for extending set functions onto low-dimensional continuous domains.
Our framework subsumes many well-known extensions as special cases.
We convert low-dimensional neural network bottlenecks into representations in high-dimensional spaces.
arXiv Detail & Related papers (2022-08-08T10:58:02Z) - Structural aspects of FRG in quantum tunnelling computations [68.8204255655161]
We probe both the unidimensional quartic harmonic oscillator and the double well potential.
Two partial differential equations for the potential $V_k(\varphi)$ and the wave function renormalization $Z_k(\varphi)$ are studied.
arXiv Detail & Related papers (2022-06-14T15:23:25Z) - Reinforcement Learning from Partial Observation: Linear Function Approximation with Provable Sample Efficiency [111.83670279016599]
We study reinforcement learning for partially observed decision processes (POMDPs) with infinite observation and state spaces.
We make the first attempt at partial observability and function approximation for a class of POMDPs with a linear structure.
arXiv Detail & Related papers (2022-04-20T21:15:38Z) - Convex Analysis of the Mean Field Langevin Dynamics [49.66486092259375]
A convergence rate analysis of the mean field Langevin dynamics is presented.
The proximal Gibbs distribution $p_q$ associated with the dynamics allows us to develop a convergence theory parallel to classical results in convex optimization.
arXiv Detail & Related papers (2022-01-25T17:13:56Z) - Kernel Interpolation of High Dimensional Scattered Data [22.857190042428922]
Data sites selected for modeling high-dimensional problems often appear irregularly scattered.
We propose and study a new framework for analyzing kernel interpolation of high-dimensional data, which features bounding the approximation error by the spectrum of the underlying kernel matrix.
arXiv Detail & Related papers (2020-09-03T08:34:00Z) - Intrinsic Gaussian Processes on Manifolds and Their Accelerations by Symmetry [9.773237080061815]
Existing methods primarily focus on low dimensional constrained domains for heat kernel estimation.
Our research proposes an intrinsic approach for constructing GPs on general manifolds.
Our methodology estimates the heat kernel by simulating Brownian motion sample paths using the exponential map.
arXiv Detail & Related papers (2020-06-25T09:17:40Z) - Fast Objective & Duality Gap Convergence for Non-Convex Strongly-Concave Min-Max Problems with PL Condition [52.08417569774822]
This paper focuses on methods for solving smooth non-convex strongly-concave min-max problems, which have received increasing attention due to applications in deep learning (e.g., deep AUC maximization).
arXiv Detail & Related papers (2020-06-12T00:32:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.