Lipschitz regularity of graph Laplacians on random data clouds
- URL: http://arxiv.org/abs/2007.06679v2
- Date: Wed, 20 Oct 2021 22:18:57 GMT
- Title: Lipschitz regularity of graph Laplacians on random data clouds
- Authors: Jeff Calder and Nicolas Garcia Trillos and Marta Lewicka
- Abstract summary: We prove high probability interior and global Lipschitz estimates for solutions of graph Poisson equations.
Our results can be used to show that graph Laplacian eigenvectors are, with high probability, essentially Lipschitz regular with constants depending explicitly on their corresponding eigenvalues.
- Score: 1.2891210250935146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we study Lipschitz regularity of elliptic PDEs on geometric
graphs, constructed from random data points. The data points are sampled from a
distribution supported on a smooth manifold. The family of equations that we
study arises in data analysis in the context of graph-based learning and
contains, as important examples, the equations satisfied by graph Laplacian
eigenvectors. In particular, we prove high probability interior and global
Lipschitz estimates for solutions of graph Poisson equations. Our results can
be used to show that graph Laplacian eigenvectors are, with high probability,
essentially Lipschitz regular with constants depending explicitly on their
corresponding eigenvalues. Our analysis relies on a probabilistic coupling
argument of suitable random walks at the continuum level, and an interpolation
method for extending functions on random point clouds to the continuum
manifold. As a byproduct of our general regularity results, we obtain high
probability $L^\infty$ and approximate $\mathcal{C}^{0,1}$ convergence rates
for the convergence of graph Laplacian eigenvectors towards eigenfunctions of
the corresponding weighted Laplace-Beltrami operators. The convergence rates we
obtain scale like the $L^2$-convergence rates established by two of the authors
in previous work.
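The object of study can be illustrated with a minimal sketch: build an $\varepsilon$-graph Laplacian on a random sample from a manifold (here the unit circle) and compute its low-lying eigenvectors, whose regularity and convergence the paper analyzes. The hard-threshold kernel, scaling, and parameter values below are illustrative choices of ours, not the paper's exact setup.

```python
import numpy as np

def graph_laplacian(X, eps):
    # Pairwise squared distances between the sample points
    D2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    # Hard-threshold (epsilon-ball) kernel weights; other kernels work too
    W = (D2 < eps**2).astype(float)
    np.fill_diagonal(W, 0.0)
    deg = W.sum(axis=1)
    n = X.shape[0]
    # Unnormalized Laplacian with a generic epsilon-graph scaling
    # (the exact constant depends on the kernel and is omitted here)
    return (np.diag(deg) - W) / (n * eps**2)

rng = np.random.default_rng(0)
n = 500
angle = rng.uniform(0.0, 2.0 * np.pi, n)
X = np.column_stack([np.cos(angle), np.sin(angle)])  # random data cloud on the unit circle
L = graph_laplacian(X, eps=0.3)
evals, evecs = np.linalg.eigh(L)  # ascending; evals[0] ~ 0, eigenvector ~ constant
```

The smallest eigenvalue is (numerically) zero with a constant eigenvector, and the next eigenvectors approximate the first Laplace-Beltrami eigenfunctions on the circle; the paper's results bound the Lipschitz constants of such eigenvectors with high probability.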
Related papers
- Convergence rates for Poisson learning to a Poisson equation with measure data [2.7961972519572447]
We prove discrete to continuum convergence rates for Poisson Learning, a graph-based semi-supervised learning algorithm.
We show how to regularize the graph Poisson equation via mollification with the graph heat kernel.
We obtain convergence rates that scale, up to logarithmic factors, like $O(\varepsilon^{\frac{1}{d+2}})$ for general data distributions, and $O(\varepsilon^{\frac{2-\sigma}{d+4}})$ for uniformly distributed data.
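A graph Poisson equation of the kind analyzed in Poisson learning can be sketched as follows; the point cloud, kernel, and source term are our own toy choices, not the algorithm from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps = 200, 0.4
X = rng.uniform(-1.0, 1.0, size=(n, 2))         # random data cloud in a square

# epsilon-graph weights and unnormalized Laplacian
D2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
W = (D2 < eps**2).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

# A graph Poisson equation L u = f needs a mean-zero right-hand side;
# here f puts a unit source and a unit sink at two chosen points
f = np.zeros(n)
f[0], f[1] = 1.0, -1.0

# L is singular (constants span its kernel), so solve in the least-squares sense
u, *_ = np.linalg.lstsq(L, f, rcond=None)
u -= u.mean()                                   # pin down the additive constant
```

For a connected graph the mean-zero right-hand side lies in the range of L, so the least-squares solution solves the Poisson equation exactly up to numerical error.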
arXiv Detail & Related papers (2024-07-09T11:54:34Z)
- Bi-stochastically normalized graph Laplacian: convergence to manifold Laplacian and robustness to outlier noise [10.418647759223965]
Bi-stochastic normalization provides an alternative normalization of graph Laplacians in graph-based data analysis.
We prove convergence of the bi-stochastically normalized graph Laplacian to the manifold (weighted) Laplacian, with rates.
When the manifold data are corrupted by outlier noise, we prove point-wise consistency of the graph Laplacian.
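Bi-stochastic normalization itself can be computed by symmetric Sinkhorn scaling of a kernel matrix. This is a generic sketch under our own parameter choices (a wide Gaussian kernel keeps the scaling well conditioned), not the paper's algorithm or its rates:

```python
import numpy as np

def sinkhorn_bistochastic(K, n_iter=1000):
    """Symmetric Sinkhorn scaling: find d > 0 so that diag(d) @ K @ diag(d)
    is (approximately) doubly stochastic."""
    d = np.ones(K.shape[0])
    for _ in range(n_iter):
        # damped fixed-point update; at convergence d_i * (K @ d)_i = 1
        d = np.sqrt(d / (K @ d))
    return d[:, None] * K * d[None, :]

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))                       # toy point cloud
D2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
K = np.exp(-D2 / 10.0)                              # Gaussian kernel affinities
P = sinkhorn_bistochastic(K)                        # symmetric, doubly stochastic
```

The bi-stochastically normalized Laplacian is then built from P in place of the usual degree-normalized weight matrix.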
arXiv Detail & Related papers (2022-06-22T21:08:24Z)
- A Law of Robustness beyond Isoperimetry [84.33752026418045]
We prove a Lipschitzness lower bound of $\Omega(\sqrt{n/p})$ for the robustness of interpolating neural network parameters on arbitrary distributions.
We then show the potential benefit of overparametrization for smooth data when $n=\mathrm{poly}(d)$.
We disprove the potential existence of an $O(1)$-Lipschitz robust interpolating function when $n=\exp(\omega(d))$.
arXiv Detail & Related papers (2022-02-23T16:10:23Z)
- Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z)
- Learning Sparse Graph with Minimax Concave Penalty under Gaussian Markov Random Fields [51.07460861448716]
This paper presents a convex-analytic framework for learning sparse graphs from data.
We show that a triangular convexity decomposition is guaranteed by a suitable transform of its upper part.
arXiv Detail & Related papers (2021-09-17T17:46:12Z)
- Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
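The degree-correction phenomenon can be illustrated on a tiny degree-corrected block model; all parameter values below are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
z = np.repeat([0, 1], n // 2)                 # two planted communities
deg_par = rng.uniform(0.5, 1.5, size=n)       # heterogeneous degree parameters
B = np.array([[0.50, 0.05],
              [0.05, 0.50]])                  # within/between connection rates
P = np.outer(deg_par, deg_par) * B[z][:, z]
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # symmetric adjacency, no self-loops

deg = A.sum(axis=1)
Lrw = A / deg[:, None]                        # random walk normalization D^{-1} A
evals, evecs = np.linalg.eig(Lrw)
order = np.argsort(-evals.real)
embedding = evecs[:, order[:2]].real          # top-2 random walk spectral embedding
```

The sign of the second embedding coordinate recovers the two communities despite the heterogeneous degrees, which is the degree-correction property the paper establishes.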
arXiv Detail & Related papers (2021-05-03T16:36:27Z)
- Eigen-convergence of Gaussian kernelized graph Laplacian by manifold heat interpolation [16.891059233061767]
We study the spectral convergence of the graph Laplacian to the Laplace-Beltrami operator.
Data are uniformly sampled on a $d$-dimensional manifold.
We prove new point-wise and Dirichlet form convergence rates for the density-corrected graph Laplacian.
arXiv Detail & Related papers (2021-01-25T03:22:18Z)
- Random Geometric Graphs on Euclidean Balls [2.28438857884398]
We consider a latent space model for random graphs where a node $i$ is associated to a random latent point $X_i$ on the Euclidean unit ball.
For certain link functions, the model considered here generates graphs whose degree distribution has power-law-type tails.
arXiv Detail & Related papers (2020-10-26T17:21:57Z)
- Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model [0.0]
We analyze the convergence of single-pass, fixed step-size gradient descent on the least-square risk under this model.
As a special case, we analyze an online algorithm for estimating a real function on the unit interval from the noiseless observation of its value at randomly sampled points.
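The noiseless-observation setting can be mimicked with a single-pass, fixed step-size SGD sketch. The truncated Fourier feature map, step size, and target function are hypothetical choices of ours, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(5)
f = lambda x: np.sin(2.0 * np.pi * x)      # target function on [0, 1]

def phi(x, m=20):
    # truncated Fourier feature map (a hypothetical finite-dimensional model)
    k = np.arange(1, m + 1)
    return np.concatenate(([1.0], np.cos(2*np.pi*k*x), np.sin(2*np.pi*k*x)))

w = np.zeros(41)
step = 0.05                                # fixed step; stability needs step * ||phi||^2 < 2
for _ in range(20000):                     # single pass over a stream of samples
    x = rng.uniform()                      # randomly sampled point
    feats = phi(x)
    err = feats @ w - f(x)                 # noiseless residual of the least-square risk
    w -= step * err * feats                # SGD update
```

Because the observations are noiseless and the target lies in the model span, fixed step-size SGD converges to an exact interpolant rather than a noise floor, which is the regime the paper's rates describe.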
arXiv Detail & Related papers (2020-06-15T08:25:50Z)
- A Random Matrix Analysis of Random Fourier Features: Beyond the Gaussian Kernel, a Precise Phase Transition, and the Corresponding Double Descent [85.77233010209368]
This article characterizes the exact asymptotics of random Fourier feature (RFF) regression, in the realistic setting where the number of data samples $n$, their dimension $p$, and the number of random features $N$ are all large and comparable.
This analysis also provides accurate estimates of training and test regression errors for large $n,p,N$.
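Random Fourier feature regression itself is easy to sketch; the Gaussian-kernel setup, dimensions, and ridge parameter below are illustrative assumptions, not the regime analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, N = 300, 5, 2000          # samples, data dimension, number of random features
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0])             # toy regression target

# Random Fourier features for the Gaussian kernel k(x, x') = exp(-|x - x'|^2 / 2):
# frequencies drawn from the kernel's spectral density N(0, I)
Wf = rng.normal(size=(p, N))
b = rng.uniform(0.0, 2.0 * np.pi, size=N)
Z = np.sqrt(2.0 / N) * np.cos(X @ Wf + b)

# Z @ Z.T is a Monte Carlo approximation of the kernel Gram matrix
K = np.exp(-np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1) / 2.0)
approx = Z @ Z.T

# Ridge regression in the random feature space
lam = 1e-2
w = np.linalg.solve(Z.T @ Z + lam * np.eye(N), Z.T @ y)
pred = Z @ w
```

The paper's random matrix analysis makes the train/test errors of exactly this kind of estimator precise when $n$, $p$, and $N$ grow proportionally.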
arXiv Detail & Related papers (2020-06-09T02:05:40Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.