Laplace Learning in Wasserstein Space
- URL: http://arxiv.org/abs/2511.13229v1
- Date: Mon, 17 Nov 2025 10:49:36 GMT
- Title: Laplace Learning in Wasserstein Space
- Authors: Mary Chriselda Antony Oliver, Michael Roberts, Carola-Bibiane Schönlieb, Matthew Thorpe
- Abstract summary: We assume the manifold hypothesis to investigate graph-based semi-supervised learning methods. In particular, we examine Laplace Learning in the Wasserstein space. We prove variational convergence of a discrete graph p-Dirichlet energy to its continuum counterpart.
- Score: 20.33446919989862
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The manifold hypothesis posits that high-dimensional data typically resides on low-dimensional subspaces. In this paper, we assume the manifold hypothesis to investigate graph-based semi-supervised learning methods. In particular, we examine Laplace Learning in the Wasserstein space, extending the classical notion of graph-based semi-supervised learning algorithms from finite-dimensional Euclidean spaces to an infinite-dimensional setting. To achieve this, we prove variational convergence of a discrete graph p-Dirichlet energy to its continuum counterpart. In addition, we characterize the Laplace-Beltrami operator on a submanifold of the Wasserstein space. Finally, we validate the proposed theoretical framework through numerical experiments conducted on benchmark datasets, demonstrating the consistency of our classification performance in high-dimensional settings.
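For intuition, the classical finite-dimensional Laplace Learning that the paper lifts to the Wasserstein space can be sketched in a few lines: labels propagate by minimizing the graph Dirichlet energy with p = 2, which amounts to solving the graph Laplacian system with the known labels as boundary conditions. The weight matrix and the toy path graph in the sketch below are illustrative choices, not taken from the paper.

```python
import numpy as np

def laplace_learning(W, labeled_idx, labels):
    """Classical (Euclidean) Laplace learning: minimize the p = 2 graph
    Dirichlet energy by solving L u = 0 on the unlabeled nodes, with u
    fixed to the given labels on the labeled nodes."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    # Block elimination: L_uu u_u = -L_ul g, where g holds the known labels
    L_uu = L[np.ix_(unlabeled, unlabeled)]
    L_ul = L[np.ix_(unlabeled, labeled_idx)]
    u = np.zeros(n)
    u[labeled_idx] = labels
    u[unlabeled] = np.linalg.solve(L_uu, -L_ul @ labels)
    return u

# Toy example: a path graph 0-1-2-3 with endpoints labeled 0 and 1;
# the harmonic extension interpolates linearly along the path.
W = np.zeros((4, 4))
for i in range(3):
    W[i, i + 1] = W[i + 1, i] = 1.0
u = laplace_learning(W, np.array([0, 3]), np.array([0.0, 1.0]))
```

On this toy path graph the solution is the linear interpolation [0, 1/3, 2/3, 1], which is the defining behavior of the harmonic (p = 2) label propagation that the paper generalizes.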
Related papers
- Infinite dimensional generative sensing [0.9749560288448113]
This work presents a rigorous framework for generative compressed sensing in Hilbert spaces. Thanks to a generalization of the Restricted Isometry Property, we show that stable recovery holds when the number of measurements is proportional to the prior's intrinsic dimension.
arXiv Detail & Related papers (2026-03-03T17:52:18Z) - Large Data Limits of Laplace Learning for Gaussian Measure Data in Infinite Dimensions [2.020917258669917]
Laplace learning is a method for finding missing labels in a partially labeled dataset. The absence of a Lebesgue measure on infinite-dimensional spaces complicates the analysis when the data are not finite-dimensional.
arXiv Detail & Related papers (2026-01-20T22:14:05Z) - Robust Tangent Space Estimation via Laplacian Eigenvector Gradient Orthogonalization [48.25304391127552]
Estimating the tangent spaces of a data manifold is a fundamental problem in data analysis. We propose a method, Laplacian Eigenvector Gradient Orthogonalization (LEGO), that utilizes the global structure of the data to guide local tangent space estimation.
arXiv Detail & Related papers (2025-10-02T17:59:45Z) - Towards Coordinate- and Dimension-Agnostic Machine Learning for Partial Differential Equations [8.62968609670716]
We employ a machine learning approach to predict the evolution of scalar field systems expressed in the formalism of exterior calculus. We show that the field dynamics learned in one space can be used to make accurate predictions in other spaces with different dimensions, coordinate systems, boundary conditions, and curvatures.
arXiv Detail & Related papers (2025-05-22T11:37:55Z) - Proper Latent Decomposition [4.266376725904727]
We compute a reduced set of intrinsic coordinates (a latent space) to accurately describe a flow with fewer degrees of freedom than the numerical discretization. Within this numerical framework, we propose an algorithm to perform Proper Latent Decomposition (PLD) on the manifold. This work opens opportunities for analyzing autoencoders and latent spaces, nonlinear reduced-order modeling, and scientific insights into the structure of high-dimensional data.
arXiv Detail & Related papers (2024-12-01T12:19:08Z) - Understanding and Mitigating Hyperbolic Dimensional Collapse in Graph Contrastive Learning [70.0681902472251]
We propose a novel contrastive learning framework to learn high-quality graph embeddings in hyperbolic space. Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information. We show that in hyperbolic space one has to address the leaf- and height-level uniformity related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z) - A Heat Diffusion Perspective on Geodesic Preserving Dimensionality Reduction [66.21060114843202]
We propose a more general heat kernel based manifold embedding method that we call heat geodesic embeddings.
Results show that our method outperforms the existing state of the art in preserving ground-truth manifold distances.
We also showcase our method on single cell RNA-sequencing datasets with both continuum and cluster structure.
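For intuition, a basic heat-kernel distance of the kind such heat-based embeddings build on can be sketched spectrally; the function below is a generic illustration, not the authors' heat geodesic embedding.

```python
import numpy as np

def heat_kernel_distance(W, t=1.0):
    """Pairwise heat-kernel distances on a graph:
    d_t(i, j)^2 = H_t(i, i) + H_t(j, j) - 2 H_t(i, j),
    where H_t = exp(-t L) is the heat kernel of the graph Laplacian L."""
    L = np.diag(W.sum(axis=1)) - W
    lam, V = np.linalg.eigh(L)              # spectral decomposition of L
    H = (V * np.exp(-t * lam)) @ V.T        # matrix exponential exp(-t L)
    d2 = np.diag(H)[:, None] + np.diag(H)[None, :] - 2 * H
    return np.sqrt(np.maximum(d2, 0.0))    # clip tiny negatives from rounding
```

On a path graph this distance is zero on the diagonal, symmetric, and grows with hop distance, which is the monotone behavior geodesic-preserving embeddings exploit.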
arXiv Detail & Related papers (2023-05-30T13:58:50Z) - Normalizing flows for lattice gauge theory in arbitrary space-time dimension [135.04925500053622]
Applications of normalizing flows to the sampling of field configurations in lattice gauge theory have so far been explored almost exclusively in two space-time dimensions.
We discuss masked autoregressive flows with tractable and unbiased Jacobian determinants, a key ingredient for scalable and exact flow-based sampling algorithms.
For concreteness, results from a proof-of-principle application to SU(3) gauge theory in four space-time dimensions are reported.
arXiv Detail & Related papers (2023-05-03T19:54:04Z) - Intrinsic dimension estimation for discrete metrics [65.5438227932088]
In this letter we introduce an algorithm to infer the intrinsic dimension (ID) of datasets embedded in discrete spaces.
We demonstrate its accuracy on benchmark datasets, and we apply it to analyze a metagenomic dataset for species fingerprinting.
This suggests that evolutionary pressure acts on a low-dimensional manifold despite the high dimensionality of sequence space.
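The paper's estimator targets discrete metrics, but the flavor of nearest-neighbor ID estimation can be illustrated with the standard TWO-NN estimator for continuous data; `two_nn_id` below is a hypothetical helper sketching that baseline, not the paper's algorithm.

```python
import numpy as np

def two_nn_id(X):
    """TWO-NN intrinsic-dimension estimate for continuous data: the ratio
    mu = r2/r1 of each point's second- to first-nearest-neighbor distance
    follows a Pareto law whose exponent is the intrinsic dimension."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)             # exclude self-distances
    r = np.sort(D, axis=1)[:, :2]           # r1, r2 for every point
    mu = r[:, 1] / r[:, 0]
    return 1.0 / np.mean(np.log(mu))        # maximum-likelihood estimate
```

On points sampled uniformly from a 2-D square the estimate should land near 2, regardless of the ambient dimension the points are embedded in.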
arXiv Detail & Related papers (2022-07-20T06:38:36Z) - A diffusion-map-based algorithm for gradient computation on manifolds and applications [0.0]
We recover the gradient of a given function defined on interior points of a Riemannian submanifold of Euclidean space. This approach is based on estimates of the Laplace-Beltrami operator proposed in diffusion-maps theory.
arXiv Detail & Related papers (2021-08-16T09:35:22Z) - Manifold learning with arbitrary norms [8.433233101044197]
We show in a numerical simulation that manifold learning based on Earthmover's distances outperforms the standard Euclidean variant for learning molecular shape spaces.
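On the real line, the Earthmover's (1-Wasserstein) distance that drives this variant has a closed form, which makes the idea easy to prototype: optimal transport with Euclidean cost matches sorted values. The helper below is an illustrative sketch for equal-size samples, not the authors' code.

```python
import numpy as np

def wasserstein_1d(x, y):
    """Earthmover's (1-Wasserstein) distance between two equal-size 1-D
    samples: on the line the optimal coupling matches order statistics,
    so W1 is the mean absolute difference of the sorted values."""
    x, y = np.sort(x), np.sort(y)
    return float(np.mean(np.abs(x - y)))
```

A pairwise matrix of such distances can then replace the Euclidean distance matrix in any kernel-based manifold learning pipeline, which is the substitution this line of work studies.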
arXiv Detail & Related papers (2020-12-28T10:24:30Z) - A Framework for Fluid Motion Estimation using a Constraint-Based Refinement Approach [0.0]
We formulate a general framework for fluid motion estimation using a constraint-based refinement approach.
We demonstrate that for a particular choice of constraint, our results closely approximate the classical continuity equation-based method for fluid flow.
We also observe a surprising connection to the Cauchy-Riemann operator that diagonalizes the system leading to a diffusive phenomenon involving the divergence and the curl of the flow.
arXiv Detail & Related papers (2020-11-24T18:23:39Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data. However, many popular methods can fail dramatically, even on simple two-dimensional manifolds. This paper presents an embedding method built on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.