Normalizing Diffusion Kernels with Optimal Transport
- URL: http://arxiv.org/abs/2507.06161v1
- Date: Tue, 08 Jul 2025 16:42:09 GMT
- Title: Normalizing Diffusion Kernels with Optimal Transport
- Authors: Nathan Kessler, Robin Magnet, Jean Feydy
- Abstract summary: We introduce a class of smoothing operators that inherit desirable properties from Laplacians. This construction enables Laplacian-like smoothing and processing of irregular data. We show that the resulting operators approximate heat diffusion but also retain spectral information from the Laplacian itself.
- Score: 4.081238502499229
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Smoothing a signal based on local neighborhoods is a core operation in machine learning and geometry processing. On well-structured domains such as vector spaces and manifolds, the Laplace operator derived from differential geometry offers a principled approach to smoothing via heat diffusion, with strong theoretical guarantees. However, constructing such Laplacians requires a carefully defined domain structure, which is not always available. Most practitioners thus rely on simple convolution kernels and message-passing layers, which are biased against the boundaries of the domain. We bridge this gap by introducing a broad class of smoothing operators, derived from general similarity or adjacency matrices, and demonstrate that they can be normalized into diffusion-like operators that inherit desirable properties from Laplacians. Our approach relies on a symmetric variant of the Sinkhorn algorithm, which rescales positive smoothing operators to match the structural behavior of heat diffusion. This construction enables Laplacian-like smoothing and processing of irregular data such as point clouds, sparse voxel grids or mixture of Gaussians. We show that the resulting operators not only approximate heat diffusion but also retain spectral information from the Laplacian itself, with applications to shape analysis and matching.
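As a concrete reference, here is a minimal NumPy sketch of the recipe the abstract describes: build a positive similarity kernel on a point cloud, then rescale it with a symmetric Sinkhorn iteration until it is doubly stochastic, i.e. mass-preserving like a heat-diffusion operator. The damped update rule, bandwidth, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def symmetric_sinkhorn(K, n_iter=200):
    """Find d > 0 such that diag(d) @ K @ diag(d) is doubly stochastic.

    Uses the damped fixed-point update d <- sqrt(d / (K d)); at the fixed
    point d * (K d) = 1, so every row and column of D K D sums to one.
    """
    d = np.ones(K.shape[0])
    for _ in range(n_iter):
        d = np.sqrt(d / (K @ d))
    return K * np.outer(d, d)

# Toy point cloud and a Gaussian similarity kernel (bandwidth is a free choice).
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
sq_dists = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / (2 * 0.5 ** 2))

P = symmetric_sinkhorn(K)
print(P.sum(axis=1)[:5])       # rows sum to ~1: smoothing preserves mass
smoothed = P @ (x[:, 0] ** 2)  # one diffusion-like smoothing step on a signal
```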
Related papers
- GMapLatent: Geometric Mapping in Latent Space [51.317738404571514]
Cross-domain generative models based on encoder-decoder AI architectures have attracted much attention in generating realistic images. We introduce a canonical latent space representation based on geometric mapping to align the cross-domain latent spaces in a rigorous and precise manner. Experiments on gray-scale and color images validate the efficiency, efficacy and applicability of GMapLatent.
arXiv Detail & Related papers (2025-03-30T12:02:36Z)
- Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
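As a toy illustration of the interpolation step described above (not the paper's actual algorithm or setting), the sketch below averages the current iterate with a plain gradient step on a convex quadratic; all names and constants are assumptions.

```python
import numpy as np

def interpolated_descent(grad, x0, lr=0.5, alpha=0.5, steps=200):
    """Gradient descent stabilized by linear interpolation toward T(x)."""
    x = x0
    for _ in range(steps):
        x_step = x - lr * grad(x)             # plain update T(x)
        x = (1 - alpha) * x + alpha * x_step  # interpolate between x and T(x)
    return x

A = np.array([[3.0, 0.0], [0.0, 1.0]])        # f(x) = 0.5 * x^T A x
x_min = interpolated_descent(lambda x: A @ x, np.array([1.0, -2.0]))
print(x_min)                                  # approximately [0, 0]
```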
arXiv Detail & Related papers (2023-10-20T12:45:12Z)
- Towards Training Without Depth Limits: Batch Normalization Without Gradient Explosion [83.90492831583997]
We show that a batch-normalized network can keep the optimal signal propagation properties, but avoid exploding gradients in depth.
We use a Multi-Layer Perceptron (MLP) with linear activations and batch-normalization that provably has bounded gradients at any depth.
We also design an activation shaping scheme that empirically achieves the same properties for certain non-linear activations.
arXiv Detail & Related papers (2023-10-03T12:35:02Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance than their extrinsic counterparts in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Kernelized Diffusion maps [2.817412580574242]
In this article, we build a different estimator of the Laplacian, via a reproducing kernel Hilbert space method.
We provide non-asymptotic statistical rates proving that the kernel estimator we build can circumvent the curse of dimensionality.
arXiv Detail & Related papers (2023-02-13T23:54:36Z)
- Intrinsic Bayesian Optimisation on Complex Constrained Domain [4.164327213986953]
Motivated by the success of Bayesian optimisation algorithms in Euclidean space, we propose a novel approach to construct Intrinsic Bayesian optimisation (In-BO).
Data may be collected in a spatial domain but restricted to a complex or intricately structured region corresponding to a geographic feature, such as a lake.
The efficiency of In-BO is demonstrated through simulation studies on a U-shaped domain, a bitten torus, and a real dataset from the Aral Sea.
arXiv Detail & Related papers (2023-01-29T23:28:08Z)
- Isotropic Gaussian Processes on Finite Spaces of Graphs [71.26737403006778]
We propose a principled way to define Gaussian process priors on various sets of unweighted graphs.
We go further to consider sets of equivalence classes of unweighted graphs and define the appropriate versions of priors thereon.
Inspired by applications in chemistry, we illustrate the proposed techniques on a real molecular property prediction task in the small data regime.
arXiv Detail & Related papers (2022-11-03T10:18:17Z)
- Capturing Graphs with Hypo-Elliptic Diffusions [7.704064306361941]
We show that the distribution of random walks evolves according to a diffusion equation defined using the graph Laplacian.
This results in a novel tensor-valued graph operator, which we call the hypo-elliptic graph Laplacian.
We show that this method competes with graph transformers on datasets requiring long-range reasoning but scales only linearly in the number of edges.
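For orientation, the classical scalar version of the diffusion equation mentioned above reads dx/dt = -Lx, with solution x(t) = exp(-tL) x(0); the paper's hypo-elliptic operator is a tensor-valued refinement of this picture. A small sketch of the scalar case (the graph and diffusion times are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

# Path graph on 5 nodes: L = D - A is the combinatorial graph Laplacian.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(A.sum(axis=1)) - A

x0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # unit mass on the first node
for t in (0.1, 1.0, 10.0):
    print(t, expm(-t * L) @ x0)           # mass spreads out; the total stays 1
```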
arXiv Detail & Related papers (2022-05-27T16:47:34Z)
- Outlier Detection for Trajectories via Flow-embeddings [2.66418345185993]
We propose a method to detect outliers in empirically observed trajectories on a discretized manifold modeled by a simplicial complex.
Our approach is similar to spectral embeddings such as diffusion maps and Laplacian eigenmaps, which construct embeddings from the eigenvectors of the graph Laplacian associated with low eigenvalues.
We show how this technique can single out trajectories that behave (topologically) differently from typical trajectories, and illustrate the performance of our approach with both synthetic and empirical data.
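A minimal sketch of the spectral-embedding step referred to above, in the plain graph (rather than simplicial-complex) setting; the random graph and embedding dimension are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = (rng.random((30, 30)) < 0.2).astype(float)
A = np.triu(A, 1)
A = A + A.T                       # random undirected graph (assumed connected)
L = np.diag(A.sum(axis=1)) - A    # combinatorial graph Laplacian

evals, evecs = np.linalg.eigh(L)  # eigenvalues in ascending order
embedding = evecs[:, 1:3]         # skip the constant eigenvector, keep the next two
print(embedding.shape)            # (30, 2): one 2-D point per node
```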
arXiv Detail & Related papers (2021-11-25T19:58:48Z)
- Local optimization on pure Gaussian state manifolds [63.76263875368856]
We exploit insights into the geometry of bosonic and fermionic Gaussian states to develop an efficient local optimization algorithm.
The method is based on notions of gradient descent attuned to the local geometry.
We use the presented methods to collect numerical and analytical evidence for the conjecture that Gaussian purifications are sufficient to compute the entanglement of purification of arbitrary mixed Gaussian states.
arXiv Detail & Related papers (2020-09-24T18:00:36Z)
- Spectral convergence of diffusion maps: improved error bounds and an alternative normalisation [0.6091702876917281]
This paper uses new approaches to improve the error bounds in the model case where the distribution is supported on a hypertorus.
We match long-standing pointwise error bounds for both the spectral data and the norm convergence of the operator discretisation.
We also introduce an alternative normalisation for diffusion maps based on Sinkhorn weights.
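For contrast with the Sinkhorn-based normalisation mentioned above (a symmetric rescaling that makes the kernel exactly doubly stochastic, as in the sketch after the main abstract), here is the classical diffusion-maps normalisation with alpha = 1, sketched with arbitrary choices of data and bandwidth:

```python
import numpy as np

def diffusion_maps_normalise(K):
    """Classical Coifman-Lafon normalisation (alpha = 1)."""
    q = K.sum(axis=1)                                    # kernel density estimate
    K_alpha = K / np.outer(q, q)                         # divide out the density
    return K_alpha / K_alpha.sum(axis=1, keepdims=True)  # row-stochastic operator

rng = np.random.default_rng(2)
x = rng.uniform(size=(100, 2))
K = np.exp(-((x[:, None, :] - x[None, :, :]) ** 2).sum(-1) / 0.1)
P = diffusion_maps_normalise(K)
print(P.sum(axis=1)[:3])  # rows sum to exactly 1 by construction
```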
arXiv Detail & Related papers (2020-06-03T04:23:43Z)