Repulsive Monte Carlo on the sphere for the sliced Wasserstein distance
- URL: http://arxiv.org/abs/2509.10166v1
- Date: Fri, 12 Sep 2025 11:48:11 GMT
- Title: Repulsive Monte Carlo on the sphere for the sliced Wasserstein distance
- Authors: Vladimir Petrovic, Rémi Bardenet, Agnès Desolneux
- Abstract summary: We consider the problem of computing the integral of a function on the unit sphere, in any dimension, using Monte Carlo methods. Our guiding thread is the sliced Wasserstein distance between two measures on $\mathbb{R}^d$, which is precisely an integral on the $d$-dimensional sphere.
- Score: 4.052758394413725
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we consider the problem of computing the integral of a function on the unit sphere, in any dimension, using Monte Carlo methods. Although the methods we present are general, our guiding thread is the sliced Wasserstein distance between two measures on $\mathbb{R}^d$, which is precisely an integral on the $d$-dimensional sphere. The sliced Wasserstein distance (SW) has gained momentum in machine learning either as a proxy to the less computationally tractable Wasserstein distance, or as a distance in its own right, due in particular to its built-in alleviation of the curse of dimensionality. There have been recent numerical benchmarks of quadratures for the sliced Wasserstein, and our viewpoint differs in that we concentrate on quadratures where the nodes are repulsive, i.e. negatively dependent. Indeed, negative dependence can bring variance reduction when the quadrature is adapted to the integration task. Our first contribution is to extract and motivate quadratures from the recent literature on determinantal point processes (DPPs) and repelled point processes, as well as repulsive quadratures from the literature specific to the sliced Wasserstein distance. We then numerically benchmark these quadratures. Moreover, we analyze the variance of the UnifOrtho estimator, an orthogonal Monte Carlo estimator. Our analysis sheds light on UnifOrtho's success for the estimation of the sliced Wasserstein in large dimensions, as well as counterexamples from the literature. Our final recommendation for the computation of the sliced Wasserstein distance is to use randomized quasi-Monte Carlo in low dimensions and \emph{UnifOrtho} in large dimensions. DPP-based quadratures only shine when quasi-Monte Carlo also does, while repelled quadratures show moderate variance reduction in general, but more theoretical effort is needed to make them robust.
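As a concrete illustration of the quadrature problem the abstract describes, the following is a minimal sketch of a plain Monte Carlo estimator of the sliced Wasserstein distance between two equal-size samples, together with a standard QR-of-Gaussian construction of mutually orthogonal uniform directions of the kind used by orthogonal Monte Carlo estimators such as UnifOrtho. The function names are ours, and the exact UnifOrtho construction in the paper may differ; this is a sketch under those assumptions, not the paper's implementation.

```python
import numpy as np


def sliced_wasserstein_mc(X, Y, n_projections=128, p=2, rng=None):
    """Plain Monte Carlo estimate of the p-sliced Wasserstein distance
    between two empirical measures given as equal-size samples X, Y of
    shape (n, d). Directions are drawn uniformly on the sphere S^{d-1}
    by normalizing Gaussian vectors."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    theta = rng.standard_normal((n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both samples onto each direction; for equal-weight 1D
    # empirical measures of the same size, W_p^p is the mean of the
    # p-th powers of the sorted pairwise differences.
    X_proj = np.sort(X @ theta.T, axis=0)  # shape (n, n_projections)
    Y_proj = np.sort(Y @ theta.T, axis=0)
    return np.mean(np.abs(X_proj - Y_proj) ** p) ** (1.0 / p)


def unif_ortho_directions(d, rng=None):
    """Draw d mutually orthogonal uniform directions: QR-decompose a
    Gaussian matrix and fix the signs so the rotation is Haar-distributed.
    The rows of the returned matrix are orthonormal directions."""
    rng = np.random.default_rng(rng)
    Q, R = np.linalg.qr(rng.standard_normal((d, d)))
    Q *= np.sign(np.diag(R))  # sign fix for Haar measure
    return Q.T
```

Orthogonal directions induce negative dependence between the projections, which is the variance-reduction mechanism the abstract attributes to repulsive quadratures.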
Related papers
- Slicing Wasserstein Over Wasserstein Via Functional Optimal Transport [2.649859884914447]
Wasserstein distances define a metric between probability measures on arbitrary metric spaces. Existing sliced WoW accelerations rely on parametric meta-measures or the existence of high-order moments. We show that DSW minimization is equivalent to WoW minimization for discretized meta-measures.
arXiv Detail & Related papers (2025-09-26T09:59:14Z) - Fast Estimation of Wasserstein Distances via Regression on Sliced Wasserstein Distances [70.94157767200342]
We propose a fast estimation method based on regressing Wasserstein distance on sliced Wasserstein distances. We show that accurate models can be learned from a small number of distribution pairs. Our method consistently provides a better approximation of Wasserstein distance than the state-of-the-art Wasserstein embedding model, Wasserstein Wormhole.
arXiv Detail & Related papers (2025-09-24T19:30:53Z) - Fast Approximation of the Generalized Sliced-Wasserstein Distance [13.840237470871406]
Generalized sliced Wasserstein distance is a variant of sliced Wasserstein distance.
We propose to form deterministic and fast approximations of generalized sliced Wasserstein distance.
arXiv Detail & Related papers (2022-10-19T03:15:50Z) - Centered plug-in estimation of Wasserstein distances [0.0]
The plug-in estimator of the squared Euclidean 2-Wasserstein distance is conservative; however, due to its large positive bias, it is often uninformative. We construct a pair of centered plug-in estimators that decrease with the true Wasserstein distance, and are therefore guaranteed to be informative, for any finite sample size.
arXiv Detail & Related papers (2022-03-22T11:26:18Z) - Mean-Square Analysis with An Application to Optimal Dimension Dependence
of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
arXiv Detail & Related papers (2021-09-08T18:00:05Z) - Learning High Dimensional Wasserstein Geodesics [55.086626708837635]
We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions.
By applying the method of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we derive a minimax problem whose saddle point is the Wasserstein geodesic.
We then parametrize the functions by deep neural networks and design a sample based bidirectional learning algorithm for training.
arXiv Detail & Related papers (2021-02-05T04:25:28Z) - Continuous Wasserstein-2 Barycenter Estimation without Minimax
Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
arXiv Detail & Related papers (2021-02-02T21:01:13Z) - Two-sample Test using Projected Wasserstein Distance [18.46110328123008]
We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning.
A key contribution is to couple optimal projection to find the low dimensional linear mapping to maximize the Wasserstein distance between projected probability distributions.
arXiv Detail & Related papers (2020-10-22T18:08:58Z) - On Projection Robust Optimal Transport: Sample Complexity and Model
Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - A diffusion approach to Stein's method on Riemannian manifolds [65.36007959755302]
We exploit the relationship between the generator of a diffusion on $\mathbf{M}$ with target invariant measure and its characterising Stein operator.
We derive Stein factors, which bound the solution to the Stein equation and its derivatives.
We show that the bounds for $\mathbb{R}^m$ remain valid when $\mathbf{M}$ is a flat manifold.
arXiv Detail & Related papers (2020-03-25T17:03:58Z) - Partial Optimal Transport with Applications on Positive-Unlabeled
Learning [12.504473943407092]
We propose exact algorithms to solve Wasserstein and Gromov-Wasserstein problems.
This is the first application of optimal transport in this context.
arXiv Detail & Related papers (2020-02-19T16:36:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.