Hilbert Curve Projection Distance for Distribution Comparison
- URL: http://arxiv.org/abs/2205.15059v4
- Date: Tue, 6 Feb 2024 10:58:38 GMT
- Title: Hilbert Curve Projection Distance for Distribution Comparison
- Authors: Tao Li, Cheng Meng, Hongteng Xu, Jun Yu
- Abstract summary: We propose a novel metric, called Hilbert curve projection (HCP) distance, to measure the distance between two probability distributions.
We show that HCP distance is a proper metric and is well-defined for probability measures with bounded supports.
Experiments on both synthetic and real-world data show that our HCP distance works as an effective surrogate of the Wasserstein distance with low complexity.
- Score: 34.8765820950517
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Distribution comparison plays a central role in many machine learning tasks
like data classification and generative modeling. In this study, we propose a
novel metric, called Hilbert curve projection (HCP) distance, to measure the
distance between two probability distributions with low complexity. In
particular, we first project two high-dimensional probability distributions
using Hilbert curve to obtain a coupling between them, and then calculate the
transport distance between these two distributions in the original space,
according to the coupling. We show that HCP distance is a proper metric and is
well-defined for probability measures with bounded supports. Furthermore, we
demonstrate that the modified empirical HCP distance with the $L_p$ cost in the
$d$-dimensional space converges to its population counterpart at a rate of no
more than $O(n^{-1/(2\max\{d,p\})})$. To suppress the curse of dimensionality, we
also develop two variants of the HCP distance using (learnable) subspace
projections. Experiments on both synthetic and real-world data show that our
HCP distance works as an effective surrogate of the Wasserstein distance with
low complexity and overcomes the drawbacks of the sliced Wasserstein distance.
Related papers
- Relative-Translation Invariant Wasserstein Distance [82.6068808353647]
We introduce a new family of distances, relative-translation invariant Wasserstein distances ($RW_p$)
We show that $RW_p$ distances are also real distance metrics defined on the quotient set $\mathcal{P}_p(\mathbb{R}^n)/\sim$, invariant to distribution translations.
arXiv Detail & Related papers (2024-09-04T03:41:44Z) - Computing the Distance between unbalanced Distributions -- The flat Metric [0.0]
The flat metric generalizes the well-known Wasserstein distance $W_1$ to the case that the distributions are of unequal total mass.
The core of the method is a neural network that determines an optimal test function realizing the distance between two measures.
arXiv Detail & Related papers (2023-08-02T09:30:22Z) - Energy-Based Sliced Wasserstein Distance [47.18652387199418]
A key component of the sliced Wasserstein (SW) distance is the slicing distribution.
We propose to design the slicing distribution as an energy-based distribution that is parameter-free.
We then derive a novel sliced Wasserstein metric, the energy-based sliced Wasserstein (EBSW) distance.
arXiv Detail & Related papers (2023-04-26T14:28:45Z) - Near-optimal estimation of smooth transport maps with kernel sums-of-squares [81.02564078640275]
Under smoothness conditions, the squared Wasserstein distance between two distributions could be efficiently computed with appealing statistical error upper bounds.
The object of interest for applications such as generative modeling is the underlying optimal transport map.
We propose the first tractable algorithm for which the statistical $L^2$ error on the maps nearly matches the existing minimax lower bounds for smooth map estimation.
arXiv Detail & Related papers (2021-12-03T13:45:36Z) - Depth-based pseudo-metrics between probability distributions [1.1470070927586016]
We propose two new pseudo-metrics between continuous probability measures based on data depth and its associated central regions.
In contrast to the Wasserstein distance, the proposed pseudo-metrics do not suffer from the curse of dimensionality.
The regions-based pseudo-metric appears to be robust w.r.t. both outliers and heavy tails.
arXiv Detail & Related papers (2021-03-23T17:33:18Z) - Two-sample Test using Projected Wasserstein Distance [18.46110328123008]
We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning.
A key contribution is to couple optimal projection to find the low dimensional linear mapping to maximize the Wasserstein distance between projected probability distributions.
arXiv Detail & Related papers (2020-10-22T18:08:58Z) - The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation [0.0]
Comparing metric measure spaces (i.e. a metric space endowed with a probability distribution) is at the heart of many machine learning problems.
The most popular distance between such spaces is the Gromov-Wasserstein (GW) distance, which is of a quadratic nature.
The GW formulation alleviates the comparison of metric spaces equipped with arbitrary positive measures up to isometries.
arXiv Detail & Related papers (2020-09-09T12:38:14Z) - Linear Optimal Transport Embedding: Provable Wasserstein classification for certain rigid transformations and perturbations [79.23797234241471]
Discriminating between distributions is an important problem in a number of scientific fields.
The Linear Optimal Transportation (LOT) embeds the space of distributions into an $L^2$-space.
We demonstrate the benefits of LOT on a number of distribution classification problems.
arXiv Detail & Related papers (2020-08-20T19:09:33Z) - On Projection Robust Optimal Transport: Sample Complexity and Model Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - Fast and Robust Comparison of Probability Measures in Heterogeneous Spaces [62.35667646858558]
We introduce the Anchor Energy (AE) and Anchor Wasserstein (AW) distances, which are respectively the energy and Wasserstein distances instantiated on such representations.
Our main contribution is to propose a sweep line algorithm to compute AE exactly in log-quadratic time, where a naive implementation would be cubic.
We show that AE and AW perform well in various experimental settings at a fraction of the computational cost of popular GW approximations.
arXiv Detail & Related papers (2020-02-05T03:09:23Z)
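Several of the papers above (EBSW, projected Wasserstein, projection robust OT) refine the sliced Wasserstein (SW) distance that the abstract compares against. As background, here is a minimal Monte-Carlo sketch of the basic SW distance with uniform random slices; the function name and parameters are illustrative and do not come from any of the listed papers.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, p=2, seed=0):
    """Monte-Carlo sliced Wasserstein-p distance between two equal-size
    samples: average the 1D transport cost over random unit directions."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(X.shape[1], n_proj))
    theta /= np.linalg.norm(theta, axis=0, keepdims=True)  # unit directions
    # in 1D, optimal transport pairs sorted projections slice by slice
    Xp = np.sort(X @ theta, axis=0)
    Yp = np.sort(Y @ theta, axis=0)
    return float(np.mean(np.abs(Xp - Yp) ** p) ** (1 / p))
```

Each slice costs only a sort, which is the source of SW's low complexity; the drawback the abstract alludes to is that uniform random projections can be nearly uninformative in high dimensions, which is what learnable or energy-based slicing distributions try to address.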
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.