Private Wasserstein Distance
- URL: http://arxiv.org/abs/2404.06787v2
- Date: Sun, 02 Feb 2025 07:21:15 GMT
- Title: Private Wasserstein Distance
- Authors: Wenqian Li, Yan Pang
- Abstract summary: Wasserstein distance is a key metric for quantifying data divergence from a distributional perspective.
In this study, we explore the inherent triangular properties within the Wasserstein space, leading to a novel solution named TriangleWad.
- Score: 6.015898117103069
- License:
- Abstract: Wasserstein distance is a key metric for quantifying data divergence from a distributional perspective. However, its application in privacy-sensitive environments, where direct sharing of raw data is prohibited, presents significant challenges. Existing approaches, such as Differential Privacy and Federated Optimization, have been employed to estimate the Wasserstein distance under such constraints. However, these methods often fall short when both accuracy and security are required. In this study, we explore the inherent triangular properties within the Wasserstein space, leading to a novel solution named TriangleWad. This approach facilitates the fast computation of the Wasserstein distance between datasets stored across different entities, ensuring that raw data remain completely hidden. TriangleWad not only strengthens resistance to potential attacks but also preserves high estimation accuracy. Through extensive experiments across various tasks involving both image and text data, we demonstrate its superior performance and significant potential for real-world applications.
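The abstract's core idea is to exploit triangle properties of the Wasserstein space so that parties never exchange raw data. As a minimal illustration (this is not the TriangleWad algorithm itself, and the 1-D data, sample sizes, and public reference distribution are all assumptions for the sketch), each party can report only its distance to a shared public reference sample, and the metric triangle inequality then brackets the unknown cross-distance:

```python
# Illustrative sketch only, NOT the TriangleWad algorithm: a minimal
# demonstration of the triangle inequality in Wasserstein space.
# Assumes 1-D data, where the Wasserstein-1 distance between two
# equal-size empirical samples is the mean absolute difference of
# their sorted values.
import random

def w1(a, b):
    # Exact 1-D Wasserstein-1 distance between equal-size samples.
    return sum(abs(p - q) for p, q in zip(sorted(a), sorted(b))) / len(a)

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(1000)]  # party A's private data
y = [random.gauss(2.0, 1.0) for _ in range(1000)]  # party B's private data
r = [random.uniform(-5.0, 5.0) for _ in range(1000)]  # public reference sample

# Each party discloses only a scalar distance to the reference.
w_xr = w1(x, r)
w_yr = w1(y, r)

# Triangle inequality brackets the unknown cross-distance W(X, Y):
#   |W(X,R) - W(Y,R)| <= W(X,Y) <= W(X,R) + W(Y,R)
lower = abs(w_xr - w_yr)
upper = w_xr + w_yr

true_w = w1(x, y)  # only computable if both raw datasets were shared
assert lower <= true_w <= upper
```

Because Wasserstein distance is a metric, the bracket always holds; the paper's contribution is in turning such triangle relations into an accurate point estimate rather than a loose interval.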
Related papers
- Learning with Differentially Private (Sliced) Wasserstein Gradients [3.154269505086155]
We introduce a novel framework for privately optimizing objectives that rely on Wasserstein distances between data-dependent empirical measures.
Our main theoretical contribution is based on an explicit formulation of the Wasserstein gradient in a fully discrete setting.
We develop a deep learning approach that incorporates gradient and activations clipping, originally designed for DP training of problems with a finite-sum structure.
arXiv Detail & Related papers (2025-02-03T09:14:26Z) - Reconsidering utility: unveiling the limitations of synthetic mobility data generation algorithms in real-life scenarios [49.1574468325115]
We evaluate the utility of five state-of-the-art synthesis approaches in terms of real-world applicability.
We focus on so-called trip data that encode fine granular urban movements such as GPS-tracked taxi rides.
One model fails to produce data within reasonable time and another generates too many jumps to meet the requirements for map matching.
arXiv Detail & Related papers (2024-07-03T16:08:05Z) - Federated Wasserstein Distance [16.892296712204597]
We introduce a principled way of computing the Wasserstein distance between two distributions in a federated manner.
We show how to estimate the Wasserstein distance between two samples stored and kept on different devices/clients whilst a central entity/server orchestrates the computations.
arXiv Detail & Related papers (2023-10-03T11:30:50Z) - Wasserstein Adversarial Examples on Univariant Time Series Data [23.15675721397447]
We propose adversarial examples in the Wasserstein space for time series data.
We use Wasserstein distance to bound the perturbation between normal examples and adversarial examples.
We empirically evaluate the proposed attack on several time series datasets in the healthcare domain.
arXiv Detail & Related papers (2023-03-22T07:50:15Z) - Mutual Wasserstein Discrepancy Minimization for Sequential Recommendation [82.0801585843835]
We propose a novel self-supervised learning framework based on Mutual WasserStein discrepancy minimization (MStein) for sequential recommendation.
We also propose a novel contrastive learning loss based on Wasserstein Discrepancy Measurement.
arXiv Detail & Related papers (2023-01-28T13:38:48Z) - Statistical, Robustness, and Computational Guarantees for Sliced Wasserstein Distances [18.9717974398864]
Sliced Wasserstein distances preserve properties of classic Wasserstein distances while being more scalable for computation and estimation in high dimensions.
We quantify this scalability from three key aspects: (i) empirical convergence rates; (ii) robustness to data contamination; and (iii) efficient computational methods.
arXiv Detail & Related papers (2022-10-17T15:04:51Z) - Exact Statistical Inference for the Wasserstein Distance by Selective Inference [20.309302270008146]
We propose an exact (non-asymptotic) inference method for the Wasserstein distance inspired by the concept of conditional Selective Inference (SI).
To our knowledge, this is the first method that can provide a valid confidence interval (CI) for the Wasserstein distance with finite-sample coverage guarantee.
We evaluate the performance of the proposed method on both synthetic and real-world datasets.
arXiv Detail & Related papers (2021-09-29T06:16:50Z) - Variable Skipping for Autoregressive Range Density Estimation [84.60428050170687]
We show a technique, variable skipping, for accelerating range density estimation over deep autoregressive models.
We show that variable skipping provides 10-100$\times$ efficiency improvements when targeting challenging high-quantile error metrics.
arXiv Detail & Related papers (2020-07-10T19:01:40Z) - Learning while Respecting Privacy and Robustness to Distributional Uncertainties and Adversarial Data [66.78671826743884]
The distributionally robust optimization framework is considered for training a parametric model.
The objective is to endow the trained model with robustness against adversarially manipulated input data.
Proposed algorithms offer robustness with little overhead.
arXiv Detail & Related papers (2020-07-07T18:25:25Z) - Augmented Sliced Wasserstein Distances [55.028065567756066]
We propose a new family of distance metrics, called augmented sliced Wasserstein distances (ASWDs).
ASWDs are constructed by first mapping samples to higher-dimensional hypersurfaces parameterized by neural networks.
Numerical results demonstrate that the ASWD significantly outperforms other Wasserstein variants for both synthetic and real-world problems.
arXiv Detail & Related papers (2020-06-15T23:00:08Z) - Projection Robust Wasserstein Distance and Riemannian Optimization [107.93250306339694]
We show that the projection robust Wasserstein (PRW) distance, also known as Wasserstein projection pursuit (WPP), is a robust variant of the Wasserstein distance.
This paper provides a first step into the computation of the PRW distance, linking its theory to experiments on synthetic and real data.
arXiv Detail & Related papers (2020-06-12T20:40:22Z)
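Several of the papers above (the sliced, augmented sliced, and projection robust variants) rest on the same primitive: project high-dimensional samples onto random directions and average the cheap 1-D Wasserstein distances. A minimal Monte-Carlo sketch (not taken from any listed paper; the sample sizes, dimension, and projection count are arbitrary choices for illustration):

```python
# Minimal, illustrative Monte-Carlo estimate of the sliced Wasserstein
# distance. Not the algorithm of any paper listed above; parameters are
# arbitrary assumptions for the sketch.
import math
import random

def w1_1d(a, b):
    # Exact 1-D Wasserstein-1 distance between equal-size samples.
    return sum(abs(p - q) for p, q in zip(sorted(a), sorted(b))) / len(a)

def sliced_w1(xs, ys, n_proj=100, seed=0):
    # Average the 1-D distance over random unit projection directions.
    rng = random.Random(seed)
    dim = len(xs[0])
    total = 0.0
    for _ in range(n_proj):
        theta = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        norm = math.sqrt(sum(t * t for t in theta)) or 1.0
        theta = [t / norm for t in theta]
        px = [sum(t * v for t, v in zip(theta, x)) for x in xs]
        py = [sum(t * v for t, v in zip(theta, y)) for y in ys]
        total += w1_1d(px, py)
    return total / n_proj

rng = random.Random(1)
xs = [[rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)] for _ in range(500)]
ys = [[rng.gauss(3.0, 1.0), rng.gauss(0.0, 1.0)] for _ in range(500)]
print(sliced_w1(xs, ys))  # distributions differ along the first axis
```

Each projection reduces the problem to a sort, which is what makes these variants scale to high dimensions; the augmented and projection-robust papers above refine how the projection directions are chosen.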
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.