Exact Statistical Inference for the Wasserstein Distance by Selective
Inference
- URL: http://arxiv.org/abs/2109.14206v1
- Date: Wed, 29 Sep 2021 06:16:50 GMT
- Title: Exact Statistical Inference for the Wasserstein Distance by Selective
Inference
- Authors: Vo Nguyen Le Duy, Ichiro Takeuchi
- Abstract summary: We propose an exact (non-asymptotic) inference method for the Wasserstein distance inspired by the concept of conditional Selective Inference (SI).
To our knowledge, this is the first method that can provide a valid confidence interval (CI) for the Wasserstein distance with finite-sample coverage guarantee.
We evaluate the performance of the proposed method on both synthetic and real-world datasets.
- Score: 20.309302270008146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study statistical inference for the Wasserstein distance,
which has attracted much attention and has been applied to various machine
learning tasks. Several studies have been proposed in the literature, but
almost all of them are based on asymptotic approximation and do not have
finite-sample validity. In this study, we propose an exact (non-asymptotic)
inference method for the Wasserstein distance inspired by the concept of
conditional Selective Inference (SI). To our knowledge, this is the first
method that can provide a valid confidence interval (CI) for the Wasserstein
distance with finite-sample coverage guarantee, which can be applied not only
to one-dimensional problems but also to multi-dimensional problems. We evaluate
the performance of the proposed method on both synthetic and real-world
datasets.
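As background for the quantity the paper performs inference on: in one dimension, the 1-Wasserstein distance between two empirical distributions with the same number of points reduces to the mean absolute difference of the sorted samples. The sketch below illustrates only the distance itself, not the paper's selective-inference procedure:

```python
def wasserstein_1d(x, y):
    """1-Wasserstein distance between two equal-sized 1-D samples.

    For empirical distributions with the same number of points, W1
    reduces to the mean absolute difference of the sorted samples.
    """
    assert len(x) == len(y), "this shortcut assumes equal sample sizes"
    xs, ys = sorted(x), sorted(y)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

# Shifting a sample by a constant c gives W1 = |c|.
print(wasserstein_1d([1.0, 2.0, 3.0], [2.0, 3.0, 4.0]))  # 1.0
```

For unequal sample sizes or multi-dimensional data (the setting the paper also covers), the distance requires solving an optimal transport problem rather than this sorting shortcut.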
Related papers
- Statistical Inference for Temporal Difference Learning with Linear Function Approximation [62.69448336714418]
Temporal Difference (TD) learning, arguably the most widely used for policy evaluation, serves as a natural framework for this purpose.
In this paper, we study the consistency properties of TD learning with Polyak-Ruppert averaging and linear function approximation, and obtain three significant improvements over existing results.
arXiv Detail & Related papers (2024-10-21T15:34:44Z)
- Learning Correspondence Uncertainty via Differentiable Nonlinear Least Squares [47.83169780113135]
We propose a differentiable nonlinear least squares framework to account for uncertainty in relative pose estimation from feature correspondences.
We evaluate our approach on synthetic, as well as the KITTI and EuRoC real-world datasets.
arXiv Detail & Related papers (2023-05-16T15:21:09Z)
- Online Statistical Inference for Nonlinear Stochastic Approximation with Markovian Data [22.59079286063505]
We study the statistical inference of nonlinear stochastic approximation algorithms utilizing a single trajectory of Markovian data.
Our methodology has practical applications in various scenarios, such as Stochastic Gradient Descent (SGD) on autoregressive data and asynchronous Q-Learning.
arXiv Detail & Related papers (2023-02-15T14:31:11Z)
- Deep Learning Methods for Proximal Inference via Maximum Moment Restriction [0.0]
We introduce a flexible and scalable method based on a deep neural network to estimate causal effects in the presence of unmeasured confounding.
Our method achieves state-of-the-art performance on two well-established proximal inference benchmarks.
arXiv Detail & Related papers (2022-05-19T19:51:42Z)
- Statistical Inference for the Dynamic Time Warping Distance, with Application to Abnormal Time-Series Detection [29.195884642878422]
We study statistical inference on the similarity/distance between two time series in an uncertain environment.
We propose to employ the conditional selective inference framework, which enables us to derive a valid inference method on the DTW distance.
We evaluate the performance of the proposed inference method on both synthetic and real-world datasets.
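The DTW distance this entry performs inference on can be computed with a standard dynamic program; the minimal sketch below (absolute-difference local cost) illustrates the distance itself, not the paper's conditional selective inference framework:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance with absolute-difference local cost.

    Standard O(len(a) * len(b)) dynamic program over warping paths.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = cost of the best warping path aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # step in a only
                                  dp[i][j - 1],      # step in b only
                                  dp[i - 1][j - 1])  # step in both
    return dp[n][m]

# Warping absorbs the repeated sample, so the distance is zero.
print(dtw_distance([1.0, 2.0, 3.0], [1.0, 2.0, 2.0, 3.0]))  # 0.0
```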
arXiv Detail & Related papers (2022-02-14T10:28:51Z)
- Controlling Wasserstein Distances by Kernel Norms with Application to Compressive Statistical Learning [4.873362301533825]
This paper establishes some conditions under which the Wasserstein distance can be controlled by MMD norms.
Inspired by existing results in CSL, we introduce the Hölder Lower Restricted Isometric Property and show that this property comes with interesting guarantees for compressive statistical learning.
arXiv Detail & Related papers (2021-12-01T11:19:25Z)
- More Powerful Conditional Selective Inference for Generalized Lasso by Parametric Programming [20.309302270008146]
Conditional selective inference (SI) has been studied intensively as a new statistical inference framework for data-driven hypotheses.
We propose a more powerful and general conditional SI method for a class of problems that can be converted into quadratic parametric programming.
arXiv Detail & Related papers (2021-05-11T10:12:00Z)
- Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
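In one dimension the Wasserstein-2 barycenter has a closed form via quantile averaging; for Gaussians this gives a particularly simple sketch of the "geometric weighted average" idea (this is a textbook special case, not the scalable algorithm the entry proposes):

```python
def gaussian_w2_barycenter_1d(weights, means, stds):
    """Wasserstein-2 barycenter of one-dimensional Gaussians.

    In 1-D the W2 barycenter averages quantile functions, so for
    Gaussians N(mu_i, sigma_i^2) with weights w_i it is the Gaussian
    with mean sum(w_i * mu_i) and std sum(w_i * sigma_i).
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    mean = sum(w * m for w, m in zip(weights, means))
    std = sum(w * s for w, s in zip(weights, stds))
    return mean, std

print(gaussian_w2_barycenter_1d([0.5, 0.5], [0.0, 2.0], [1.0, 3.0]))  # (1.0, 2.0)
```

Note that the standard deviations average linearly, unlike the mixture of the two Gaussians, whose variance would also include the spread between the means.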
arXiv Detail & Related papers (2021-02-02T21:01:13Z)
- On Projection Robust Optimal Transport: Sample Complexity and Model Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z)
- Projection Robust Wasserstein Distance and Riemannian Optimization [107.93250306339694]
We show that the projection robust Wasserstein (PRW) distance, also known as Wasserstein projection pursuit (WPP), is a robust variant of the Wasserstein distance.
This paper provides a first step toward computing the PRW distance and links its theory to experiments on synthetic and real data.
arXiv Detail & Related papers (2020-06-12T20:40:22Z)
- The empirical duality gap of constrained statistical learning [115.23598260228587]
We study constrained statistical learning problems, the unconstrained versions of which are at the core of virtually all modern information processing.
We propose to tackle the constrained statistical problem overcoming its infinite dimensionality, unknown distributions, and constraints by leveraging finite dimensional parameterizations, sample averages, and duality theory.
We demonstrate the effectiveness and usefulness of this constrained formulation in a fair learning application.
arXiv Detail & Related papers (2020-02-12T19:12:29Z)
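Several of the related papers above concern projection-based Wasserstein distances. The average-over-subspaces idea behind the IPRW distance can be sketched for one-dimensional projections, where averaging random projections corresponds to the sliced Wasserstein distance (a Monte-Carlo illustration of the idea, not the papers' estimators):

```python
import math
import random

def sliced_w1(xs, ys, n_projections=100, seed=0):
    """Monte-Carlo sliced 1-Wasserstein distance between two 2-D samples.

    Projects both equal-sized samples onto random unit directions,
    takes the 1-D W1 of each pair of projections (a sorting shortcut),
    and averages over directions.
    """
    assert len(xs) == len(ys), "this shortcut assumes equal sample sizes"
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_projections):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        dx, dy = math.cos(theta), math.sin(theta)
        px = sorted(x[0] * dx + x[1] * dy for x in xs)
        py = sorted(y[0] * dx + y[1] * dy for y in ys)
        total += sum(abs(a - b) for a, b in zip(px, py)) / len(px)
    return total / n_projections

pts = [(0.0, 0.0), (1.0, 1.0)]
print(sliced_w1(pts, pts))  # identical point clouds -> 0.0
```

The PRW distance instead maximizes over subspaces, which is what makes its computation a Riemannian optimization problem.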
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.