Wasserstein Distributionally Robust Optimization via Wasserstein
Barycenters
- URL: http://arxiv.org/abs/2203.12136v1
- Date: Wed, 23 Mar 2022 02:03:47 GMT
- Title: Wasserstein Distributionally Robust Optimization via Wasserstein
Barycenters
- Authors: Tim Tsz-Kit Lau, Han Liu
- Abstract summary: We seek data-driven decisions that perform well under the most adverse distribution within a certain distance, in the space of probability distributions, of a nominal distribution constructed from data samples.
We propose constructing the nominal distribution in Wasserstein distributionally robust optimization problems through the notion of Wasserstein barycenter as an aggregation of data samples from multiple sources.
- Score: 10.103413548140848
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In many applications in statistics and machine learning, the availability of
data samples from multiple sources has become increasingly prevalent. On the
other hand, in distributionally robust optimization, we seek data-driven
decisions that perform well under the most adverse distribution within a
certain distance, in the space of probability distributions, of a nominal
distribution constructed from data samples. However, it remains unclear how to achieve such
distributional robustness when data samples from multiple sources are
available. In this paper, we propose constructing the nominal distribution in
Wasserstein distributionally robust optimization problems through the notion of
Wasserstein barycenter as an aggregation of data samples from multiple sources.
Under specific choices of the loss function, the proposed formulation admits a
tractable reformulation as a finite convex program, with powerful finite-sample
and asymptotic guarantees. We illustrate our proposed method through concrete
examples with nominal distributions of location-scatter families and
distributionally robust maximum likelihood estimation.
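As an illustrative sketch of the barycentric aggregation idea (not the paper's general formulation, which covers measures on higher-dimensional spaces and location-scatter families): in one dimension, the Wasserstein-2 barycenter of empirical distributions has a closed form, since its quantile function is the average of the input quantile functions. The example data and function name below are assumptions for illustration only.

```python
import numpy as np

def w2_barycenter_1d(samples, grid_size=100):
    """Wasserstein-2 barycenter of 1-D empirical distributions.

    In one dimension, the W2 barycenter's quantile function is the
    pointwise average of the inputs' quantile functions.
    """
    qs = np.linspace(0.0, 1.0, grid_size)
    quantiles = np.stack([np.quantile(x, qs) for x in samples])
    # Barycenter represented by its quantile function on the grid.
    return quantiles.mean(axis=0)

rng = np.random.default_rng(0)
# Two hypothetical data sources measuring the same quantity with offsets.
src_a = rng.normal(-1.0, 1.0, size=500)
src_b = rng.normal(+1.0, 1.0, size=500)
bary_q = w2_barycenter_1d([src_a, src_b])
print(float(bary_q.mean()))  # roughly 0: midpoint of the two sources
```

This 1-D shortcut only hints at the construction; the paper's nominal distribution is a barycenter of measures from multiple sources, which in general requires solving an optimal-transport problem.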
Related papers
- Unified Convergence Analysis for Score-Based Diffusion Models with Deterministic Samplers [49.1574468325115]
We introduce a unified convergence analysis framework for deterministic samplers.
Our framework achieves iteration complexity of $\tilde{O}(d^2/\epsilon)$.
We also provide a detailed analysis of Denoising Implicit Diffusion Models (DDIM)-type samplers.
arXiv Detail & Related papers (2024-10-18T07:37:36Z) - Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Distributional Matrix Completion via Nearest Neighbors in the Wasserstein Space [8.971989179518216]
Given a sparsely observed matrix of empirical distributions, we seek to impute the true distributions associated with both observed and unobserved matrix entries.
We utilize tools from optimal transport to generalize the nearest neighbors method to the distributional setting.
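A minimal sketch of the nearest-neighbors-in-Wasserstein-space idea, restricted to 1-D empirical distributions where the W2 distance reduces to comparing quantile functions. The distributions and names below are hypothetical examples, not the paper's setup.

```python
import numpy as np

def w2_1d(x, y, grid_size=200):
    """Wasserstein-2 distance between two 1-D empirical distributions,
    compared on a common quantile grid (handles unequal sample sizes)."""
    qs = np.linspace(0.0, 1.0, grid_size)
    return float(np.sqrt(np.mean((np.quantile(x, qs) - np.quantile(y, qs)) ** 2)))

rng = np.random.default_rng(0)
target = rng.normal(0.0, 1.0, 300)
candidates = {"near": rng.normal(0.1, 1.0, 300), "far": rng.normal(3.0, 1.0, 300)}
# Nearest neighbor of `target` among the candidates, in Wasserstein space:
nearest = min(candidates, key=lambda k: w2_1d(target, candidates[k]))
print(nearest)  # prints "near"
```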
arXiv Detail & Related papers (2024-10-17T00:50:17Z) - Sourcerer: Sample-based Maximum Entropy Source Distribution Estimation [5.673617376471343]
We propose an approach which targets the maximum entropy distribution, i.e., prioritizes retaining as much uncertainty as possible.
Our method is purely sample-based - leveraging the Sliced-Wasserstein distance to measure the discrepancy between the dataset and simulations.
To demonstrate the utility of our approach, we infer source distributions for parameters of the Hodgkin-Huxley model from experimental datasets with thousands of single-neuron measurements.
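The Sliced-Wasserstein distance mentioned above can be approximated by projecting both samples onto random directions and averaging the resulting 1-D Wasserstein distances, which for equal-size samples reduce to sorted-sample comparisons. The following Monte-Carlo sketch assumes equal sample sizes and uses made-up data; it is not the paper's implementation.

```python
import numpy as np

def sliced_wasserstein(x, y, n_proj=100, seed=0):
    """Monte-Carlo sliced-Wasserstein-2 distance between two point clouds
    of equal size, each row being one sample."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)  # uniform direction on the sphere
        px, py = np.sort(x @ theta), np.sort(y @ theta)
        total += np.mean((px - py) ** 2)  # 1-D W2^2 via order statistics
    return np.sqrt(total / n_proj)

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, size=(400, 3))
b = rng.normal(0.5, 1.0, size=(400, 3))
print(sliced_wasserstein(a, a) < sliced_wasserstein(a, b))  # prints True
```

Averaging over random projections trades exactness for scalability: each projection costs only a sort, which is what makes the distance attractive in purely sample-based settings.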
arXiv Detail & Related papers (2024-02-12T17:13:02Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Two-Stage Robust and Sparse Distributed Statistical Inference for
Large-Scale Data [18.34490939288318]
We address the problem of conducting statistical inference in settings involving large-scale data that may be high-dimensional and contaminated by outliers.
We propose two-stage distributed and robust statistical inference procedures that cope with high-dimensional models by promoting sparsity.
arXiv Detail & Related papers (2022-08-17T11:17:47Z) - Robust Calibration with Multi-domain Temperature Scaling [86.07299013396059]
We develop a systematic calibration model to handle distribution shifts by leveraging data from multiple domains.
Our proposed method -- multi-domain temperature scaling -- uses robustness across domains to improve calibration under distribution shift.
arXiv Detail & Related papers (2022-06-06T17:32:12Z) - Data-Driven Approximations of Chance Constrained Programs in
Nonstationary Environments [3.126118485851773]
We study sample average approximations (SAA) of chance constrained programs.
We consider a nonstationary variant of this problem, where the random samples are assumed to be independently drawn in a sequential fashion.
We propose a novel robust SAA method exploiting information about the Wasserstein distance between the sequence of data-generating distributions and the actual chance constraint distribution.
arXiv Detail & Related papers (2022-05-08T01:01:57Z) - Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over distributions' properties, such as parameters, symmetry, and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z) - Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We employ the structure of the problem and obtain a convex-concave saddle-point reformulation of this problem.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z) - Wasserstein Distributionally Robust Optimization: Theory and Applications in Machine Learning [20.116219345579154]
Decision problems in science, engineering and economics are affected by uncertain parameters whose distribution is only indirectly observable through samples.
The goal of data-driven decision-making is to learn a decision from finitely many training samples that will perform well on unseen test samples.
We will show that Wasserstein distributionally robust optimization has interesting ramifications for statistical learning.
arXiv Detail & Related papers (2019-08-23T09:28:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.