New Perspectives on Regularization and Computation in Optimal
Transport-Based Distributionally Robust Optimization
- URL: http://arxiv.org/abs/2303.03900v1
- Date: Tue, 7 Mar 2023 13:52:32 GMT
- Title: New Perspectives on Regularization and Computation in Optimal
Transport-Based Distributionally Robust Optimization
- Authors: Soroosh Shafieezadeh-Abadeh, Liviu Aolaritei, Florian Dörfler,
Daniel Kuhn
- Abstract summary: We study optimal transport-based distributionally robust optimization problems where a fictitious adversary, often envisioned as nature, can choose the distribution of the uncertain problem parameters by reshaping a prescribed reference distribution at a finite transportation cost.
- Score: 8.564319625930892
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study optimal transport-based distributionally robust optimization
problems where a fictitious adversary, often envisioned as nature, can choose
the distribution of the uncertain problem parameters by reshaping a prescribed
reference distribution at a finite transportation cost. In this framework, we
show that robustification is intimately related to various forms of variation
and Lipschitz regularization even if the transportation cost function fails to
be (some power of) a metric. We also derive conditions for the existence and
the computability of a Nash equilibrium between the decision-maker and nature,
and we demonstrate numerically that nature's Nash strategy can be viewed as a
distribution that is supported on remarkably deceptive adversarial samples.
Finally, we identify practically relevant classes of optimal transport-based
distributionally robust optimization problems that can be addressed with
efficient gradient descent algorithms even if the loss function or the
transportation cost function are nonconvex (but not both at the same time).
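For context, the robust problem studied here has the min-sup form below; the second identity is only the textbook special case of the regularization connection, valid for a Wasserstein-1 ball over an unbounded support and a Lipschitz loss, whereas the paper treats general transport costs that need not be (powers of) a metric.
```latex
% OT-based DRO: nature reshapes \hat{P} within transport budget \varepsilon
\[
  \min_{x \in \mathcal{X}} \;
  \sup_{Q \,:\, W_c(Q,\hat{P}) \le \varepsilon} \;
  \mathbb{E}_{\xi \sim Q}\bigl[\ell(x,\xi)\bigr]
\]
% Textbook special case (Wasserstein-1 ball, Lipschitz loss, unbounded support):
\[
  \sup_{Q \,:\, W_1(Q,\hat{P}) \le \varepsilon} \mathbb{E}_{Q}\bigl[\ell(x,\xi)\bigr]
  = \mathbb{E}_{\hat{P}}\bigl[\ell(x,\xi)\bigr]
  + \varepsilon \,\mathrm{Lip}\bigl(\ell(x,\cdot)\bigr).
\]
```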
Related papers
- Generalization Bounds of Surrogate Policies for Combinatorial Optimization Problems [61.580419063416734]
A recent stream of structured learning approaches has improved the practical state of the art for a range of optimization problems.
The key idea is to exploit the statistical distribution over instances instead of dealing with instances separately.
In this article, we investigate methods that smooth the risk by perturbing the policy, which eases optimization and improves the generalization error.
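The abstract gives no implementation details; the sketch below only illustrates the generic perturb-then-average smoothing it alludes to. The names `decode`, `cost`, and `instance`, and the choice of Gaussian perturbations, are hypothetical placeholders.
```python
import numpy as np

def smoothed_risk(theta, instance, decode, cost, sigma=1.0, n_samples=64, rng=None):
    """Monte-Carlo estimate of a perturbation-smoothed risk.

    The cost of the decision decoded from the scores `theta` is typically
    piecewise constant in `theta`; averaging it over random perturbations
    of the scores yields a smoother surrogate that is easier to optimize.
    """
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(n_samples):
        perturbed = theta + sigma * rng.standard_normal(theta.shape)
        decision = decode(perturbed, instance)   # e.g., a combinatorial oracle / argmax
        total += cost(decision, instance)
    return total / n_samples
```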
arXiv Detail & Related papers (2024-07-24T12:00:30Z) - OTClean: Data Cleaning for Conditional Independence Violations using
Optimal Transport [51.6416022358349]
OTClean is a framework that harnesses optimal transport theory for data repair under Conditional Independence (CI) constraints.
We develop an iterative algorithm inspired by Sinkhorn's matrix scaling algorithm, which efficiently addresses high-dimensional and large-scale data.
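OTClean's repair algorithm itself is not reproduced here; the following is only a generic Sinkhorn matrix-scaling iteration of the kind the abstract says it is inspired by.
```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iters=1000, tol=1e-9):
    """Generic Sinkhorn matrix scaling for entropically regularized OT.

    a, b : source / target marginals (nonnegative, each summing to 1)
    C    : cost matrix of shape (len(a), len(b))
    Returns a plan whose row/column sums approximate a and b.
    """
    a, b, C = np.asarray(a, float), np.asarray(b, float), np.asarray(C, float)
    K = np.exp(-C / reg)                  # Gibbs kernel
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iters):
        u_prev = u
        u = a / (K @ v)                   # enforce row marginals
        v = b / (K.T @ u)                 # enforce column marginals
        if np.max(np.abs(u - u_prev)) < tol:
            break
    return u[:, None] * K * v[None, :]
```
For example, `sinkhorn(np.ones(3) / 3, np.ones(4) / 4, np.random.rand(3, 4))` returns a 3x4 plan with uniform marginals.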
arXiv Detail & Related papers (2024-03-04T18:23:55Z) - Learning to Optimize with Stochastic Dominance Constraints [103.26714928625582]
In this paper, we develop a simple yet efficient approach for the problem of comparing uncertain quantities.
We recast inner optimization in the Lagrangian as a learning problem for surrogate approximation, which bypasses apparent intractability.
The proposed light-SD demonstrates superior performance on several representative problems ranging from finance to supply chain management.
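The abstract does not state which dominance order is used; assuming the common second-order notion, the constraint structure that creates the inner optimization in the Lagrangian looks as follows.
```latex
% One standard characterization of second-order stochastic dominance
% (assumed here; the abstract does not specify the dominance order):
\[
  G(x) \succeq_{(2)} Y
  \iff
  \mathbb{E}\bigl[(\eta - G(x))_+\bigr] \le \mathbb{E}\bigl[(\eta - Y)_+\bigr]
  \quad \text{for all } \eta \in \mathbb{R},
\]
% so dualizing the constraint produces an inner optimization over \eta,
% which is the part recast as a learning problem for a surrogate.
```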
arXiv Detail & Related papers (2022-11-14T21:54:31Z) - Optimal Efficiency-Envy Trade-Off via Optimal Transport [33.85971515753188]
We consider the problem of allocating a distribution of items to $n$ recipients where each recipient has to be allocated a fixed, prespecified fraction of all items.
We show that this problem can be formulated as a variant of the semi-discrete optimal transport (OT) problem, whose solution structure in this case has a concise representation and a simple geometric interpretation.
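The paper's exact variant is not spelled out in the abstract; for reference, the generic semi-discrete OT dual whose solution structure (one price per atom, one cell per recipient) the abstract refers to is:
```latex
% Generic semi-discrete OT dual between a continuous \mu and \nu = \sum_j b_j \delta_{y_j}:
\[
  \max_{\psi \in \mathbb{R}^n} \;
  \sum_{j=1}^{n} b_j \psi_j
  + \int \min_{j} \bigl( c(x, y_j) - \psi_j \bigr) \, \mathrm{d}\mu(x),
\]
% where the inner minimum partitions space into generalized Laguerre cells,
% one per atom / recipient.
```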
arXiv Detail & Related papers (2022-09-25T00:39:43Z) - A Short and General Duality Proof for Wasserstein Distributionally Robust Optimization [11.034091190797671]
We present a general duality result for Wasserstein distributionally robust optimization that holds for any Kantorovich transport cost, measurable loss function, and nominal probability distribution.
We demonstrate that the interchangeability principle holds if and only if certain measurable projection and weak measurable selection conditions are satisfied.
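Schematically, the duality identity in question has the familiar form below; the measurability conditions in the paper govern when the interchange of supremum and expectation behind it is valid.
```latex
% Generic strong-duality identity for OT/Wasserstein DRO:
\[
  \sup_{Q \,:\, W_c(Q,\hat{P}) \le \varepsilon} \mathbb{E}_{Q}\bigl[\ell(\xi)\bigr]
  = \inf_{\lambda \ge 0} \;
    \lambda \varepsilon
    + \mathbb{E}_{\hat{\xi} \sim \hat{P}}
      \Bigl[\, \sup_{\xi} \; \ell(\xi) - \lambda\, c(\xi, \hat{\xi}) \Bigr].
\]
```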
arXiv Detail & Related papers (2022-04-30T22:49:01Z) - Rockafellian Relaxation and Stochastic Optimization under Perturbations [0.056247917037481096]
We develop an "optimistic" framework based on Rockafellian relaxations in which optimization is conducted not only over the original decision space but also jointly with a choice of model.
The framework centers on the novel concepts of exact and limit-exact Rockafellians, with interpretations of "negative" regularization emerging in certain settings.
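A schematic Rockafellian relaxation (not necessarily the paper's exact construction) optimizes the decision jointly with a penalized model-perturbation variable:
```latex
% Embed the original objective f(x) = f_0(x, 0) into a perturbed family
% f_0(x, u) and optimize jointly, with penalty parameter \theta;
% "exactness" refers to recovering the original solutions for suitable \theta.
\[
  \min_{x} f_0(x, 0)
  \;\rightsquigarrow\;
  \min_{x,\,u} \; f_0(x, u) + \theta \,\|u\|.
\]
```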
arXiv Detail & Related papers (2022-04-10T20:02:41Z) - Variational Refinement for Importance Sampling Using the Forward
Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
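The specific VI-IS combination proposed in the paper is not detailed in the abstract; the snippet below only shows the standard self-normalized importance-sampling correction that de-biases estimates computed under an approximate (e.g., variational) proposal. All function arguments are hypothetical placeholders.
```python
import numpy as np

def snis_estimate(log_p_unnorm, log_q, sample_q, test_fn, n_samples=10_000, rng=None):
    """Self-normalized importance sampling estimate of E_p[test_fn(x)].

    log_p_unnorm : log of the (possibly unnormalized) target density
    log_q        : log density of the proposal, e.g., a variational fit
    sample_q     : callable drawing samples from the proposal
    """
    rng = np.random.default_rng(rng)
    x = sample_q(n_samples, rng)
    log_w = log_p_unnorm(x) - log_q(x)      # importance log-weights
    w = np.exp(log_w - log_w.max())         # numerically stabilized weights
    w /= w.sum()                            # self-normalization removes the unknown constant
    return np.sum(w * test_fn(x))
```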
arXiv Detail & Related papers (2021-06-30T11:00:24Z) - Robustifying Conditional Portfolio Decisions via Optimal Transport [19.30689697702845]
We propose a data-driven portfolio selection model that integrates side information, conditional estimation and robustness.
We show that the distributionally robust portfolio allocation with side information problem can be reformulated as a finite-dimensional optimization problem.
arXiv Detail & Related papers (2021-03-30T15:56:03Z) - Semi-Discrete Optimal Transport: Hardness, Regularization and Numerical
Solution [8.465228064780748]
We prove that computing the Wasserstein distance between a discrete probability measure supported on two points and the Lebesgue measure on the standard hypercube is already #P-hard.
We introduce a distributionally robust dual optimal transport problem whose objective function is smoothed with the most adverse disturbance distributions.
We show that smoothing the dual objective function is equivalent to regularizing the primal objective function.
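As one concrete instance of this smoothing-regularization equivalence (the paper treats general adverse disturbance distributions, of which this soft-min is only the entropic case):
```latex
% Replacing the hard minimum in the semi-discrete dual by a soft-min,
\[
  \min_{j} \bigl( c(x, y_j) - \psi_j \bigr)
  \;\longrightarrow\;
  -\eta \log \sum_{j} \exp\!\Bigl( \tfrac{\psi_j - c(x, y_j)}{\eta} \Bigr),
\]
% corresponds (up to the reference measure in the entropy term) to an
% entropic penalty on the primal transport plan.
```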
arXiv Detail & Related papers (2021-03-10T18:53:59Z) - Comparing Probability Distributions with Conditional Transport [63.11403041984197]
We propose conditional transport (CT) as a new divergence and approximate it with the amortized CT (ACT) cost.
ACT amortizes the computation of its conditional transport plans and comes with unbiased sample gradients that are straightforward to compute.
On a wide variety of benchmark datasets for generative modeling, substituting the default statistical distance of an existing generative adversarial network with ACT consistently improves performance.
arXiv Detail & Related papers (2020-12-28T05:14:22Z) - Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
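Schematically, the robust objective behind DRBO pairs each candidate point with the worst context distribution in a divergence ball; here d is a generic placeholder for the ambiguity distance used in the paper.
```latex
% Robust target optimized at each round:
\[
  \max_{x} \;
  \min_{Q \,:\, d(Q,\hat{P}) \le \varepsilon} \;
  \mathbb{E}_{w \sim Q}\bigl[ f(x, w) \bigr],
\]
% with f modeled by a Gaussian process and accessed only through noisy
% zeroth-order evaluations.
```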
arXiv Detail & Related papers (2020-02-20T22:04:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.