Bayesian Inference for Optimal Transport with Stochastic Cost
- URL: http://arxiv.org/abs/2010.09327v1
- Date: Mon, 19 Oct 2020 09:07:57 GMT
- Title: Bayesian Inference for Optimal Transport with Stochastic Cost
- Authors: Anton Mallasto, Markus Heinonen, Samuel Kaski
- Abstract summary: In machine learning and computer vision, optimal transport has had significant success in learning generative models.
We introduce a Bayesian framework for inferring the distribution of optimal transport plans induced by a stochastic cost.
We also tailor an HMC method to sample from the resulting transport plan posterior distribution.
- Score: 22.600086666266243
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In machine learning and computer vision, optimal transport has had
significant success in learning generative models and defining metric distances
between structured and stochastic data objects that can be cast as probability
measures. The key element of optimal transport is the so-called lifting of an
\emph{exact} cost (distance) function, defined on the sample space, to a cost
(distance) between probability measures over the sample space. However, in many
real-life applications the cost is \emph{stochastic}: e.g., the unpredictable
traffic flow affects the cost of transportation between a factory and an
outlet. To take this stochasticity into account, we introduce a Bayesian
framework for inferring the optimal transport plan distribution induced by the
stochastic cost, allowing for a principled way to include prior information and
to model the induced stochasticity on the transport plans. Additionally, we
tailor an HMC method to sample from the resulting transport plan posterior
distribution.
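For context, the "lifting" referred to above is the classical Kantorovich problem with a deterministic ground cost $c$:

$$ W_c(\mu, \nu) \;=\; \inf_{\pi \in \Pi(\mu, \nu)} \int c(x, y)\, \mathrm{d}\pi(x, y). $$

The snippet below is a minimal illustrative sketch, not the paper's method: it approximates the distribution over transport plans induced by a stochastic cost via plain Monte Carlo, drawing cost matrices from an assumed Gaussian noise model around a nominal cost `C_mean` (with an assumed noise scale `sigma`) and solving an entropically regularized OT problem (Sinkhorn) for each draw. The paper instead performs Bayesian inference over the plan and samples the posterior with a tailored HMC scheme.

```python
# Minimal sketch (assumptions: Gaussian cost noise, entropic OT via Sinkhorn);
# this is NOT the paper's Bayesian/HMC procedure, only an illustration of how
# a stochastic cost induces a distribution over transport plans.
import numpy as np

def sinkhorn(a, b, C, reg=0.05, n_iters=500):
    """Entropic OT: transport plan between histograms a and b for cost matrix C."""
    K = np.exp(-C / reg)                      # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]        # plan P = diag(u) K diag(v)

rng = np.random.default_rng(0)
n, m = 5, 4
a = np.full(n, 1.0 / n)                       # source marginal
b = np.full(m, 1.0 / m)                       # target marginal
C_mean = rng.uniform(0.0, 1.0, size=(n, m))   # assumed nominal (mean) cost
sigma = 0.1                                   # assumed cost-noise scale

# Monte Carlo over cost realizations -> collection of induced transport plans.
plans = []
for _ in range(200):
    C = C_mean + sigma * rng.standard_normal((n, m))  # one cost realization
    C = np.clip(C, 0.0, None)                          # keep costs non-negative
    plans.append(sinkhorn(a, b, C))

plans = np.stack(plans)
print("mean plan:\n", plans.mean(axis=0))
print("entrywise std of the plan:\n", plans.std(axis=0))
```

The entrywise standard deviation makes the point of the abstract concrete: once the cost is random, the transport plan itself carries uncertainty that a single deterministic OT solve cannot express.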
Related papers
- Convex Physics Informed Neural Networks for the Monge-Ampère Optimal Transport Problem [49.1574468325115]
Optimal transportation of raw material from suppliers to customers is an issue arising in logistics.
A physics-informed neural network method is advocated here for the solution of the corresponding generalized Monge-Ampere equation.
A particular focus is set on the enforcement of transport boundary conditions in the loss function.
arXiv Detail & Related papers (2025-01-17T12:51:25Z) - Enhanced Importance Sampling through Latent Space Exploration in Normalizing Flows [69.8873421870522]
Importance sampling is a rare-event simulation technique used in Monte Carlo simulations.
We propose a method for more efficient sampling by updating the proposal distribution in the latent space of a normalizing flow.
arXiv Detail & Related papers (2025-01-06T21:18:02Z) - Expected Sliced Transport Plans [9.33181953215826]
We propose a "lifting" operation to extend one-dimensional optimal transport plans back to the original space of the measures.
We prove that using the expected sliced transport (EST) plan to weight the sum of the individual Euclidean costs for moving from one point to another results in a valid metric between the input discrete probability measures (a rough sketch of this slicing-and-lifting idea appears after the list below).
arXiv Detail & Related papers (2024-10-16T02:44:36Z) - Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density by transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z) - Computing high-dimensional optimal transport by flow neural networks [22.320632565424745]
This work develops a flow-based model that transports from $P$ to an arbitrary $Q$ where both distributions are only accessible via finite samples.
We propose to learn the dynamic optimal transport between $P$ and $Q$ by training a flow neural network.
The trained optimal transport flow subsequently allows for performing many downstream tasks, including infinitesimal density ratio estimation (DRE) and distribution interpolation in the latent space of generative models.
arXiv Detail & Related papers (2023-05-19T17:48:21Z) - New Perspectives on Regularization and Computation in Optimal Transport-Based Distributionally Robust Optimization [6.522972728187888]
We study optimal transport-based distributionally robust optimization problems where a fictitious adversary, often envisioned as nature, can choose the distribution of the uncertain problem parameters by reshaping a prescribed reference distribution at a finite transportation cost.
arXiv Detail & Related papers (2023-03-07T13:52:32Z) - InfoOT: Information Maximizing Optimal Transport [58.72713603244467]
InfoOT is an information-theoretic extension of optimal transport.
It maximizes the mutual information between domains while minimizing geometric distances.
This formulation yields a new projection method that is robust to outliers and generalizes to unseen samples.
arXiv Detail & Related papers (2022-10-06T18:55:41Z) - Neural Optimal Transport with General Cost Functionals [66.41953045707172]
We introduce a novel neural network-based algorithm to compute optimal transport plans for general cost functionals.
As an application, we construct a cost functional to map data distributions while preserving the class-wise structure.
arXiv Detail & Related papers (2022-05-30T20:00:19Z) - Predicting the probability distribution of bus travel time to move towards reliable planning of public transport services [4.913013713982677]
We introduce a reliable approach to one of the problems of service planning in public transport, namely the Multiple Depot Vehicle Scheduling Problem (MDVSP).
This work empirically compares probabilistic models for the prediction of the conditional p.d.f. of the travel time, as a first step towards reliable MDVSP solutions.
arXiv Detail & Related papers (2021-02-03T21:05:37Z) - Comparing Probability Distributions with Conditional Transport [63.11403041984197]
We propose conditional transport (CT) as a new divergence and approximate it with the amortized CT (ACT) cost.
ACT amortizes the computation of its conditional transport plans and comes with unbiased sample gradients that are straightforward to compute.
On a wide variety of generative modeling benchmark datasets, substituting the default statistical distance of an existing generative adversarial network with ACT is shown to consistently improve performance.
arXiv Detail & Related papers (2020-12-28T05:14:22Z) - Statistical Optimal Transport posed as Learning Kernel Embedding [0.0]
This work takes the novel approach of posing statistical Optimal Transport (OT) as that of learning the transport plan's kernel mean embedding from sample-based estimates of marginal embeddings.
A key result is that, under very mild conditions, $\epsilon$-optimal recovery of the transport plan as well as the barycentric-projection-based transport map is possible with a sample complexity that is completely dimension-free.
arXiv Detail & Related papers (2020-02-08T14:58:53Z)
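Regarding the "Expected Sliced Transport Plans" entry above, the following rough sketch illustrates the slicing-and-lifting idea it builds on, under simplifying assumptions (two uniform discrete measures with the same number of points, plain averaging over random directions); the actual EST construction in the cited paper may weight and combine slices differently.

```python
# Rough sketch, under assumptions: uniform discrete measures with equal numbers
# of points; plain averaging over random slices (the cited paper's EST
# construction may differ). Each random direction yields a 1D OT plan via
# sorting, which is lifted back to a coupling of the original points.
import numpy as np

rng = np.random.default_rng(0)
n, d, n_slices = 6, 3, 200
X = rng.standard_normal((n, d))              # source points
Y = rng.standard_normal((n, d)) + 1.0        # target points

plan = np.zeros((n, n))
for _ in range(n_slices):
    theta = rng.standard_normal(d)
    theta /= np.linalg.norm(theta)           # random unit direction
    xi, yi = np.argsort(X @ theta), np.argsort(Y @ theta)
    lifted = np.zeros((n, n))
    lifted[xi, yi] = 1.0 / n                 # 1D optimal coupling: match sorted projections
    plan += lifted
plan /= n_slices                              # average lifted plans over slices

# Marginals stay uniform by construction, so this is a valid transport plan.
print(plan.sum(axis=1), plan.sum(axis=0))

# The averaged plan then weights the original Euclidean point-to-point costs.
cost = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
print("sliced-plan transport cost:", float((plan * cost).sum()))
```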