Conditional Optimal Transport on Function Spaces
- URL: http://arxiv.org/abs/2311.05672v3
- Date: Tue, 6 Feb 2024 17:37:39 GMT
- Title: Conditional Optimal Transport on Function Spaces
- Authors: Bamdad Hosseini, Alexander W. Hsu, Amirhossein Taghvaei
- Abstract summary: We develop a theory of constrained optimal transport problems that describe block-triangular Monge maps.
This generalizes the theory of optimal triangular transport to separable infinite-dimensional function spaces with general cost functions.
We present numerical experiments that demonstrate the computational applicability of our theoretical results for amortized and likelihood-free inference of functional parameters.
- Score: 53.9025059364831
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a systematic study of conditional triangular transport maps in
function spaces from the perspective of optimal transportation and with a view
towards amortized Bayesian inference. More specifically, we develop a theory of
constrained optimal transport problems that describe block-triangular Monge
maps that characterize conditional measures along with their Kantorovich
relaxations. This generalizes the theory of optimal triangular transport to
separable infinite-dimensional function spaces with general cost functions. We
further tailor our results to the case of Bayesian inference problems and
obtain regularity estimates on the conditioning maps from the prior to the
posterior. Finally, we present numerical experiments that demonstrate the
computational applicability of our theoretical results for amortized and
likelihood-free inference of functional parameters.
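The block-triangular maps described in the abstract have a simple finite-dimensional ancestor: the Knothe-Rosenblatt (triangular) map, whose first component depends only on the first reference variable and whose second component depends on both, so that conditionals can be sampled by fixing the first block. The following sketch is purely illustrative and is not the paper's method: it builds the triangular map for a bivariate Gaussian target (where the map is known in closed form) and samples from a conditional by inverting the first component. All parameter names and values are hypothetical.

```python
import numpy as np

# Hypothetical bivariate Gaussian target: means, std devs, correlation.
m1, m2 = 0.0, 1.0
s1, s2, rho = 1.0, 2.0, 0.8

def triangular_map(z1, z2):
    """Map standard-normal reference (z1, z2) to the target.

    Lower-triangular structure: T(z) = (T1(z1), T2(z1, z2)),
    the finite-dimensional analogue of a block-triangular Monge map.
    """
    y1 = m1 + s1 * z1
    y2 = m2 + rho * s2 * z1 + s2 * np.sqrt(1 - rho**2) * z2
    return y1, y2

def conditional_sample(y1_obs, z2):
    """Sample y2 | y1 = y1_obs by fixing the first block of the map."""
    z1 = (y1_obs - m1) / s1  # invert the first (autonomous) component
    return m2 + rho * s2 * z1 + s2 * np.sqrt(1 - rho**2) * z2

rng = np.random.default_rng(0)
z2 = rng.standard_normal(100_000)
samples = conditional_sample(0.5, z2)

# Analytic conditional: N(m2 + rho*s2*(y1-m1)/s1, s2^2*(1-rho^2))
print(samples.mean())  # ≈ 1.8
print(samples.std())   # ≈ 1.2
```

Fixing `y1_obs` and pushing fresh reference noise through the second block is exactly the conditional-sampling mechanism that the paper extends to separable infinite-dimensional function spaces.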
Related papers
- Neural Optimal Transport with Lagrangian Costs [29.091068250865504]

We investigate the optimal transport problem between probability measures when the underlying cost function is induced by a Lagrangian.
Our contributions are of computational interest, where we demonstrate the ability to efficiently compute geodesics and amortize spline-based paths.
Unlike prior work, we also output the resulting Lagrangian optimal transport map without requiring an ODE solver.
arXiv Detail & Related papers (2024-06-01T03:34:00Z)
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- Computationally Efficient PAC RL in POMDPs with Latent Determinism and Conditional Embeddings [97.12538243736705]
We study reinforcement learning with function approximation for large-scale Partially Observable Markov Decision Processes (POMDPs).
Our algorithm provably scales to large-scale POMDPs.
arXiv Detail & Related papers (2022-06-24T05:13:35Z)
- Neural Optimal Transport with General Cost Functionals [66.41953045707172]
We introduce a novel neural network-based algorithm to compute optimal transport plans for general cost functionals.
As an application, we construct a cost functional to map data distributions while preserving the class-wise structure.
arXiv Detail & Related papers (2022-05-30T20:00:19Z)
- Reinforcement Learning from Partial Observation: Linear Function Approximation with Provable Sample Efficiency [111.83670279016599]
We study reinforcement learning for partially observable Markov decision processes (POMDPs) with infinite observation and state spaces.
We make the first attempt at partial observability and function approximation for a class of POMDPs with a linear structure.
arXiv Detail & Related papers (2022-04-20T21:15:38Z)
- GAN Estimation of Lipschitz Optimal Transport Maps [0.0]
This paper introduces the first statistically consistent estimator of the optimal transport map between two probability distributions, based on neural networks.
We demonstrate that, under regularity assumptions, the obtained generator converges uniformly to the optimal transport map as the sample size increases to infinity.
In contrast to previous work that addresses either statistical guarantees or practicality, we provide an expressive and feasible estimator that paves the way for optimal transport applications.
arXiv Detail & Related papers (2022-02-16T10:15:56Z)
- Near-optimal estimation of smooth transport maps with kernel sums-of-squares [81.02564078640275]
Under smoothness conditions, the squared Wasserstein distance between two distributions can be efficiently computed with appealing statistical error upper bounds.
The object of interest for applications such as generative modeling is the underlying optimal transport map.
We propose the first tractable algorithm for which the statistical $L^2$ error on the maps nearly matches the existing minimax lower bounds for smooth map estimation.
arXiv Detail & Related papers (2021-12-03T13:45:36Z)
- Relaxation of optimal transport problem via strictly convex functions [0.0]
An optimal transport problem on finite spaces is a linear program.
Recently, a relaxation of the optimal transport problem via strictly convex functions, in particular via the Kullback--Leibler divergence, has shed new light on data science.
This paper provides the mathematical foundations and an iterative process based on gradient descent for the relaxed optimal transport problem via Bregman divergences.
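The best-known instance of this Bregman-divergence relaxation is entropic optimal transport, where relaxing the linear program with a KL penalty yields the Sinkhorn algorithm. The sketch below is not the paper's iterative scheme, only a standard illustration of the idea on a tiny discrete problem; the sizes, cost matrix, and regularization strength `eps` are hypothetical.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.1, n_iter=500):
    """Solve min_P <C, P> + eps * KL(P || mu nu^T) with marginals (mu, nu).

    Alternating scaling of the Gibbs kernel matches the two marginal
    constraints, a classic Bregman-projection (Sinkhorn) iteration.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)               # project onto column-marginal constraint
        u = mu / (K @ v)                 # project onto row-marginal constraint
    return u[:, None] * K * v[None, :]   # regularized transport plan

mu = np.array([0.5, 0.5])                # source marginal
nu = np.array([0.25, 0.75])              # target marginal
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])               # ground cost
P = sinkhorn(mu, nu, C)
print(P.sum(axis=1))  # ≈ mu
print(P.sum(axis=0))  # ≈ nu
```

As `eps → 0` the plan `P` approaches a solution of the unrelaxed linear program; replacing the KL term with another Bregman divergence changes the projection steps, which is the generalization the listed paper develops.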
arXiv Detail & Related papers (2021-02-15T04:32:13Z)
- Functional optimal transport: map estimation and domain adaptation for functional data [35.60475201744369]
For numerous machine learning tasks, data can be naturally viewed as samples drawn from spaces of functions.
We introduce a formulation of the optimal transport problem for distributions on function spaces.
We develop an efficient algorithm for finding the transport map between functional domains.
arXiv Detail & Related papers (2021-02-07T19:29:28Z)
- Conditional Sampling with Monotone GANs: from Generative Models to Likelihood-Free Inference [4.913013713982677]
We present a novel framework for conditional sampling of probability measures, using block triangular transport maps.
We develop the theoretical foundations of block triangular transport in a Banach space setting.
We then introduce a computational approach, called monotone generative adversarial networks, to learn suitable block triangular maps.
arXiv Detail & Related papers (2020-06-11T19:15:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.