A generative flow for conditional sampling via optimal transport
- URL: http://arxiv.org/abs/2307.04102v1
- Date: Sun, 9 Jul 2023 05:36:26 GMT
- Title: A generative flow for conditional sampling via optimal transport
- Authors: Jason Alfonso, Ricardo Baptista, Anupam Bhakta, Noam Gal, Alfin Hou,
Isa Lyubimova, Daniel Pocklington, Josef Sajonz, Giulio Trigila, and Ryan
Tsai
- Abstract summary: This work proposes a non-parametric generative model that iteratively maps reference samples to the target.
The model uses block-triangular transport maps, whose components are shown to characterize conditionals of the target distribution.
These maps arise from solving an optimal transport problem with a weighted $L^2$ cost function, thereby extending the data-driven approach in [Trigila and Tabak, 2016] for conditional sampling.
- Score: 1.0486135378491266
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sampling conditional distributions is a fundamental task for Bayesian
inference and density estimation. Generative models, such as normalizing flows
and generative adversarial networks, characterize conditional distributions by
learning a transport map that pushes forward a simple reference (e.g., a
standard Gaussian) to a target distribution. While these approaches
successfully describe many non-Gaussian problems, their performance is often
limited by parametric bias and the reliability of gradient-based (adversarial)
optimizers to learn these transformations. This work proposes a non-parametric
generative model that iteratively maps reference samples to the target. The
model uses block-triangular transport maps, whose components are shown to
characterize conditionals of the target distribution. These maps arise from
solving an optimal transport problem with a weighted $L^2$ cost function,
thereby extending the data-driven approach in [Trigila and Tabak, 2016] for
conditional sampling. The proposed approach is demonstrated on a two-dimensional example and on a parameter inference problem involving nonlinear ODEs.
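To make the weighted-cost construction concrete, here is a minimal numerical sketch (not the authors' code) of conditional sampling via a discrete optimal transport plan. The toy densities, the weight `lam`, and the use of the POT library are illustrative assumptions; as `lam` grows, the plan is pushed toward preserving the conditioning coordinate, i.e., toward a block-triangular map $T(y, z) = (y, T_2(y, z))$.

```python
# Minimal sketch (assumptions flagged above): conditional sampling through
# a discrete OT plan with a weighted L^2 cost that penalizes movement in
# the conditioning coordinate y.
import numpy as np
import ot  # POT: Python Optimal Transport

rng = np.random.default_rng(0)
n = 500

# Target: joint samples (y, x) with x | y ~ N(sin(y), 0.1^2)
y = rng.uniform(-2, 2, n)
x = np.sin(y) + 0.1 * rng.normal(size=n)
target = np.column_stack([y, x])

# Reference: the same y paired with standard Gaussian latents z
reference = np.column_stack([y, rng.normal(size=n)])

# Weighted L^2 cost: weight lam >> 1 on the y-coordinate
lam = 100.0
dy = reference[:, 0:1] - target[:, 0:1].T
dz = reference[:, 1:2] - target[:, 1:2].T
cost = lam * dy**2 + dz**2

# Optimal plan between the empirical measures, then barycentric projection
w = np.full(n, 1.0 / n)
plan = ot.emd(w, w, cost)
T = n * (plan @ target)  # T[i] approximates the image of reference point i

# Approximate samples from x | y ≈ 1: push latents whose y lies near 1
mask = np.abs(reference[:, 0] - 1.0) < 0.1
print(T[mask, 1][:5])  # should cluster near sin(1) ≈ 0.84
```

The second output coordinate of $T$, restricted to a fixed $y$, plays the role of the conditional map $T_2(y, \cdot)$; the paper constructs this map non-parametrically and iteratively rather than from a single discrete plan.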
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantees, with explicit dimensional dependence, for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- A transport approach to sequential simulation-based inference [0.0]
We present a new transport-based approach to efficiently perform sequential Bayesian inference of static model parameters.
The strategy is based on the extraction of conditional distributions from the joint distribution of parameters and data, via the estimation of structured (e.g., block-triangular) transport maps.
This allows gradient-based characterizations of the posterior density via transport maps in a model-free, online phase.
arXiv Detail & Related papers (2023-08-26T18:53:48Z)
- Arbitrary Distributions Mapping via SyMOT-Flow: A Flow-based Approach Integrating Maximum Mean Discrepancy and Optimal Transport [2.7309692684728617]
We introduce a novel model called SyMOT-Flow that trains an invertible transformation by minimizing the symmetric maximum mean discrepancy between samples from two unknown distributions.
The resulting transformation leads to more stable and accurate sample generation.
arXiv Detail & Related papers (2023-08-26T08:39:16Z)
- Inverse Models for Estimating the Initial Condition of Spatio-Temporal Advection-Diffusion Processes [5.814371485767541]
Inverse problems involve making inference about unknown parameters of a physical process using observational data.
This paper investigates the estimation of the initial condition of a spatio-temporal advection-diffusion process using spatially sparse data streams.
arXiv Detail & Related papers (2023-02-08T15:30:16Z)
- Continuous and Distribution-free Probabilistic Wind Power Forecasting: A Conditional Normalizing Flow Approach [1.684864188596015]
We present a data-driven approach for probabilistic wind power forecasting based on conditional normalizing flows (CNFs).
In contrast with existing approaches, this method is distribution-free (as are non-parametric and quantile-based approaches) and can directly yield continuous probability densities.
arXiv Detail & Related papers (2022-06-06T08:48:58Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and minimizing the reverse Kullback-Leibler divergence.
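As a rough illustration of a base distribution defined by learned rejection sampling (a sketch only; the acceptance function below is a fixed stand-in for the learned network in the paper):

```python
# Sketch: sample a flow's base distribution by filtering Gaussian
# proposals through an acceptance probability a(z) in [0, 1].
import numpy as np

def accept_prob(z, w):
    # Stand-in for a learned network: a logistic function of z @ w.
    return 1.0 / (1.0 + np.exp(-z @ w))

def sample_base(n, dim, w, rng):
    samples = []
    while len(samples) < n:
        z = rng.normal(size=(4 * n, dim))                 # Gaussian proposals
        keep = rng.uniform(size=4 * n) < accept_prob(z, w)
        samples.extend(z[keep])
    return np.asarray(samples[:n])

rng = np.random.default_rng(0)
base_samples = sample_base(1000, 2, np.array([2.0, -1.0]), rng)
```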
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective functional satisfies a functional version of the Polyak-Łojasiewicz (PL) condition (Polyak, 1963) and smoothness conditions, variational transport converges linearly.
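For the special case of a potential functional $F(\mu) = \mathbb{E}_{x \sim \mu}[V(x)]$, Wasserstein gradient descent reduces to ordinary gradient descent applied to every particle. A minimal sketch of that case (the potential $V$, step size, and particle count are illustrative choices, not the paper's estimator for general objectives):

```python
# Minimal sketch of particle-based Wasserstein gradient descent for the
# potential functional F(mu) = E_{x~mu}[V(x)]; in this special case the
# Wasserstein gradient step is plain gradient descent on every particle.
import numpy as np

def grad_V(x):
    # V(x) = 0.5 * ||x - c||^2 with c = (1, 1); the gradient is x - c.
    return x - np.array([1.0, 1.0])

rng = np.random.default_rng(0)
particles = rng.normal(size=(256, 2))   # initial measure: standard Gaussian

eta = 0.1                               # step size
for _ in range(100):
    # Push each particle along the negative Wasserstein gradient of F.
    particles = particles - eta * grad_V(particles)

print(particles.mean(axis=0))           # ≈ (1, 1), the minimizer of V
```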
arXiv Detail & Related papers (2020-12-21T18:33:13Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
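The pathwise view is captured by Matheron's rule: a posterior sample equals a prior sample plus a data-dependent correction. A minimal sketch under Gaussian observation noise (the kernel, lengthscale, and noise level are illustrative assumptions; this sketch draws the prior jointly via a Cholesky factorization for simplicity, whereas the efficiency gains come from swapping in a cheap prior approximation):

```python
# Sketch of pathwise GP conditioning (Matheron's rule): posterior sample
# = prior sample + kernel-weighted correction toward the data.
import numpy as np

def rbf(A, B, ls=0.5):
    # Squared-exponential kernel; the lengthscale 0.5 is an arbitrary choice.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (10, 1))                      # training inputs
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=10)     # noisy observations
Xs = np.linspace(-3, 3, 100)[:, None]                # test inputs

# One joint prior draw over [train, test] locations
Z = np.vstack([X, Xs])
K = rbf(Z, Z) + 1e-8 * np.eye(len(Z))                # jitter for stability
f = np.linalg.cholesky(K) @ rng.normal(size=len(Z))
f_train, f_test = f[:10], f[10:]

# Matheron update: correct the prior draw toward the observed data
noise = 0.05 ** 2
Kxx = rbf(X, X) + noise * np.eye(10)
eps = 0.05 * rng.normal(size=10)                     # simulated observation noise
posterior_sample = f_test + rbf(Xs, X) @ np.linalg.solve(Kxx, y - (f_train + eps))
```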
arXiv Detail & Related papers (2020-11-08T17:09:37Z)