Wasserstein Generative Learning of Conditional Distribution
- URL: http://arxiv.org/abs/2112.10039v1
- Date: Sun, 19 Dec 2021 01:55:01 GMT
- Title: Wasserstein Generative Learning of Conditional Distribution
- Authors: Shiao Liu, Xingyu Zhou, Yuling Jiao and Jian Huang
- Abstract summary: We propose a Wasserstein generative approach to learning a conditional distribution.
We establish a non-asymptotic error bound for the conditional sampling distribution generated by the proposed method.
- Score: 6.051520664893158
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conditional distribution is a fundamental quantity for describing the
relationship between a response and a predictor. We propose a Wasserstein
generative approach to learning a conditional distribution. The proposed
approach uses a conditional generator to transform a known distribution to the
target conditional distribution. The conditional generator is estimated by
matching a joint distribution involving the conditional generator and the
target joint distribution, using the Wasserstein distance as the discrepancy
measure for these joint distributions. We establish a non-asymptotic error bound
for the conditional sampling distribution generated by the proposed method and
show that the method can mitigate the curse of dimensionality, assuming the
data distribution is supported on a lower-dimensional set. We conduct numerical
experiments to validate the proposed method and illustrate its applications to
conditional sample generation, nonparametric conditional density estimation,
prediction uncertainty quantification, bivariate response data, image
reconstruction and image generation.
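To make the estimation scheme concrete, here is a minimal PyTorch sketch of the core idea: train a conditional generator G(x, eta) so that the joint law of (X, G(X, eta)) matches the joint law of (X, Y) in Wasserstein-1 distance, using a critic on the joint space via the Kantorovich-Rubinstein dual. The network sizes, optimizer settings, weight clipping, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of Wasserstein generative learning of a conditional
# distribution: match the joint laws of (X, G(X, eta)) and (X, Y)
# with a critic on the joint space. Hyperparameters are assumptions.
import torch
import torch.nn as nn

x_dim, y_dim, noise_dim = 5, 1, 3

def mlp(in_dim, out_dim, hidden=64):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )

G = mlp(x_dim + noise_dim, y_dim)   # conditional generator G(x, eta)
D = mlp(x_dim + y_dim, 1)           # critic on the joint (x, y)
opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)
opt_d = torch.optim.RMSprop(D.parameters(), lr=5e-5)

def sample_batch(n=256):
    # Toy data: Y | X = x is N(sin(x_1), 0.1^2); replace with real data.
    x = torch.randn(n, x_dim)
    y = torch.sin(x[:, :1]) + 0.1 * torch.randn(n, 1)
    return x, y

for step in range(2000):
    # Critic updates: maximize E[D(x, y)] - E[D(x, G(x, eta))].
    for _ in range(5):
        x, y = sample_batch()
        eta = torch.randn(x.size(0), noise_dim)
        y_fake = G(torch.cat([x, eta], dim=1)).detach()
        loss_d = D(torch.cat([x, y_fake], 1)).mean() - D(torch.cat([x, y], 1)).mean()
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        for p in D.parameters():    # crude 1-Lipschitz constraint (weight clipping)
            p.data.clamp_(-0.01, 0.01)
    # Generator update: minimize -E[D(x, G(x, eta))].
    x, _ = sample_batch()
    eta = torch.randn(x.size(0), noise_dim)
    loss_g = -D(torch.cat([x, G(torch.cat([x, eta], 1))], 1)).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Conditional sampling: draw many eta at a fixed x to approximate P(Y | X = x).
x0 = torch.zeros(1000, x_dim)
samples = G(torch.cat([x0, torch.randn(1000, noise_dim)], 1))
```

Repeating the last step over a grid of x values yields conditional samples from which conditional densities, prediction intervals, and other functionals can be estimated.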
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z) - Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
arXiv Detail & Related papers (2024-02-13T09:40:19Z) - Deep conditional distribution learning via conditional Föllmer flow [3.227277661633986]
We introduce an ordinary differential equation (ODE) based deep generative method for learning conditional distributions, named Conditional Föllmer Flow.
For effective implementation, we discretize the flow with Euler's method, estimating the velocity field nonparametrically with a deep neural network (a minimal Euler-step sketch appears after this list).
arXiv Detail & Related papers (2024-02-02T14:52:10Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority over the state of the art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Conditional Stochastic Interpolation for Generative Learning [5.0061421661196865]
We propose a conditional stochastic interpolation (CSI) method for learning conditional distributions.
We derive explicit expressions of the conditional drift and score functions in terms of conditional expectations.
We illustrate the application of CSI on image generation using a benchmark image dataset.
arXiv Detail & Related papers (2023-12-09T13:53:35Z) - Wasserstein Geodesic Generator for Conditional Distributions [25.436269587204293]
We propose a novel conditional generation algorithm where conditional distributions are fully characterized by a metric space defined by a statistical distance.
We employ optimal transport theory to propose the Wasserstein geodesic generator, a new conditional generator that learns the Wasserstein geodesic.
arXiv Detail & Related papers (2023-08-20T03:12:10Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences (a toy AIS sketch appears after this list).
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Flow Away your Differences: Conditional Normalizing Flows as an Improvement to Reweighting [0.0]
We present an alternative to reweighting techniques for modifying distributions to account for a desired change in an underlying conditional distribution.
We employ conditional normalizing flows to learn the full conditional probability distribution.
In our examples, this leads to a statistical precision up to three times greater than using reweighting techniques with identical sample sizes for the source and target distributions.
arXiv Detail & Related papers (2023-04-28T16:33:50Z) - CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
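For the Conditional Föllmer Flow entry above, the following is a schematic Python sketch of the Euler discretization it describes: starting from standard Gaussian noise Z_0, integrate dZ_t = v(t, Z_t, x) dt up to t = 1, where v is a learned velocity field. The `velocity_net` below is a hypothetical stand-in for the paper's estimated network, not its actual model or training procedure.

```python
# Schematic Euler integration of a conditional generative flow ODE:
# push Z_0 ~ N(0, I) along a learned velocity field v(t, z, x).
import torch
import torch.nn as nn

y_dim, x_dim = 1, 5
velocity_net = nn.Sequential(        # placeholder v(t, z, x) -> R^{y_dim}
    nn.Linear(1 + y_dim + x_dim, 64), nn.Tanh(),
    nn.Linear(64, y_dim),
)

@torch.no_grad()
def euler_sample(x, n_steps=100):
    """Push Gaussian noise through the flow conditioned on x."""
    n = x.size(0)
    z = torch.randn(n, y_dim)        # Z_0 ~ N(0, I)
    dt = 1.0 / n_steps
    for k in range(n_steps):
        t = torch.full((n, 1), k * dt)
        v = velocity_net(torch.cat([t, z, x], dim=1))
        z = z + dt * v               # Euler step: Z_{t+dt} = Z_t + v dt
    return z                         # approximate draw from P(Y | X = x)

samples = euler_sample(torch.zeros(500, x_dim))
```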
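For the Annealed Importance Sampling entry above, here is a toy sketch of plain AIS with a geometric annealing path and random-walk Metropolis transitions; the target, step size, and schedule are illustrative assumptions, and this is not the paper's Constant Rate variant.

```python
# Toy Annealed Importance Sampling: interpolate from a tractable prior
# to an unnormalized target and accumulate importance weights.
import numpy as np

rng = np.random.default_rng(0)

def log_prior(z):                    # N(0, 1), tractable start
    return -0.5 * z**2

def log_target(z):                   # unnormalized bimodal target (assumed example)
    return np.logaddexp(-0.5 * (z - 3)**2, -0.5 * (z + 3)**2)

def log_gamma(z, beta):              # annealed density on the geometric path
    return (1 - beta) * log_prior(z) + beta * log_target(z)

n_particles = 1000
betas = np.linspace(0.0, 1.0, 51)
z = rng.standard_normal(n_particles) # exact samples from the prior (beta = 0)
log_w = np.zeros(n_particles)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # Weight update at the current states, then one Metropolis step
    # targeting the new annealed density.
    log_w += log_gamma(z, b) - log_gamma(z, b_prev)
    prop = z + 0.5 * rng.standard_normal(n_particles)
    accept = np.log(rng.random(n_particles)) < log_gamma(prop, b) - log_gamma(z, b)
    z = np.where(accept, prop, z)

# Normalized weights give a weighted sample from the target.
w = np.exp(log_w - log_w.max()); w /= w.sum()
est_mean = np.sum(w * z)
```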