Wasserstein Generative Learning of Conditional Distribution
- URL: http://arxiv.org/abs/2112.10039v1
- Date: Sun, 19 Dec 2021 01:55:01 GMT
- Title: Wasserstein Generative Learning of Conditional Distribution
- Authors: Shiao Liu, Xingyu Zhou, Yuling Jiao and Jian Huang
- Abstract summary: We propose a Wasserstein generative approach to learning a conditional distribution.
We establish a non-asymptotic error bound on the conditional sampling distribution generated by the proposed method.
- Score: 6.051520664893158
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conditional distribution is a fundamental quantity for describing the
relationship between a response and a predictor. We propose a Wasserstein
generative approach to learning a conditional distribution. The proposed
approach uses a conditional generator to transform a known distribution to the
target conditional distribution. The conditional generator is estimated by
matching a joint distribution involving the conditional generator and the
target joint distribution, using the Wasserstein distance as the discrepancy
measure for these joint distributions. We establish a non-asymptotic error bound
on the conditional sampling distribution generated by the proposed method and
show that it mitigates the curse of dimensionality, assuming that the
data distribution is supported on a lower-dimensional set. We conduct numerical
experiments to validate the proposed method and illustrate its applications to
conditional sample generation, nonparametric conditional density estimation,
prediction uncertainty quantification, bivariate response data, image
reconstruction and image generation.
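The core idea in the abstract, a conditional generator G(x, eta) estimated by matching the generated joint (X, G(X, eta)) to the target joint (X, Y) under a Wasserstein-type discrepancy, can be sketched numerically. The sketch below is an illustrative assumption, not the authors' estimator: it uses a toy linear generator, a coarse grid search in place of adversarial training, and a sliced Wasserstein-1 distance as a cheap stand-in for the Wasserstein distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: Y | X ~ N(2*X, 0.5^2), so the true generator is 2*x + 0.5*eta.
n = 4000
X = rng.normal(size=n)
Y = 2.0 * X + 0.5 * rng.normal(size=n)

# Candidate conditional generator G(x, eta) = a*x + b*eta, eta ~ N(0, 1);
# (a, b) are the parameters to estimate by joint-distribution matching.
eta = rng.normal(size=n)
target = np.column_stack([X, Y])  # samples from the target joint (X, Y)

# Fixed random projection directions for a sliced Wasserstein-1 distance.
thetas = rng.normal(size=(40, 2))
thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)

def sliced_w1(P, Q):
    """Average 1-D Wasserstein-1 distance over the fixed projections."""
    d = 0.0
    for theta in thetas:
        d += np.mean(np.abs(np.sort(P @ theta) - np.sort(Q @ theta)))
    return d / len(thetas)

# Match the generated joint (X, G(X, eta)) to the target joint.
best = (np.inf, None, None)
for a in np.linspace(1.0, 3.0, 21):
    for b in np.linspace(0.1, 1.0, 10):
        fake = np.column_stack([X, a * X + b * eta])
        d = sliced_w1(target, fake)
        if d < best[0]:
            best = (d, a, b)

_, a_hat, b_hat = best
print(f"a_hat={a_hat:.2f}, b_hat={b_hat:.2f}")  # compare with the true (2.0, 0.5)
```

In 1-D a Wasserstein-1 distance reduces to comparing sorted samples, which is why the sliced (random-projection) version is so easy to compute; the paper itself works with the full Wasserstein distance and neural-network generators.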
Related papers
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, the theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
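The local-linearization idea can be made concrete with a minimal moment-propagation sketch: a Gaussian passes through a linear layer exactly, and through a ReLU by linearizing at the mean, whose local Jacobian is a 0/1 mask. The two-layer network and its weights below are made up for illustration; the paper's method and its total-variation optimality result are more refined.

```python
import numpy as np

# Illustrative fixed weights for a two-layer network (not from the paper).
W1 = np.array([[1., 0., 0.],
               [0., 1., 0.],
               [0., 0., 1.],
               [1., 1., 1.]])
b1 = np.zeros(4)
W2 = np.array([[1., 1., 1., 1.],
               [1., -1., 1., -1.]])
b2 = np.zeros(2)

# Input distribution N(mu, Sigma).
mu = np.array([0.5, -1.0, 2.0])
Sigma = 0.1 * np.eye(3)

# Linear layer: a Gaussian maps to a Gaussian exactly.
m1 = W1 @ mu + b1
S1 = W1 @ Sigma @ W1.T

# ReLU, linearized at the mean: the local Jacobian is diag(m1 > 0), so the
# mean passes through the ReLU and the covariance is masked on both sides.
mask = (m1 > 0).astype(float)
m1 = np.maximum(m1, 0.0)
S1 = (mask[:, None] * mask[None, :]) * S1

# Second linear layer, again exact.
m2 = W2 @ m1 + b2
S2 = W2 @ S1 @ W2.T
print(m2, np.diag(S2))  # m2 = [4.0, 1.0] for these weights
```

A Monte Carlo push-forward of the same Gaussian through the network agrees closely with the linearized mean here because the input covariance is small relative to the distance of the pre-activations from the ReLU kink.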
arXiv Detail & Related papers (2024-02-13T09:40:19Z)
- Deep conditional distribution learning via conditional Föllmer flow [3.227277661633986]
We introduce an ordinary differential equation (ODE) based deep generative method for learning conditional distributions, named Conditional Föllmer Flow.
For effective implementation, we discretize the flow with Euler's method where we estimate the velocity field nonparametrically using a deep neural network.
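The Euler-discretization step can be illustrated with a hedged sketch that integrates a conditional flow with the forward Euler method. Since the paper's velocity field is a nonparametric neural-network estimate, the sketch substitutes a closed-form field that transports N(0, 1) to the Gaussian target Y | X = x ~ N(2x, 0.5^2); the field and the target are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

def velocity(t, y, m, s):
    # Closed-form field whose flow map is y(t) = (1 + t*(s-1))*z + t*m with
    # y(0) = z, so y(1) = s*z + m ~ N(m, s^2).  Recover z from (t, y), then
    # return dy/dt = (s-1)*z + m.
    z = (y - t * m) / (1.0 + t * (s - 1.0))
    return (s - 1.0) * z + m

def sample_conditional(x, n=5000, steps=100):
    m, s = 2.0 * x, 0.5        # conditional mean and std given X = x
    y = rng.normal(size=n)     # y(0) ~ N(0, 1), the reference distribution
    dt = 1.0 / steps
    for k in range(steps):     # forward Euler: y <- y + v(t, y) * dt
        t = k * dt
        y = y + velocity(t, y, m, s) * dt
    return y

samples = sample_conditional(x=1.5)
print(samples.mean(), samples.std())  # approx. 3.0 and 0.5
```

Because this particular flow is linear in t along each trajectory, forward Euler integrates it without discretization error; for a learned velocity field the step size trades off accuracy against cost.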
arXiv Detail & Related papers (2024-02-02T14:52:10Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Wasserstein Geodesic Generator for Conditional Distributions [25.436269587204293]
We propose a novel conditional generation algorithm where conditional distributions are fully characterized by a metric space defined by a statistical distance.
We employ optimal transport theory to propose the Wasserstein geodesic generator, a new conditional generator that learns the Wasserstein geodesic.
arXiv Detail & Related papers (2023-08-20T03:12:10Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Flow Away your Differences: Conditional Normalizing Flows as an Improvement to Reweighting [0.0]
We present an alternative to reweighting techniques for modifying distributions to account for a desired change in an underlying conditional distribution.
We employ conditional normalizing flows to learn the full conditional probability distribution.
In our examples, this leads to a statistical precision up to three times greater than using reweighting techniques with identical sample sizes for the source and target distributions.
arXiv Detail & Related papers (2023-04-28T16:33:50Z)
- Continuous and Distribution-free Probabilistic Wind Power Forecasting: A Conditional Normalizing Flow Approach [1.684864188596015]
We present a data-driven approach for probabilistic wind power forecasting based on conditional normalizing flows (CNF).
In contrast with existing methods, this approach is distribution-free (as with non-parametric and quantile-based approaches) and can directly yield continuous probability densities.
arXiv Detail & Related papers (2022-06-06T08:48:58Z)
- Adversarial sampling of unknown and high-dimensional conditional distributions [0.0]
In this paper, both the sampling method and the inference of the underlying distribution are handled with a data-driven method known as generative adversarial networks (GANs).
A GAN trains two competing neural networks to produce a network that can effectively generate samples from the training set distribution.
It is shown that all the versions of the proposed algorithm effectively sample the target conditional distribution with minimal impact on the quality of the samples.
arXiv Detail & Related papers (2021-11-08T12:23:38Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.