An Empirical Comparison of GANs and Normalizing Flows for Density
Estimation
- URL: http://arxiv.org/abs/2006.10175v2
- Date: Tue, 14 Dec 2021 16:27:31 GMT
- Title: An Empirical Comparison of GANs and Normalizing Flows for Density
Estimation
- Authors: Tianci Liu, Jeffrey Regier
- Abstract summary: Generative adversarial networks (GANs) and normalizing flows are approaches to density estimation that use deep neural networks.
GANs and normalizing flows have seldom been compared to each other for modeling non-image data.
No GAN is capable of modeling our simple low-dimensional data well, a task we view as a prerequisite for an approach to be considered suitable for general-purpose statistical modeling.
- Score: 5.837881923712393
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative adversarial networks (GANs) and normalizing flows are both
approaches to density estimation that use deep neural networks to transform
samples from an uninformative prior distribution to an approximation of the
data distribution. There is great interest in both for general-purpose
statistical modeling, but the two approaches have seldom been compared to each
other for modeling non-image data. The difficulty of computing likelihoods with
GANs, which are implicit models, makes conducting such a comparison
challenging. We work around this difficulty by considering several
low-dimensional synthetic datasets. An extensive grid search over GAN
architectures, hyperparameters, and training procedures suggests that no GAN is
capable of modeling our simple low-dimensional data well, a task we view as a
prerequisite for an approach to be considered suitable for general-purpose
statistical modeling. Several normalizing flows, on the other hand, excelled at
these tasks, even substantially outperforming WGAN in terms of Wasserstein
distance -- the metric that WGAN alone targets. Scientists and other
practitioners should be wary of relying on WGAN for applications that require
accurate density estimation.
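The contrast the abstract draws, exact likelihoods from flows versus sample-only access from GANs, can be made concrete on a 1-D toy problem. The sketch below is illustrative only: the affine flow, the target distribution, and the sample sizes are our own choices, not the paper's experimental setup. It evaluates a flow's exact log-density via the change-of-variables formula and compares samples to data with the 1-Wasserstein distance, the metric the abstract highlights.

```python
import numpy as np
from scipy.stats import norm, wasserstein_distance

rng = np.random.default_rng(0)

# Toy target: a 1-D Gaussian with nonzero mean and scale (illustrative).
data = rng.normal(loc=2.0, scale=0.5, size=5000)

# A normalizing flow pushes prior samples z ~ N(0, 1) through an invertible
# map f; here a single affine layer x = a*z + b, whose exact log-density
# follows from the change-of-variables formula:
#   log p_x(x) = log p_z(f^{-1}(x)) - log |a|
a, b = 0.5, 2.0  # flow parameters chosen to match the target exactly
def flow_log_density(x):
    z = (x - b) / a
    return norm.logpdf(z) - np.log(abs(a))

# A GAN generator is also a map from z to x, but it is an implicit model:
# it yields samples, not log-densities, which is why sample-based metrics
# such as the Wasserstein distance are needed for the comparison.
flow_samples = a * rng.standard_normal(5000) + b
w = wasserstein_distance(data, flow_samples)
print(f"1-Wasserstein distance to data: {w:.3f}")
```

Because the affine flow here matches the target exactly, the measured distance reflects only finite-sample noise.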
Related papers
- Savage-Dickey density ratio estimation with normalizing flows for Bayesian model comparison [4.232577149837663]
We use the Savage-Dickey density ratio to calculate the Bayes factor (evidence ratio) between two nested models. We introduce a neural SDDR approach using normalizing flows that can scale to settings where the super model contains a large number of extra parameters. For a field-level inference setting, we show that Bayes factors computed for a Bayesian hierarchical model and a simulation-based inference (SBI) approach are consistent.
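As a toy illustration of the Savage-Dickey identity this entry builds on (not the paper's neural-flow setup; the conjugate normal model and all numbers below are our own), the Bayes factor for a nested model can be read off as the ratio of posterior to prior density at the nested parameter value:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Hypothetical nested models: M0 fixes theta = 0; M1 puts a N(0, tau^2)
# prior on theta. Observations are x_i ~ N(theta, sigma^2).
tau, sigma, n = 1.0, 1.0, 50
data = rng.normal(loc=0.0, scale=sigma, size=n)  # generated under M0

# Conjugate posterior for theta under M1 (standard normal-normal update).
post_var = 1.0 / (1.0 / tau**2 + n / sigma**2)
post_mean = post_var * data.sum() / sigma**2

# Savage-Dickey density ratio: the Bayes factor BF_01 equals the posterior
# density at the nested value divided by the prior density at that value.
bf_01 = norm.pdf(0.0, loc=post_mean, scale=np.sqrt(post_var)) / norm.pdf(0.0, scale=tau)
print(f"BF_01 = {bf_01:.2f}")
```

The identity holds exactly here because the posterior is available in closed form; the paper's contribution is estimating the same ratio with normalizing flows when it is not.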
arXiv Detail & Related papers (2025-06-04T18:00:24Z)
- On the Statistical Properties of Generative Adversarial Models for Low Intrinsic Data Dimension [38.964624328622]
We derive statistical guarantees on the estimated densities in terms of the intrinsic dimension of the data and the latent space.
We demonstrate that GANs can effectively achieve the minimax optimal rate even for non-smooth underlying distributions.
arXiv Detail & Related papers (2024-01-28T23:18:10Z)
- Discrete Diffusion Modeling by Estimating the Ratios of the Data Distribution [67.9215891673174]
We propose score entropy as a novel loss that naturally extends score matching to discrete spaces.
We test our Score Entropy Discrete Diffusion models on standard language modeling tasks.
arXiv Detail & Related papers (2023-10-25T17:59:12Z)
- RGM: A Robust Generalizable Matching Model [49.60975442871967]
We propose a deep model for sparse and dense matching, termed RGM (Robust Generalist Matching).
To narrow the gap between synthetic training samples and real-world scenarios, we build a new, large-scale dataset with sparse correspondence ground truth.
We are able to mix up various dense and sparse matching datasets, significantly improving the training diversity.
arXiv Detail & Related papers (2023-10-18T07:30:08Z)
- Adversarial Likelihood Estimation With One-Way Flows [44.684952377918904]
Generative Adversarial Networks (GANs) can produce high-quality samples, but do not provide an estimate of the probability density around the samples.
We show that our method converges faster, produces comparable sample quality to GANs with similar architecture, successfully avoids over-fitting to commonly used datasets and produces smooth low-dimensional latent representations of the training data.
arXiv Detail & Related papers (2023-07-19T10:26:29Z)
- How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z)
- Smooth densities and generative modeling with unsupervised random forests [1.433758865948252]
An important application for density estimators is synthetic data generation.
We propose a new method based on unsupervised random forests for estimating smooth densities in arbitrary dimensions without parametric constraints.
We prove the consistency of our approach and demonstrate its advantages over existing tree-based density estimators.
arXiv Detail & Related papers (2022-05-19T09:50:25Z)
- Latent Space Model for Higher-order Networks and Generalized Tensor Decomposition [18.07071669486882]
We introduce a unified framework, formulated as general latent space models, to study complex higher-order network interactions.
We formulate the relationship between the latent positions and the observed data via a generalized multilinear kernel as the link function.
We demonstrate the effectiveness of our method on synthetic data.
arXiv Detail & Related papers (2021-06-30T13:11:17Z)
- HGAN: Hybrid Generative Adversarial Network [25.940501417539416]
We propose a hybrid generative adversarial network (HGAN) for which we can enforce data density estimation via an autoregressive model.
A novel deep architecture within the GAN formulation is developed to adversarially distill the autoregressive model's information in addition to the standard GAN training approach.
arXiv Detail & Related papers (2021-02-07T03:54:12Z)
- Generative Adversarial Networks (GANs): An Overview of Theoretical Model, Evaluation Metrics, and Recent Developments [9.023847175654602]
A Generative Adversarial Network (GAN) is an effective method for producing samples from a large-scale data distribution.
GANs provide an appropriate way to learn deep representations without widespread use of labeled training data.
In GANs, the generative model is estimated via a competitive process where the generator and discriminator networks are trained simultaneously.
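The competitive process this overview describes can be sketched in a few lines of numpy (our own illustrative setup, not from any paper above): a logistic discriminator is trained against samples from a fixed, untrained generator. At the uninformative starting point the discriminator outputs 0.5 everywhere and its loss sits at the equilibrium value 2·log 2; gradient steps then drive the loss down as it learns to separate real from fake.

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

# Hypothetical toy data: real samples from N(3, 1); the "generator" maps
# prior noise z ~ N(0, 1) through G(z) = z, so its fakes sit near 0.
real = rng.normal(3.0, 1.0, size=1000)
fake = rng.standard_normal(1000)

# Logistic discriminator D(x) = sigmoid(w*x + c). The GAN value function is
#   V(D, G) = E_real[log D(x)] + E_fake[log(1 - D(G(z)))]
# and training alternates: D ascends V while G descends it. Here we train
# only D, holding the generator fixed, to show one half of the game.
w, c = 0.0, 0.0
def d_loss(w, c):
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    return -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))

loss0 = d_loss(w, c)  # with w = c = 0, D outputs 0.5 and the loss is 2*log(2)

# A few manual gradient-descent steps on the discriminator loss.
for _ in range(100):
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = -np.mean((1 - d_real) * real) + np.mean(d_fake * fake)
    grad_c = -np.mean(1 - d_real) + np.mean(d_fake)
    w, c = w - 0.1 * grad_w, c - 0.1 * grad_c

print(f"D loss before: {loss0:.4f}, after: {d_loss(w, c):.4f}")
```

In a full GAN, the generator would take its own gradient step after each discriminator update, pulling the fake samples toward the real distribution.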
arXiv Detail & Related papers (2020-05-27T05:56:53Z)
- TraDE: Transformers for Density Estimation [101.20137732920718]
TraDE is a self-attention-based architecture for auto-regressive density estimation.
We present a suite of tasks such as regression using generated samples, out-of-distribution detection, and robustness to noise in the training data.
arXiv Detail & Related papers (2020-04-06T07:32:51Z)
- Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling by Exploring Energy of the Discriminator [85.68825725223873]
Generative Adversarial Networks (GANs) have shown great promise in modeling high dimensional data.
We introduce the Discriminator Contrastive Divergence, which is well motivated by the property of WGAN's discriminator.
We demonstrate the benefits of significantly improved generation on both synthetic data and several real-world image generation benchmarks.
arXiv Detail & Related papers (2020-04-05T01:50:16Z)
- GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.