Universal Approximation of Residual Flows in Maximum Mean Discrepancy
- URL: http://arxiv.org/abs/2103.05793v1
- Date: Wed, 10 Mar 2021 00:16:33 GMT
- Title: Universal Approximation of Residual Flows in Maximum Mean Discrepancy
- Authors: Zhifeng Kong, Kamalika Chaudhuri
- Abstract summary: We study residual flows, a class of normalizing flows composed of Lipschitz residual blocks.
We prove residual flows are universal approximators in maximum mean discrepancy.
- Score: 24.493721984271566
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a class of flexible deep generative models that offer
easy likelihood computation. Despite their empirical success, there is little
theoretical understanding of their expressiveness. In this work, we study
residual flows, a class of normalizing flows composed of Lipschitz residual
blocks. We prove residual flows are universal approximators in maximum mean
discrepancy. We provide upper bounds on the number of residual blocks to
achieve approximation under different assumptions.
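To make the setting concrete, below is a minimal sketch (not the paper's code) of the two objects the abstract refers to: a Lipschitz residual block f(x) = x + g(x), with the bound on g enforced here via spectral normalization, and an empirical Gaussian-kernel estimate of the squared maximum mean discrepancy between model samples and data. The layer sizes, contraction coefficient c, and kernel bandwidth sigma are illustrative choices, not values from the paper.

```python
import torch
import torch.nn as nn

class LipschitzResidualBlock(nn.Module):
    """Residual block x -> x + c * g(x). spectral_norm keeps each linear map
    (approximately) 1-Lipschitz and c < 1 makes g a contraction, so the block
    is invertible; stacking such blocks gives a residual flow."""
    def __init__(self, dim: int, hidden: int = 64, c: float = 0.9):
        super().__init__()
        self.c = c
        self.g = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(dim, hidden)),
            nn.ELU(),  # 1-Lipschitz activation
            nn.utils.spectral_norm(nn.Linear(hidden, dim)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.c * self.g(x)

def gaussian_mmd2(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Biased empirical estimate of MMD^2(P, Q) from samples x ~ P and y ~ Q
    under a Gaussian kernel of bandwidth sigma."""
    def k(a, b):
        d2 = torch.cdist(a, b).pow(2)
        return torch.exp(-d2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# Usage: push base samples through a stack of residual blocks and measure how
# far the pushforward distribution is from data samples in MMD.
dim = 2
flow = nn.Sequential(*[LipschitzResidualBlock(dim) for _ in range(4)])
z = torch.randn(256, dim)           # base (standard normal) samples
data = torch.randn(256, dim) + 3.0  # stand-in for target data
print(gaussian_mmd2(flow(z), data).item())
```

The paper's universality result concerns exactly this quantity: its bounds state how many such residual blocks suffice, under different assumptions, for the pushforward of the base distribution to approximate a target distribution in maximum mean discrepancy.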
Related papers
- On the Universality of Coupling-based Normalizing Flows [10.479969050570684]
We propose a distributional universality theorem for well-conditioned coupling-based normalizing flows such as RealNVP.
We show that volume-preserving normalizing flows are not universal, identify the distribution they learn instead, and show how to fix their expressivity; a minimal coupling-layer sketch is given after this list.
arXiv Detail & Related papers (2024-02-09T17:51:43Z)
- Proximal Residual Flows for Bayesian Inverse Problems [0.0]
We introduce proximal residual flows, a new architecture of normalizing flows.
We ensure invertibility of certain residual blocks and extend the architecture to conditional residual flows.
We demonstrate the performance of proximal residual flows on numerical examples.
arXiv Detail & Related papers (2022-11-30T16:49:49Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximization of the log-likelihood and optimization of the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- On the expressivity of bi-Lipschitz normalizing flows [49.92565116246822]
An invertible function is bi-Lipschitz if both the function and its inverse have bounded Lipschitz constants.
Most Normalizing Flows are bi-Lipschitz by design or by training to limit numerical errors.
arXiv Detail & Related papers (2021-07-15T10:13:46Z)
- Self Normalizing Flows [65.73510214694987]
We propose a flexible framework for training normalizing flows by replacing expensive terms in the gradient by learned approximate inverses at each layer.
This reduces the computational complexity of each layer's exact update from $\mathcal{O}(D^3)$ to $\mathcal{O}(D^2)$.
We show experimentally that such models are remarkably stable and optimize to similar data likelihood values as their exact gradient counterparts.
arXiv Detail & Related papers (2020-11-14T09:51:51Z)
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
- Quasi-Autoregressive Residual (QuAR) Flows [0.0]
We introduce a simplification to residual flows using a Quasi-Autoregressive (QuAR) approach.
Compared to the standard residual flow approach, this simplification retains many of the benefits of residual flows while dramatically reducing the compute time and memory requirements.
arXiv Detail & Related papers (2020-09-16T01:56:24Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Composing Normalizing Flows for Inverse Problems [89.06155049265641]
We propose a framework for approximate inference that estimates the target conditional as a composition of two flow models.
Our method is evaluated on a variety of inverse problems and is shown to produce high-quality samples with uncertainty estimates.
arXiv Detail & Related papers (2020-02-26T19:01:11Z)
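For context on the first related paper above ("On the Universality of Coupling-based Normalizing Flows"), here is a minimal sketch of a RealNVP-style affine coupling layer, assuming a standard formulation rather than code from any listed paper. The layer transforms half of the coordinates conditioned on the other half; its Jacobian is triangular, so the log-determinant is just the sum of the log-scales.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling layer: keep x1 fixed, map x2 -> x2 * exp(s(x1)) + t(x1)."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.d = dim // 2
        # Small conditioner network producing log-scale and shift for the second half.
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x: torch.Tensor):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)        # keep the scales well-conditioned
        y2 = x2 * torch.exp(log_s) + t
        log_det = log_s.sum(dim=1)       # log |det J|, from the triangular Jacobian
        return torch.cat([x1, y2], dim=1), log_det

x = torch.randn(8, 4)
layer = AffineCoupling(4)
y, log_det = layer(x)
print(y.shape, log_det.shape)
```

Fixing log_s to zero makes the layer volume-preserving (log_det = 0), which is the restricted class that paper shows is not universal.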
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.