Jacobian Determinant of Normalizing Flows
- URL: http://arxiv.org/abs/2102.06539v1
- Date: Fri, 12 Feb 2021 14:09:28 GMT
- Title: Jacobian Determinant of Normalizing Flows
- Authors: Huadong Liao and Jiawei He
- Abstract summary: Normalizing flows learn a diffeomorphic mapping between the target and base distributions; the Jacobian determinant of that mapping forms another real-valued function. To stabilize normalizing-flow training, a balance must be maintained between volume expansion and contraction.
- Score: 7.124391555099448
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Normalizing flows learn a diffeomorphic mapping between the target and base
distributions, while the Jacobian determinant of that mapping forms another
real-valued function. In this paper, we show that the Jacobian determinant
mapping is unique for the given distributions, hence the likelihood objective
of flows has a unique global optimum. In particular, the likelihood for a class
of flows is explicitly expressed by the eigenvalues of the auto-correlation
matrix of each individual data point, and is independent of the parameterization
of the neural network, which provides a theoretical optimal value of the
likelihood objective and relates to probabilistic PCA. Additionally, the Jacobian
determinant is a measure of local volume change and is maximized when MLE is
used for optimization. To stabilize normalizing-flow training, a balance must
be maintained between volume expansion and contraction, which amounts to a
Lipschitz constraint on the diffeomorphic mapping and its inverse. With these
theoretical results, we propose several principles for designing normalizing
flows, and numerical experiments on high-dimensional datasets (such as
CelebA-HQ 1024x1024) demonstrate the improved stability of training.
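The change-of-variables relation underlying the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's method: a hypothetical invertible linear map `W` plays the role of a flow with a standard-normal base distribution, and the log|det J| term accounts for the local volume change discussed above.

```python
import numpy as np

def flow_log_likelihood(x, W):
    """log p(x) = log N(Wx; 0, I) + log|det W| for an invertible linear flow z = Wx."""
    z = W @ x
    d = z.shape[0]
    log_base = -0.5 * (d * np.log(2 * np.pi) + z @ z)  # standard-normal log-density of z
    sign, logdet = np.linalg.slogdet(W)                # log|det J| of the mapping
    return log_base + logdet

# Illustrative values: a near-identity W keeps volume expansion and
# contraction balanced, the stability condition the paper emphasizes.
rng = np.random.default_rng(0)
W = np.eye(3) + 0.1 * rng.standard_normal((3, 3))
x = rng.standard_normal(3)
ll = flow_log_likelihood(x, W)
```

For this linear flow the induced density on x is exactly Gaussian with precision matrix W^T W, so the sketch can be checked against the closed-form multivariate normal log-density.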
Related papers
- Injective flows for star-like manifolds [1.4623202528810306]
We show that we can compute the Jacobian determinant exactly and efficiently, with the same cost as NFs, for star-like manifold densities.
This is particularly relevant for variational inference settings, where no samples are available and only some unnormalized target is known.
arXiv Detail & Related papers (2024-06-13T13:43:59Z)
- Convex Parameter Estimation of Perturbed Multivariate Generalized Gaussian Distributions [18.95928707619676]
We propose a convex formulation with well-established properties for MGGD parameters.
The proposed framework is flexible as it combines a variety of regularizations for the precision matrix, the mean and perturbations.
Experiments show a more accurate precision and covariance matrix estimation with similar performance for the mean vector parameter.
arXiv Detail & Related papers (2023-12-12T18:08:04Z)
- Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z)
- Variational Microcanonical Estimator [0.0]
We propose a variational quantum algorithm for estimating microcanonical expectation values in models obeying the eigenstate thermalization hypothesis.
An ensemble of variational states is then used to estimate microcanonical averages of local operators.
arXiv Detail & Related papers (2023-01-10T18:53:24Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
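As a rough illustration of what a region-probability (CDF-style) estimate looks like, here is a plain Monte Carlo sketch. This is not the estimator from the paper; the standard-normal sampler below is a stand-in for draws from a trained flow, and all names and values are illustrative.

```python
import numpy as np

def mc_region_probability(sample_fn, lo, hi, n=100_000, seed=0):
    """Estimate P(lo <= X <= hi componentwise) from n samples of X."""
    rng = np.random.default_rng(seed)
    xs = sample_fn(rng, n)                         # (n, d) array of samples
    inside = np.all((xs >= lo) & (xs <= hi), axis=1)
    return inside.mean()                           # fraction of samples in the box

# Stand-in sampler: a 2-D standard normal instead of a trained flow.
sample_standard_normal = lambda rng, n: rng.standard_normal((n, 2))
p = mc_region_probability(sample_standard_normal,
                          np.array([-1.0, -1.0]), np.array([1.0, 1.0]))
```

The paper's point is precisely that such naive sampling estimators are sample-inefficient compared to exploiting the flow's diffeomorphic structure.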
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Optimal regularizations for data generation with probabilistic graphical models [0.0]
Empirically, well-chosen regularization schemes dramatically improve the quality of the inferred models.
We consider the particular case of L2 and L1 regularizations in the Maximum A Posteriori (MAP) inference of generative pairwise graphical models.
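To make the role of the regularizer concrete, here is a hedged one-parameter sketch of MAP estimation with an L2 (Gaussian-prior) penalty. It is an illustration only, not the convex formulation studied in the paper; the data and `lam` values are made up.

```python
import numpy as np

def map_mean_l2(x, lam):
    """argmin_mu  0.5*sum((x - mu)**2) + 0.5*lam*mu**2, in closed form.

    The Gaussian likelihood pulls mu toward the sample mean; the L2
    prior shrinks it toward zero with strength lam.
    """
    return x.sum() / (len(x) + lam)

x = np.array([1.0, 2.0, 3.0])
mle = map_mean_l2(x, 0.0)   # lam=0 recovers the MLE (sample mean): 2.0
map_ = map_mean_l2(x, 3.0)  # a positive lam shrinks the estimate toward 0
```

An L1 penalty would instead yield a soft-thresholded estimate; the paper asks how such choices should be tuned for generative quality.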
arXiv Detail & Related papers (2021-12-02T14:45:16Z)
- Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.