Implicit Normalizing Flows
- URL: http://arxiv.org/abs/2103.09527v1
- Date: Wed, 17 Mar 2021 09:24:04 GMT
- Title: Implicit Normalizing Flows
- Authors: Cheng Lu, Jianfei Chen, Chongxuan Li, Qiuhao Wang, Jun Zhu
- Abstract summary: ImpFlows generalize normalizing flows by allowing the mapping to be implicitly defined by the roots of an equation.
We show that the function space of ImpFlow is strictly richer than that of ResFlows.
We propose a scalable algorithm to train and draw samples from ImpFlows.
- Score: 43.939289514978434
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows define a probability distribution by an explicit invertible
transformation $\boldsymbol{\mathbf{z}}=f(\boldsymbol{\mathbf{x}})$. In this
work, we present implicit normalizing flows (ImpFlows), which generalize
normalizing flows by allowing the mapping to be implicitly defined by the roots
of an equation $F(\boldsymbol{\mathbf{z}}, \boldsymbol{\mathbf{x}})=
\boldsymbol{\mathbf{0}}$. ImpFlows build on residual flows (ResFlows) with a
proper balance between expressiveness and tractability. Through theoretical
analysis, we show that the function space of ImpFlow is strictly richer than
that of ResFlows. Furthermore, for any ResFlow with a fixed number of blocks,
there exists a function that the ResFlow can only approximate with
non-negligible error, yet that function is exactly representable by a
single-block ImpFlow. We propose a scalable algorithm to train and draw samples from
ImpFlows. Empirically, we evaluate ImpFlow on several classification and
density modeling tasks, and ImpFlow outperforms ResFlow with a comparable
amount of parameters on all the benchmarks.
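The core idea can be illustrated with a toy numerical sketch. An ImpFlow block can be built as the composition of an explicit ResFlow step with the inverse of another, i.e. solving F(z, x) = x + g1(x) - z - g2(z) = 0 for z; when g2 has Lipschitz constant below 1, fixed-point iteration converges. The 1-D maps `g1`, `g2` below are hypothetical stand-ins, not the paper's learned networks:

```python
import numpy as np

# Hypothetical contractive residual maps (Lipschitz constant 0.5 < 1).
def g1(x):
    return 0.5 * np.tanh(x)

def g2(z):
    return 0.5 * np.tanh(z)

def impflow_forward(x, n_iters=100, tol=1e-12):
    """Solve F(z, x) = x + g1(x) - z - g2(z) = 0 for z.

    Equivalent to z = h^{-1}(f(x)) with f(x) = x + g1(x) and
    h(z) = z + g2(z); the fixed-point iteration z <- y - g2(z)
    converges because g2 is a contraction.
    """
    y = x + g1(x)            # explicit ResFlow half: y = f(x)
    z = y                    # initial guess
    for _ in range(n_iters):
        z_new = y - g2(z)    # fixed-point update for h(z) = y
        if np.max(np.abs(z_new - z)) < tol:
            break
        z = z_new
    return z

def impflow_inverse(z, n_iters=100, tol=1e-12):
    """Recover x from z by solving f(x) = h(z) the same way."""
    y = z + g2(z)
    x = y
    for _ in range(n_iters):
        x_new = y - g1(x)
        if np.max(np.abs(x_new - x)) < tol:
            break
        x = x_new
    return x

x = np.array([0.3, -1.2, 2.0])
z = impflow_forward(x)
assert np.allclose(impflow_inverse(z), x, atol=1e-8)
```

Note that the forward map itself requires a root-finding solve, unlike an explicit ResFlow; this is the tractability cost traded for the strictly richer function space.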
Related papers
- Variational Flow Matching for Graph Generation [42.3778673162256]
We develop CatFlow, a flow matching method for categorical data.
CatFlow is easy to implement, computationally efficient, and achieves strong results on graph generation tasks.
arXiv Detail & Related papers (2024-06-07T11:16:17Z)
- Boundary-aware Decoupled Flow Networks for Realistic Extreme Rescaling [49.215957313126324]
Recently developed generative methods, including invertible rescaling network (IRN) based and generative adversarial network (GAN) based methods, have demonstrated exceptional performance in image rescaling.
However, IRN-based methods tend to produce over-smoothed results, while GAN-based methods easily generate fake details.
We propose Boundary-aware Decoupled Flow Networks (BDFlow) to generate realistic and visually pleasing results.
arXiv Detail & Related papers (2024-05-05T14:05:33Z)
- PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise [4.762593660623934]
We propose PaddingFlow, a novel dequantization method, which improves normalizing flows with padding-dimensional noise.
We validate our method on the main benchmarks of unconditional density estimation.
The results show that PaddingFlow can perform better in all experiments in this paper.
arXiv Detail & Related papers (2024-03-13T03:28:39Z)
- Expected flow networks in stochastic environments and two-player zero-sum games [63.98522423072093]
Generative flow networks (GFlowNets) are sequential sampling models trained to match a given distribution.
We propose expected flow networks (EFlowNets), which extend GFlowNets to stochastic environments.
We show that EFlowNets outperform other GFlowNet formulations in tasks such as protein design.
We then extend the concept of EFlowNets to adversarial environments, proposing adversarial flow networks (AFlowNets) for two-player zero-sum games.
arXiv Detail & Related papers (2023-10-04T12:50:29Z)
- Towards Understanding and Improving GFlowNet Training [71.85707593318297]
We introduce an efficient evaluation strategy to compare the learned sampling distribution to the target reward distribution.
We propose prioritized replay training of high-reward $x$, relative edge flow policy parametrization, and a novel guided trajectory balance objective.
arXiv Detail & Related papers (2023-05-11T22:50:41Z)
- Better Training of GFlowNets with Local Credit and Incomplete Trajectories [81.14310509871935]
We consider the case where the energy function can be applied not just to terminal states but also to intermediate states.
This is for example achieved when the energy function is additive, with terms available along the trajectory.
This enables a training objective that can be applied to update parameters even with incomplete trajectories.
arXiv Detail & Related papers (2023-02-03T12:19:42Z)
- GFlowNet Foundations [66.69854262276391]
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context.
We show a number of additional theoretical properties of GFlowNets.
arXiv Detail & Related papers (2021-11-17T17:59:54Z)
- ShapeFlow: Dynamic Shape Interpreter for TensorFlow [10.59840927423059]
We present ShapeFlow, a dynamic abstract interpreter for TensorFlow that quickly catches shape incompatibility errors.
ShapeFlow constructs a custom shape computational graph, similar to the computational graph used by the programmer.
We evaluate ShapeFlow on 52 programs collected by prior empirical studies to show how fast and accurately it can catch shape incompatibility errors.
arXiv Detail & Related papers (2020-11-26T19:27:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.