Normalizing Flows are Capable Generative Models
- URL: http://arxiv.org/abs/2412.06329v2
- Date: Tue, 10 Dec 2024 03:19:52 GMT
- Title: Normalizing Flows are Capable Generative Models
- Authors: Shuangfei Zhai, Ruixiang Zhang, Preetum Nakkiran, David Berthelot, Jiatao Gu, Huangjie Zheng, Tianrong Chen, Miguel Angel Bautista, Navdeep Jaitly, Josh Susskind
- Abstract summary: TarFlow is a simple and scalable architecture that enables highly performant NF models.
It is straightforward to train end-to-end, and capable of directly modeling and generating pixels.
TarFlow sets new state-of-the-art results on likelihood estimation for images, beating the previous best methods by a large margin.
- Score: 48.31226028595099
- Abstract: Normalizing Flows (NFs) are likelihood-based models for continuous inputs. They have demonstrated promising results on both density estimation and generative modeling tasks, but have received relatively little attention in recent years. In this work, we demonstrate that NFs are more powerful than previously believed. We present TarFlow: a simple and scalable architecture that enables highly performant NF models. TarFlow can be thought of as a Transformer-based variant of Masked Autoregressive Flows (MAFs): it consists of a stack of autoregressive Transformer blocks on image patches, alternating the autoregression direction between layers. TarFlow is straightforward to train end-to-end, and capable of directly modeling and generating pixels. We also propose three key techniques to improve sample quality: Gaussian noise augmentation during training, a post training denoising procedure, and an effective guidance method for both class-conditional and unconditional settings. Putting these together, TarFlow sets new state-of-the-art results on likelihood estimation for images, beating the previous best methods by a large margin, and generates samples with quality and diversity comparable to diffusion models, for the first time with a stand-alone NF model. We make our code available at https://github.com/apple/ml-tarflow.
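The block structure described in the abstract admits a compact illustration. Below is a minimal, hypothetical sketch of one TarFlow-style block (placeholder names such as TarFlowBlock; not the authors' implementation): a Masked Autoregressive Flow step whose per-patch shift and scale are produced by a causally masked Transformer, so that each latent z_t depends only on earlier patches x_<t. Training maximizes log N(z; 0, I) plus the log-determinant returned below, per the change-of-variables formula.

```python
# A minimal, hypothetical sketch of one TarFlow-style block (placeholder
# names; not the authors' implementation): a Masked Autoregressive Flow
# step whose per-patch shift and scale come from a causally masked
# Transformer, so z_t depends only on patches x_<t.
import torch
import torch.nn as nn

class TarFlowBlock(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # dim must be divisible by num_heads.
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.to_shift_scale = nn.Linear(dim, 2 * dim)

    def forward(self, x: torch.Tensor):
        # x: (batch, num_patches, dim) flattened image patches.
        B, T, D = x.shape
        # Shift the sequence right so position t sees only patches < t,
        # and mask attention to keep the map autoregressive.
        h = torch.cat([x.new_zeros(B, 1, D), x[:, :-1]], dim=1)
        causal = nn.Transformer.generate_square_subsequent_mask(T)
        h = self.transformer(h, mask=causal)
        mu, alpha = self.to_shift_scale(h).chunk(2, dim=-1)
        z = (x - mu) * torch.exp(-alpha)   # invertible affine update
        log_det = -alpha.sum(dim=(1, 2))   # log|det dz/dx| per sample
        return z, log_det
```

Inverting such a block recovers patches sequentially (x_t = z_t * exp(alpha_t) + mu_t), which is why sampling runs autoregressively; stacking several blocks and reversing the patch order between them (e.g. z.flip(1)) gives the alternating autoregression direction the abstract describes.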
Related papers
- Derivative-Free Guidance in Continuous and Discrete Diffusion Models with Soft Value-Based Decoding [84.3224556294803]
Diffusion models excel at capturing the natural design spaces of images, molecules, DNA, RNA, and protein sequences.
We aim to optimize downstream reward functions while preserving the naturalness of these design spaces.
Our algorithm integrates soft value functions, which look ahead to how intermediate noisy states lead to high rewards in the future.
arXiv Detail & Related papers (2024-08-15T16:47:59Z) - Boundary-aware Decoupled Flow Networks for Realistic Extreme Rescaling [49.215957313126324]
Recently developed generative methods, including invertible rescaling network (IRN) based and generative adversarial network (GAN) based methods, have demonstrated exceptional performance in image rescaling.
However, IRN-based methods tend to produce over-smoothed results, while GAN-based methods easily generate fake details.
We propose Boundary-aware Decoupled Flow Networks (BDFlow) to generate realistic and visually pleasing results.
arXiv Detail & Related papers (2024-05-05T14:05:33Z) - PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise [4.762593660623934]
We propose PaddingFlow, a novel dequantization method, which improves normalizing flows with padding-dimensional noise.
We validate our method on the main benchmarks of unconditional density estimation.
The results show that PaddingFlow performs better across all experiments in this paper.
arXiv Detail & Related papers (2024-03-13T03:28:39Z) - Poisson flow consistency models for low-dose CT image denoising [3.6218104434936658]
We introduce a novel image denoising technique which combines the flexibility afforded by Poisson flow generative models (PFGM++) with the high-quality, single-step sampling of consistency models.
Our results indicate that the added flexibility of tuning the hyperparameter D, the dimensionality of the augmentation variables in PFGM++, allows us to outperform consistency models.
arXiv Detail & Related papers (2024-02-13T01:39:56Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
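(A minimal sketch of the basic CFM regression objective appears after this list.)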
arXiv Detail & Related papers (2023-02-01T14:47:17Z) - SoftFlow: Probabilistic Framework for Normalizing Flow on Manifolds [15.476426879806134]
Flow-based generative models are composed of invertible transformations between two random variables of the same dimension.
In this paper, we propose SoftFlow, a probabilistic framework for training normalizing flows on manifolds.
We experimentally show that SoftFlow can capture the innate structure of the manifold data and generate high-quality samples.
We apply the proposed framework to 3D point clouds to alleviate the difficulty of forming thin structures for flow-based models.
arXiv Detail & Related papers (2020-06-08T13:56:07Z) - Normalizing Flows with Multi-Scale Autoregressive Priors [131.895570212956]
We introduce channel-wise dependencies in the latent space of flow-based models through multi-scale autoregressive priors (mAR).
Our mAR prior for models with split coupling flow layers (mAR-SCF) can better capture dependencies in complex multimodal data.
We show that mAR-SCF allows for improved image generation quality, with gains in FID and Inception scores compared to state-of-the-art flow-based models.
arXiv Detail & Related papers (2020-04-08T09:07:11Z) - Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow [16.41460104376002]
We introduce subset flows, a class of flows that can transform finite volumes and allow exact computation of likelihoods for discrete data.
We identify ordinal discrete autoregressive models, including WaveNets, PixelCNNs and Transformers, as single-layer flows.
We demonstrate state-of-the-art results on CIFAR-10 for flow models trained with dequantization.
arXiv Detail & Related papers (2020-02-06T22:58:51Z)
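As flagged in the OT-CFM entry above, the CFM regression objective is simple enough to sketch. The following is a minimal, hypothetical training loss (velocity_net is a placeholder for any network mapping (x_t, t) to a velocity; an illustration of the basic technique, not the paper's code): it regresses the network onto the constant velocity x1 - x0 of the straight-line path x_t = (1 - t) * x0 + t * x1.

```python
# A minimal, hypothetical conditional flow matching (CFM) loss
# (`velocity_net` is a placeholder, not the paper's code): regress
# v_theta(x_t, t) onto the constant velocity (x1 - x0) of the linear
# interpolant x_t = (1 - t) * x0 + t * x1.
import torch

def cfm_loss(velocity_net, x1: torch.Tensor) -> torch.Tensor:
    x0 = torch.randn_like(x1)            # noise endpoint of the path
    t = torch.rand(x1.shape[0], 1)       # per-sample time in [0, 1]
    x_t = (1 - t) * x0 + t * x1          # point along the straight path
    target = x1 - x0                     # the path's constant velocity
    pred = velocity_net(x_t, t)
    return ((pred - target) ** 2).mean() # mean-squared regression error
```

OT-CFM, the variant named in that entry, replaces the independent pairing of noise x0 with data x1 by a minibatch optimal transport coupling, which straightens the learned paths and speeds up inference.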
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.