Autoregressive Diffusion Models
- URL: http://arxiv.org/abs/2110.02037v1
- Date: Tue, 5 Oct 2021 13:36:55 GMT
- Title: Autoregressive Diffusion Models
- Authors: Emiel Hoogeboom and Alexey A. Gritsenko and Jasmijn Bastings and Ben
Poole and Rianne van den Berg and Tim Salimans
- Abstract summary: We introduce Autoregressive Diffusion Models (ARDMs), a model class encompassing and generalizing order-agnostic autoregressive models.
ARDMs are simple to implement and easy to train, and can be trained using an efficient objective similar to modern probabilistic diffusion models.
We show that ARDMs obtain compelling results not only on complete datasets, but also on compressing single data points.
- Score: 34.125045462636386
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Autoregressive Diffusion Models (ARDMs), a model class
encompassing and generalizing order-agnostic autoregressive models (Uria et
al., 2014) and absorbing discrete diffusion (Austin et al., 2021), which we
show are special cases of ARDMs under mild assumptions. ARDMs are simple to
implement and easy to train. Unlike standard ARMs, they do not require causal
masking of model representations, and can be trained using an efficient
objective similar to modern probabilistic diffusion models that scales
favourably to highly-dimensional data. At test time, ARDMs support parallel
generation which can be adapted to fit any given generation budget. We find
that ARDMs require significantly fewer steps than discrete diffusion models to
attain the same performance. Finally, we apply ARDMs to lossless compression,
and show that they are uniquely suited to this task. Contrary to existing
approaches based on bits-back coding, ARDMs obtain compelling results not only
on complete datasets, but also on compressing single data points. Moreover,
this can be done using a modest number of network calls for (de)compression due
to the model's adaptable parallel generation.
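The training objective described above (sample a random generation order, mask the not-yet-generated dimensions, and predict them in parallel, as in modern diffusion training) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' code: `predict_logits` is a hypothetical stand-in for the neural network, and the Monte Carlo estimator follows the order-agnostic bound described in the abstract.

```python
import numpy as np

def ardm_loss_estimate(predict_logits, x, mask_token, rng):
    """One Monte Carlo estimate of an order-agnostic ARDM-style
    training loss (illustrative sketch). Returns an estimated NLL
    bound in nats per dimension.

    predict_logits: hypothetical network stand-in; maps a masked batch
        of token ids (B, D) to logits (B, D, V).
    x: int array (B, D) of clean token ids.
    mask_token: id of the absorbing/mask token fed to the network.
    """
    B, D = x.shape
    # Sample a timestep t uniformly from {1, ..., D} for each example.
    t = rng.integers(1, D + 1, size=B)
    # Sample a random generation order by argsorting uniform noise.
    sigma = np.argsort(rng.random((B, D)), axis=1)
    # rank[b, i] = step at which dimension i is generated under sigma.
    rank = np.argsort(sigma, axis=1)
    # Dimensions generated at step t or later are still masked and must
    # all be predicted in parallel (no causal masking needed).
    to_predict = rank >= (t - 1)[:, None]
    x_masked = np.where(to_predict, mask_token, x)

    logits = predict_logits(x_masked)                      # (B, D, V)
    # Numerically stable log-softmax, then negative log-likelihood of
    # the true tokens.
    m = logits.max(axis=-1, keepdims=True)
    log_probs = logits - (m + np.log(np.exp(logits - m).sum(-1, keepdims=True)))
    nll = -np.take_along_axis(log_probs, x[..., None], axis=2)[..., 0]

    # Averaging the NLL over the D - t + 1 masked positions, with t
    # uniform, yields an unbiased estimate of the per-dimension bound.
    per_example = (nll * to_predict).sum(axis=1) / to_predict.sum(axis=1)
    return per_example.mean()
```

For example, a dummy predictor that outputs uniform logits over a vocabulary of size V yields a loss of exactly log V nats per dimension, as expected for an uninformed model.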