The Convolution Exponential and Generalized Sylvester Flows
- URL: http://arxiv.org/abs/2006.01910v2
- Date: Mon, 26 Oct 2020 10:24:08 GMT
- Title: The Convolution Exponential and Generalized Sylvester Flows
- Authors: Emiel Hoogeboom, Victor Garcia Satorras, Jakub M. Tomczak, Max Welling
- Abstract summary: This paper introduces a new method to build linear flows, by taking the exponential of a linear transformation.
An important insight is that the exponential can be computed implicitly, which allows the use of convolutional layers.
We show that the convolution exponential outperforms other linear transformations in generative flows on CIFAR10.
- Score: 82.18442368078804
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces a new method to build linear flows, by taking the
exponential of a linear transformation. This linear transformation does not
need to be invertible itself, and the exponential has the following desirable
properties: it is guaranteed to be invertible, its inverse is straightforward
to compute and the log Jacobian determinant is equal to the trace of the linear
transformation. An important insight is that the exponential can be computed
implicitly, which allows the use of convolutional layers. Using this insight,
we develop new invertible transformations named convolution exponentials and
graph convolution exponentials, which retain the equivariance of their
underlying transformations. In addition, we generalize Sylvester Flows and
propose Convolutional Sylvester Flows which are based on the generalization and
the convolution exponential as basis change. Empirically, we show that the
convolution exponential outperforms other linear transformations in generative
flows on CIFAR10 and the graph convolution exponential improves the performance
of graph normalizing flows. In addition, we show that Convolutional Sylvester
Flows improve performance over residual flows as a generative flow model
measured in log-likelihood.
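The abstract's three claimed properties can be checked with a small numerical sketch (a simplified illustration, not the authors' implementation; the kernel size, image size, and truncation depth below are arbitrary choices). It applies exp(M) implicitly through the truncated power series exp(M)x = Σ_k M^k x / k!, where M is a circular 2-D convolution that is never materialized as a matrix, inverts it by running the same series with the negated kernel since exp(M)⁻¹ = exp(−M), and verifies the log-determinant identity log|det exp(A)| = tr(A) on a small dense matrix.

```python
import numpy as np
from scipy.signal import convolve2d
from scipy.linalg import expm

def conv_exp(x, kernel, terms=20):
    """Apply y = exp(M) x implicitly, where M x is circular convolution of x
    with `kernel`. Uses the truncated series exp(M) x = sum_k M^k x / k!,
    so the matrix form of M is never built."""
    out = x.copy()
    term = x.copy()
    for k in range(1, terms):
        term = convolve2d(term, kernel, mode="same", boundary="wrap") / k
        out = out + term
    return out

def conv_exp_inverse(y, kernel, terms=20):
    """exp(M)^{-1} = exp(-M): run the same series with the negated kernel."""
    return conv_exp(y, -kernel, terms)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
kernel = 0.1 * rng.standard_normal((3, 3))  # small norm => series converges fast

# Invertibility: exp(-M) undoes exp(M) up to truncation error.
y = conv_exp(x, kernel)
print(np.allclose(conv_exp_inverse(y, kernel), x, atol=1e-8))

# Log-det property: for any square A, log|det exp(A)| = tr(A).
A = rng.standard_normal((5, 5))
sign, logdet = np.linalg.slogdet(expm(A))
print(np.isclose(logdet, np.trace(A)))
```

Note that M itself (the convolution) need not be invertible for this to work; only the small kernel norm matters for how quickly the series converges.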
Related papers
- Graph Transformers Dream of Electric Flow [72.06286909236827]
We show that the linear Transformer, when applied to graph data, can implement algorithms that solve canonical problems.
We present explicit weight configurations for implementing each such graph algorithm, and we bound the errors of the constructed Transformers by the errors of the underlying algorithms.
arXiv Detail & Related papers (2024-10-22T05:11:45Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Flowformer: Linearizing Transformers with Conservation Flows [77.25101425464773]
We linearize Transformers free from specific inductive biases based on the flow network theory.
By respectively conserving the incoming flow of sinks for source competition and the outgoing flow of sources for sink allocation, Flow-Attention inherently generates informative attentions.
arXiv Detail & Related papers (2022-02-13T08:44:10Z)
- Generalized Optimization: A First Step Towards Category Theoretic Learning Theory [1.52292571922932]
We generalize several optimization algorithms, including a straightforward generalization of gradient descent and a novel generalization of Newton's method.
We show that the transformation invariances of these algorithms are preserved.
We also show that we can express the change in loss of generalized descent with an inner product-like expression.
arXiv Detail & Related papers (2021-09-20T15:19:06Z)
- Generative Flows with Matrix Exponential [25.888286821451562]
Generative flow models enjoy the properties of tractable exact likelihood and efficient sampling.
We incorporate the matrix exponential into generative flows.
Our model achieves strong density-estimation performance among generative flow models.
arXiv Detail & Related papers (2020-07-19T11:18:47Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Invertible Generative Modeling using Linear Rational Splines [11.510009152620666]
Normalizing flows attempt to model an arbitrary probability distribution through a set of invertible mappings.
The first flow designs used coupling layer mappings built upon affine transformations.
Piecewise functions have since attracted attention as a replacement for affine transformations.
arXiv Detail & Related papers (2020-01-15T08:05:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.