Equivariant Discrete Normalizing Flows
- URL: http://arxiv.org/abs/2110.08649v1
- Date: Sat, 16 Oct 2021 20:16:00 GMT
- Title: Equivariant Discrete Normalizing Flows
- Authors: Avishek Joey Bose and Ivan Kobyzev
- Abstract summary: We focus on building equivariant normalizing flows using discrete layers.
We introduce two new equivariant flows: $G$-coupling Flows and $G$-Residual Flows.
Our construction of $G$-Residual Flows is also universal, in the sense that we prove any $G$-equivariant diffeomorphism can be exactly mapped by a $G$-residual flow.
- Score: 10.867162810786361
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: At its core, generative modeling seeks to uncover the underlying factors that
give rise to observed data. These factors can often be modelled as natural symmetries
that manifest themselves through invariances and equivariances to certain
transformation laws. However, current approaches are couched in the formalism
of continuous normalizing flows, which require the construction of equivariant
vector fields, inhibiting their simple application to conventional
higher-dimensional generative modelling domains like natural images. In this paper we
focus on building equivariant normalizing flows using discrete layers. We first
theoretically prove the existence of an equivariant map for compact groups
acting on compact spaces. We further introduce two new equivariant
flows: $G$-coupling Flows and $G$-Residual Flows that elevate classical
Coupling and Residual Flows with equivariant maps to a prescribed group $G$.
Our construction of $G$-Residual Flows is also universal, in the sense that we
prove any $G$-equivariant diffeomorphism can be exactly mapped by a $G$-residual
flow. Finally, we complement our theoretical insights with experiments, for
the first time, on image datasets like CIFAR-10, and show that $G$-equivariant
discrete normalizing flows lead to increased data efficiency, faster
convergence, and improved likelihood estimates.
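As a concrete illustration of the coupling construction (a minimal sketch under stated assumptions, not the paper's exact layer), the snippet below builds an additive coupling step that is equivariant to the permutation group acting on the rows of a point set. The weights `W1`, `W2` and the DeepSets-style `conditioner` are hypothetical stand-ins for a learned network.

```python
# A minimal sketch of a permutation-equivariant additive coupling layer.
# The feature split commutes with the row action of S_n, and the
# conditioner is equivariant (per-row term plus an invariant pooled term),
# so the whole layer is S_n-equivariant and exactly invertible.
import numpy as np

rng = np.random.default_rng(0)
D1, D2 = 2, 2                       # feature split sizes (d = D1 + D2)
W1 = rng.normal(size=(D1, D2))      # hypothetical per-point weights
W2 = rng.normal(size=(D1, D2))      # hypothetical weights on pooled context

def conditioner(xa):
    """Permutation-equivariant map R^{n x D1} -> R^{n x D2}."""
    return np.tanh(xa @ W1 + xa.mean(axis=0, keepdims=True) @ W2)

def g_coupling(x):
    """Additive coupling: identity on x_a, shift x_b by t(x_a)."""
    xa, xb = x[:, :D1], x[:, D1:]
    return np.concatenate([xa, xb + conditioner(xa)], axis=1)

def g_coupling_inverse(y):
    ya, yb = y[:, :D1], y[:, D1:]
    return np.concatenate([ya, yb - conditioner(ya)], axis=1)

# Equivariance: permuting rows before the layer equals permuting after.
x = rng.normal(size=(5, D1 + D2))
perm = rng.permutation(5)
assert np.allclose(g_coupling(x[perm]), g_coupling(x)[perm])
# Invertibility (additive coupling has unit Jacobian determinant).
assert np.allclose(g_coupling_inverse(g_coupling(x)), x)
```

Because the shift leaves the conditioning block untouched, the log-determinant of this step is zero, which keeps the likelihood bookkeeping trivial.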
Related papers
- Unsupervised Representation Learning from Sparse Transformation Analysis [79.94858534887801]
We propose to learn representations from sequence data by factorizing the transformations of the latent variables into sparse components.
Input data are first encoded as distributions of latent activations and subsequently transformed using a probability flow model.
arXiv Detail & Related papers (2024-10-07T23:53:25Z)
- Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample size under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z)
- Equivariant Manifold Neural ODEs and Differential Invariants [1.6073704837297416]
We develop a manifestly geometric framework for equivariant manifold neural ordinary differential equations (NODEs).
We use it to analyse their modelling capabilities for symmetric data.
arXiv Detail & Related papers (2024-01-25T12:23:22Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Attentive Contractive Flow with Lipschitz-constrained Self-Attention [25.84621883831624]
We introduce a novel approach called Attentive Contractive Flow (ACF).
ACF utilizes a special category of flow-based generative models: contractive flows.
We demonstrate that ACF can be introduced into a variety of state-of-the-art flow models in a plug-and-play manner.
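For intuition, here is a minimal sketch of the contractive-flow mechanism ACF builds on: a residual step whose map is rescaled to be a contraction (a spectrally rescaled linear layer standing in for the paper's Lipschitz-constrained self-attention), which makes the step invertible by fixed-point iteration.

```python
# A minimal sketch of a contractive residual step y = x + g(x) with
# Lip(g) < 1, inverted by Banach fixed-point iteration. The linear map
# is a hypothetical stand-in for Lipschitz-constrained self-attention.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))
W *= 0.5 / np.linalg.norm(W, 2)     # rescale to spectral norm 0.5 < 1

def g(x):
    return np.tanh(W @ x)            # tanh is 1-Lipschitz, so Lip(g) <= 0.5

def forward(x):
    return x + g(x)

def inverse(y, n_iter=60):
    x = y.copy()                     # iterate x <- y - g(x); a contraction
    for _ in range(n_iter):
        x = y - g(x)
    return x

x = rng.normal(size=3)
assert np.allclose(inverse(forward(x)), x)
```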
arXiv Detail & Related papers (2021-09-24T18:02:49Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
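A toy 1D sketch of the Moser-style construction, assuming a hand-picked decaying vector field rather than a learned one: the model density is the prior density minus the divergence of the field, so evaluating it requires no ODE solver.

```python
# A 1D toy of the Moser construction: mu(x) = nu(x) - d/dx v(x), with a
# standard-normal prior nu and a small, hand-picked vector field v. The
# divergence term integrates to zero for a decaying v, so mu still
# integrates to one; v must be small enough to keep mu positive.
import numpy as np

def nu(x):                           # prior density: standard normal
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def v(x):                            # toy decaying vector field
    return 0.1 * x * np.exp(-0.5 * x**2)

xs = np.linspace(-8, 8, 4001)
h = xs[1] - xs[0]
mu = nu(xs) - np.gradient(v(xs), h)  # 1D divergence via central differences

print((mu * h).sum())                # ~1.0: mu remains normalized
print(mu.min() > 0)                  # True: positivity holds for this v
```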
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
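For intuition, a minimal sketch of one surjective layer in the SurVAE spirit: the absolute-value map, deterministic and many-to-one in the forward direction, paired with a stochastic inverse that samples a sign.

```python
# A minimal sketch of an absolute-value surjection: forward is
# deterministic and many-to-one; the inverse is stochastic, attaching a
# random sign to each coordinate (a right-inverse, not a true inverse).
import numpy as np

rng = np.random.default_rng(0)

def forward(x):                      # surjection R^d -> R_{>=0}^d
    return np.abs(x)

def stochastic_inverse(y):
    return y * rng.choice([-1.0, 1.0], size=y.shape)

x = rng.normal(size=4)
y = forward(x)
assert np.allclose(forward(stochastic_inverse(y)), y)  # right-inverse law
```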
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Normalizing Flows Across Dimensions [10.21537170623373]
We introduce noisy injective flows (NIFs), a generalization of normalizing flows that can go across dimensions.
NIFs explicitly map the latent space to a learnable manifold in a high-dimensional data space using injective transformations.
Empirically, we demonstrate that a simple application of our method to existing flow architectures can significantly improve sample quality and yield separable data embeddings.
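To make the "across dimensions" idea concrete (a hypothetical sketch, not the NIF architecture), zero padding composed with an invertible linear map gives an injective map into a higher-dimensional space with an exact left-inverse on its image.

```python
# A minimal sketch of an injective map R^k -> R^d (d > k): zero-pad the
# latent code, then apply an invertible linear map A. Points on the
# resulting k-dimensional manifold can be decoded exactly.
import numpy as np

rng = np.random.default_rng(0)
k, d = 2, 5
A = rng.normal(size=(d, d)) + d * np.eye(d)  # well-conditioned, invertible

def inject(z):                       # injective by construction
    return A @ np.pad(z, (0, d - k))

def left_inverse(x):                 # exact on the image of inject
    return np.linalg.solve(A, x)[:k]

z = rng.normal(size=k)
assert np.allclose(left_inverse(inject(z)), z)
```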
arXiv Detail & Related papers (2020-06-23T14:47:18Z)
- Neural Manifold Ordinary Differential Equations [46.25832801867149]
We introduce Neural Manifold Ordinary Differential Equations, which enable the construction of Manifold Continuous Normalizing Flows (MCNFs).
MCNFs require only local geometry and compute probabilities with continuous change of variables.
We find that leveraging continuous manifold dynamics produces a marked improvement for both density estimation and downstream tasks.
arXiv Detail & Related papers (2020-06-18T03:24:58Z)
- Gaussianization Flows [113.79542218282282]
We propose Gaussianization flows, a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of their guaranteed expressivity, these models can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
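To illustrate the Gaussianization idea in the entry above (a minimal 1D sketch, not the paper's trainable model): push samples through an estimated marginal CDF and then through the inverse standard-normal CDF. The logistic-kernel CDF estimate and the bandwidth `h` are arbitrary choices here.

```python
# A 1D Gaussianization toy: probability integral transform with a smooth
# kernel CDF estimate, followed by the inverse standard-normal CDF. The
# bimodal input comes out approximately N(0, 1).
import numpy as np
from scipy.special import ndtri      # inverse standard-normal CDF

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])

h = 0.5                              # kernel bandwidth (arbitrary choice)
def cdf_hat(t):
    return np.mean(1.0 / (1.0 + np.exp(-(t[:, None] - x[None, :]) / h)), axis=1)

u = np.clip(cdf_hat(x), 1e-6, 1 - 1e-6)  # keep away from 0 and 1
z = ndtri(u)                              # approximately standard normal
print(z.mean(), z.std())                  # roughly 0 and roughly 1
```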
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.