VQ-Flows: Vector Quantized Local Normalizing Flows
- URL: http://arxiv.org/abs/2203.11556v1
- Date: Tue, 22 Mar 2022 09:22:18 GMT
- Title: VQ-Flows: Vector Quantized Local Normalizing Flows
- Authors: Sahil Sidheekh, Chris B. Dock, Tushar Jain, Radu Balan, Maneesh K. Singh
- Abstract summary: We introduce a novel statistical framework for learning a mixture of local normalizing flows as "chart maps" over a data manifold.
Our framework augments the expressivity of recent approaches while preserving the signature property of normalizing flows, that they admit exact density evaluation.
- Score: 2.7998963147546148
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows provide an elegant approach to generative modeling that
allows for efficient sampling and exact density evaluation of unknown data
distributions. However, current techniques have significant limitations in
their expressivity when the data distribution is supported on a low-dimensional
manifold or has a non-trivial topology. We introduce a novel statistical
framework for learning a mixture of local normalizing flows as "chart maps"
over the data manifold. Our framework augments the expressivity of recent
approaches while preserving the signature property of normalizing flows, that
they admit exact density evaluation. We learn a suitable atlas of charts for
the data manifold via a vector quantized auto-encoder (VQ-AE) and the
distributions over them using a conditional flow. We validate experimentally
that our probabilistic framework enables existing approaches to better model
data distributions over complex manifolds.
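A minimal numerical sketch of the chart-mixture idea described above. This is not the paper's implementation: nearest-neighbor vector quantization stands in for the VQ-AE, and per-chart Gaussians stand in for the learned conditional flow, but the structure — quantize into charts, fit a density per chart, evaluate an exact mixture density — mirrors the framework.

```python
import numpy as np

# Toy sketch of the VQ-Flows recipe (a hypothetical simplification):
# vector quantization partitions the data manifold into charts, and a
# per-chart density model gives an exact mixture density. Gaussians
# stand in here for the learned conditional normalizing flow.
rng = np.random.default_rng(0)

# Data on the unit circle: a 1-D manifold embedded in 2-D.
theta = rng.uniform(0.0, 2.0 * np.pi, 1000)
data = np.stack([np.cos(theta), np.sin(theta)], axis=1)

K = 8  # number of charts (codebook size)
codebook = data[rng.choice(len(data), K, replace=False)]

def assign_chart(x):
    """Vector quantization: index of the nearest codebook entry."""
    d = np.linalg.norm(x[:, None, :] - codebook[None, :, :], axis=-1)
    return d.argmin(axis=1)

charts = assign_chart(data)
weights = np.bincount(charts, minlength=K) / len(data)
means = np.array([data[charts == k].mean(axis=0) for k in range(K)])
covs = np.array([np.cov(data[charts == k].T) + 1e-4 * np.eye(2)
                 for k in range(K)])

def log_density(x):
    """Exact log-density of the chart mixture (log-sum-exp over charts)."""
    terms = []
    for k in range(K):
        diff = x - means[k]
        quad = np.einsum("ni,ij,nj->n", diff, np.linalg.inv(covs[k]), diff)
        log_norm = np.log((2 * np.pi) ** 2 * np.linalg.det(covs[k]))
        terms.append(np.log(weights[k]) - 0.5 * (quad + log_norm))
    return np.logaddexp.reduce(terms)

on_manifold = log_density(np.array([[1.0, 0.0]]))[0]
off_manifold = log_density(np.array([[0.0, 0.0]]))[0]
print(on_manifold > off_manifold)  # the mixture concentrates near the circle
```

Because each chart's density is tractable, the mixture log-density is exact (no variational bound), which is the "signature property" the abstract refers to; in the paper the per-chart densities come from a conditional flow rather than Gaussians.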
Related papers
- Convolutional Filtering on Sampled Manifolds [122.06927400759021]
We show that convolutional filtering on a sampled manifold converges to continuous manifold filtering.
Our findings are further demonstrated empirically on a problem of navigation control.
arXiv Detail & Related papers (2022-11-20T19:09:50Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Tractable Density Estimation on Learned Manifolds with Conformal Embedding Flows [0.0]
Normalizing flows provide tractable density estimation by transforming a simple base distribution into a complex target distribution.
However, standard flows struggle with manifold-supported data, and recent attempts to remedy this have introduced geometric complications that defeat a central benefit of normalizing flows: exact density estimation.
We argue that composing a standard flow with a trainable conformal embedding is the most natural way to model manifold-supported data.
arXiv Detail & Related papers (2021-06-09T18:00:00Z)
- Generative Flows with Invertible Attentions [135.23766216657745]
We introduce two types of invertible attention mechanisms for generative flow models.
We exploit split-based attention mechanisms to learn attention weights and input representations over each pair of splits of the flow feature maps.
Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position in flow-based models.
arXiv Detail & Related papers (2021-06-07T20:43:04Z)
- SoftFlow: Probabilistic Framework for Normalizing Flow on Manifolds [15.476426879806134]
Flow-based generative models are composed of invertible transformations between two random variables of the same dimension.
In this paper, we propose SoftFlow, a probabilistic framework for training normalizing flows on manifolds.
We experimentally show that SoftFlow can capture the innate structure of the manifold data and generate high-quality samples.
We apply the proposed framework to 3D point clouds to alleviate the difficulty of forming thin structures for flow-based models.
arXiv Detail & Related papers (2020-06-08T13:56:07Z)
- Flows for simultaneous manifold learning and density estimation [12.451050883955071]
Manifold-learning flows (M-flows) represent datasets with manifold structure more faithfully.
M-flows learn the data manifold and allow for better inference than standard flows in the ambient data space.
arXiv Detail & Related papers (2020-03-31T02:07:48Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.