Tractable Density Estimation on Learned Manifolds with Conformal
Embedding Flows
- URL: http://arxiv.org/abs/2106.05275v1
- Date: Wed, 9 Jun 2021 18:00:00 GMT
- Title: Tractable Density Estimation on Learned Manifolds with Conformal
Embedding Flows
- Authors: Brendan Leigh Ross, Jesse C. Cresswell
- Abstract summary: Normalizing flows provide tractable density estimation by transforming a simple base distribution into a complex target distribution.
However, flows cannot directly model data supported on an unknown low-dimensional manifold, and recent attempts to remedy this limitation have introduced geometric complications that defeat a central benefit of normalizing flows: exact density estimation.
We argue that composing a standard flow with a trainable conformal embedding is the most natural way to model manifold-supported data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are generative models that provide tractable density
estimation by transforming a simple base distribution into a complex target
distribution. However, this technique cannot directly model data supported on
an unknown low-dimensional manifold, a common occurrence in real-world domains
such as image data. Recent attempts to remedy this limitation have introduced
geometric complications that defeat a central benefit of normalizing flows:
exact density estimation. We recover this benefit with Conformal Embedding
Flows, a framework for designing flows that learn manifolds with tractable
densities. We argue that composing a standard flow with a trainable conformal
embedding is the most natural way to model manifold-supported data. To this
end, we present a series of conformal building blocks and apply them in
experiments with real-world and synthetic data to demonstrate that flows can
model manifold-supported distributions without sacrificing tractable
likelihoods.
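The tractability claim rests on a simple identity: for an injective conformal embedding g with conformal factor λ, the usually intractable Jacobian term det(J^T J)^{1/2} reduces to λ^d. A minimal NumPy sketch, using the simplest such embedding (zero-pad, rotate by an orthogonal matrix, scale, shift); the specific embedding and all names here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

d, D = 2, 5   # latent (manifold) dimension and ambient dimension
s = 1.7       # conformal scale factor

# Random orthogonal matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(D, D)))
b = rng.normal(size=D)

def embed(u):
    """Conformal embedding g(u) = s * Q @ pad(u) + b, mapping R^d into R^D."""
    padded = np.concatenate([u, np.zeros(D - d)])
    return s * Q @ padded + b

def log_density(u):
    """Log-density of x = g(u) under the injective change of variables.

    For this conformal map the Jacobian J satisfies J^T J = s^2 * I_d, so the
    usually intractable term (1/2) log det(J^T J) collapses to d * log s.
    """
    log_p_u = -0.5 * (u @ u) - 0.5 * d * np.log(2 * np.pi)  # standard normal base
    return log_p_u - d * np.log(s)

u = rng.normal(size=d)
x = embed(u)  # a point on the d-dimensional manifold embedded in R^D
print(log_density(u))
```

In the paper's framework the fixed scale s would be replaced by a trainable conformal embedding, and the base distribution by a standard normalizing flow; the log-det correction remains closed-form either way, which is the point.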
Related papers
- Out-of-distribution detection using normalizing flows on the data
manifold [3.725042082196983]
This study investigates the effect of manifold learning using normalizing flows on out-of-distribution detection.
We show that manifold learning improves the out-of-distribution detection ability of a class of likelihood-based models known as normalizing flows.
arXiv Detail & Related papers (2023-08-26T07:35:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Convolutional Filtering on Sampled Manifolds [122.06927400759021]
We show that convolutional filtering on a sampled manifold converges to continuous manifold filtering.
Our findings are further demonstrated empirically on a problem of navigation control.
arXiv Detail & Related papers (2022-11-20T19:09:50Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Joint Manifold Learning and Density Estimation Using Normalizing Flows [4.939777212813711]
We introduce two approaches, namely per-pixel penalized log-likelihood and hierarchical training, to address this problem.
We propose a single-step method for joint manifold learning and density estimation by disentangling the transformed space.
Results validate the superiority of the proposed methods in simultaneous manifold learning and density estimation.
arXiv Detail & Related papers (2022-06-07T13:35:14Z)
- VQ-Flows: Vector Quantized Local Normalizing Flows [2.7998963147546148]
We introduce a novel statistical framework for learning a mixture of local normalizing flows as "chart maps" over a data manifold.
Our framework augments the expressivity of recent approaches while preserving the signature property of normalizing flows, that they admit exact density evaluation.
arXiv Detail & Related papers (2022-03-22T09:22:18Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms using both maximizing the log-likelihood and the optimization of the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- Discrete Denoising Flows [87.44537620217673]
We introduce a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs).
In contrast with other discrete flow-based models, our model can be locally trained without introducing gradient bias.
We show that DDFs outperform Discrete Flows on modeling a toy example, binary MNIST and Cityscapes segmentation maps, measured in log-likelihood.
arXiv Detail & Related papers (2021-07-24T14:47:22Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.