SoftFlow: Probabilistic Framework for Normalizing Flow on Manifolds
- URL: http://arxiv.org/abs/2006.04604v4
- Date: Sun, 15 Nov 2020 11:18:29 GMT
- Title: SoftFlow: Probabilistic Framework for Normalizing Flow on Manifolds
- Authors: Hyeongju Kim, Hyeonseung Lee, Woo Hyun Kang, Joun Yeop Lee, Nam Soo Kim
- Abstract summary: Flow-based generative models are composed of invertible transformations between two random variables of the same dimension.
In this paper, we propose SoftFlow, a probabilistic framework for training normalizing flows on manifolds.
We experimentally show that SoftFlow can capture the innate structure of the manifold data and generate high-quality samples.
We apply the proposed framework to 3D point clouds to alleviate the difficulty of forming thin structures for flow-based models.
- Score: 15.476426879806134
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Flow-based generative models are composed of invertible transformations
between two random variables of the same dimension. Therefore, flow-based
models cannot be adequately trained if the dimension of the data distribution
does not match that of the underlying target distribution. In this paper, we
propose SoftFlow, a probabilistic framework for training normalizing flows on
manifolds. To sidestep the dimension mismatch problem, SoftFlow estimates a
conditional distribution of the perturbed input data instead of learning the
data distribution directly. We experimentally show that SoftFlow can capture
the innate structure of the manifold data and generate high-quality samples
unlike the conventional flow-based models. Furthermore, we apply the proposed
framework to 3D point clouds to alleviate the difficulty of forming thin
structures for flow-based models. The proposed model for 3D point clouds,
namely SoftPointFlow, can estimate the distribution of various shapes more
accurately and achieves state-of-the-art performance in point cloud generation.
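The core trick described in the abstract, perturbing each sample with noise of a randomly drawn magnitude and conditioning the flow on that magnitude, can be sketched as follows. This is a minimal illustration of the perturbation step only; the function and variable names are ours, not the paper's, and a trained conditional flow would consume the output:

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_with_noise(x, c_max=0.1, rng=rng):
    """SoftFlow-style perturbation (sketch): draw a per-sample noise
    magnitude c, add Gaussian noise scaled by c, and return (y, c) so a
    conditional flow can be trained on p(y | c) instead of the
    degenerate manifold density p(x)."""
    c = rng.uniform(0.0, c_max, size=(x.shape[0], 1))   # per-sample noise scale
    y = x + c * rng.standard_normal(x.shape)            # perturbed input
    return y, c

# Toy manifold data: a unit circle embedded in 2D. The data occupies zero
# volume in 2D, which is exactly the dimension mismatch a plain 2D flow
# cannot handle; perturbation thickens it into a trainable 2D density.
theta = rng.uniform(0.0, 2.0 * np.pi, size=(512, 1))
x = np.hstack([np.cos(theta), np.sin(theta)])

y, c = perturb_with_noise(x)
```

During training the flow sees `(y, c)` pairs; at sampling time the condition is set to `c = 0`, which recovers the sharp on-manifold distribution.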
Related papers
- PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise [4.762593660623934]
We propose PaddingFlow, a novel dequantization method, which improves normalizing flows with padding-dimensional noise.
We validate our method on the main benchmarks of unconditional density estimation.
The results show that PaddingFlow performs better in all the experiments in the paper.
arXiv Detail & Related papers (2024-03-13T03:28:39Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a speedup compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not place any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- VQ-Flows: Vector Quantized Local Normalizing Flows [2.7998963147546148]
We introduce a novel statistical framework for learning a mixture of local normalizing flows as "chart maps" over a data manifold.
Our framework augments the expressivity of recent approaches while preserving the signature property of normalizing flows, that they admit exact density evaluation.
arXiv Detail & Related papers (2022-03-22T09:22:18Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Discrete Denoising Flows [87.44537620217673]
We introduce a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs).
In contrast with other discrete flow-based models, our model can be locally trained without introducing gradient bias.
We show that DDFs outperform Discrete Flows on modeling a toy example, binary MNIST and Cityscapes segmentation maps, measured in log-likelihood.
arXiv Detail & Related papers (2021-07-24T14:47:22Z)
- Tractable Density Estimation on Learned Manifolds with Conformal Embedding Flows [0.0]
Normalizing flows provide tractable density estimation by transforming a simple base distribution into a complex target distribution.
Recent attempts to remedy this have introduced geometric complications that defeat a central benefit of normalizing flows: exact density estimation.
We argue that composing a standard flow with a trainable conformal embedding is the most natural way to model manifold-supported data.
arXiv Detail & Related papers (2021-06-09T18:00:00Z)
- Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow [16.41460104376002]
We introduce subset flows, a class of flows that can transform finite volumes and allow exact computation of likelihoods for discrete data.
We identify ordinal discrete autoregressive models, including WaveNets, PixelCNNs and Transformers, as single-layer flows.
We demonstrate state-of-the-art results on CIFAR-10 for flow models trained with dequantization.
arXiv Detail & Related papers (2020-02-06T22:58:51Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
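FlowGMM's recipe, mapping data through an invertible flow into a latent space where each class corresponds to one Gaussian mixture component, can be sketched as below. The flow here is a fixed affine map for brevity (a trained invertible network would replace it), and every name and number is illustrative rather than from the paper:

```python
import numpy as np

# FlowGMM idea (sketch): an invertible flow f maps data to a latent space
# where class k is modeled as N(mu_k, I); a point is classified by its
# most likely latent component.
W = np.array([[1.0, 0.2],
              [0.0, 1.0]])                  # invertible weight (illustrative)
f = lambda x: x @ W.T                       # the "flow": affine, invertible

mus = np.array([[-3.0, 0.0],                # latent mean for class 0
                [3.0, 0.0]])                # latent mean for class 1

def classify(x):
    z = f(x)                                # push data through the flow
    # With identity covariance, squared distance to each class mean equals
    # the negative log-likelihood up to a constant, so the argmin is the
    # maximum-likelihood class.
    d2 = ((z[:, None, :] - mus[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)

x = np.array([[-2.5, 0.1],
              [2.8, -0.2]])
labels = classify(x)   # -> array([0, 1])
```

Unlabeled data fits the same picture: it contributes a marginal-likelihood term over all components, which is what makes the approach end-to-end semi-supervised rather than a separate classifier bolted onto a flow.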
This list is automatically generated from the titles and abstracts of the papers in this site.