Semi-Discrete Normalizing Flows through Differentiable Tessellation
- URL: http://arxiv.org/abs/2203.06832v1
- Date: Mon, 14 Mar 2022 03:06:31 GMT
- Title: Semi-Discrete Normalizing Flows through Differentiable Tessellation
- Authors: Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel
- Abstract summary: We propose a tessellation-based approach that learns quantization boundaries on a continuous space, complete with exact likelihood evaluations.
This is done through constructing normalizing flows on convex polytopes parameterized through a differentiable Voronoi tessellation.
We show improvements over existing methods across a range of structured data modalities, and find that we can achieve a significant gain from just adding Voronoi mixtures to a baseline model.
- Score: 31.474420819149724
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mapping between discrete and continuous distributions is a difficult task and
many have had to resort to approximate or heuristic approaches. We propose a
tessellation-based approach that directly learns quantization boundaries on a
continuous space, complete with exact likelihood evaluations. This is done
through constructing normalizing flows on convex polytopes parameterized
through a differentiable Voronoi tessellation. Using a simple homeomorphism
with an efficient log determinant Jacobian, we can then cheaply parameterize
distributions on convex polytopes.
We explore this approach in two application settings, mapping from discrete
to continuous and vice versa. First, a Voronoi dequantization allows
quantization boundaries in a multidimensional space to be learned automatically. The
location of boundaries and distances between regions can encode useful
structural relations between the quantized discrete values. Secondly, a Voronoi
mixture model has constant computation cost for likelihood evaluation
regardless of the number of mixture components. Empirically, we show
improvements over existing methods across a range of structured data
modalities, and find that we can achieve a significant gain from just adding
Voronoi mixtures to a baseline model.
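The core geometric primitive behind the Voronoi dequantization is the hard quantization map itself: each point is assigned to the cell of its nearest anchor, and the learned flow then places a density on that cell. As a minimal illustrative sketch (the anchors and points here are arbitrary, not the learned parameters from the paper, and this shows only the cell-assignment step, not the flow or its log-determinant Jacobian):

```python
import numpy as np

def voronoi_cell(points, anchors):
    """Assign each point to the Voronoi cell of its nearest anchor.

    This is the hard quantization map that a Voronoi dequantization
    learns to soften; anchors here are illustrative placeholders.
    """
    # Pairwise squared Euclidean distances, shape (n_points, n_anchors)
    d2 = ((points[:, None, :] - anchors[None, :, :]) ** 2).sum(axis=-1)
    # Nearest anchor index defines the cell membership
    return d2.argmin(axis=1)

anchors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
points = np.array([[0.1, 0.1], [0.9, 0.2], [0.2, 0.8]])
print(voronoi_cell(points, anchors).tolist())  # → [0, 1, 2]
```

Note that because the cells partition the space, a Voronoi mixture only ever needs to evaluate the single component whose cell contains the query point, which is what gives the constant per-point likelihood cost regardless of the number of components.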
Related papers
- Empirical Density Estimation based on Spline Quasi-Interpolation with
applications to Copulas clustering modeling [0.0]
Density estimation is a fundamental technique employed in various fields to model and to understand the underlying distribution of data.
In this paper we propose the mono-variate approximation of the density using quasi-interpolation.
The presented algorithm is validated on artificial and real datasets.
arXiv Detail & Related papers (2024-02-18T11:49:38Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Robust scalable initialization for Bayesian variational inference with
multi-modal Laplace approximations [0.0]
Variational mixtures with full-covariance structures suffer from quadratic growth in the number of variational parameters with the number of model parameters.
We propose a method for constructing an initial Gaussian model approximation that can be used to warm-start variational inference.
arXiv Detail & Related papers (2023-07-12T19:30:04Z) - Diffeomorphic Mesh Deformation via Efficient Optimal Transport for Cortical Surface Reconstruction [40.73187749820041]
Mesh deformation plays a pivotal role in many 3D vision tasks including dynamic simulations, rendering, and reconstruction.
A prevalent approach in current deep learning is the set-based approach, which measures the discrepancy between two surfaces by comparing point clouds randomly sampled from the two meshes under the Chamfer pseudo-distance.
We propose a novel metric for learning mesh deformation, defined by sliced Wasserstein distance on meshes represented as probability measures that generalize the set-based approach.
arXiv Detail & Related papers (2023-05-27T19:10:19Z) - Gradient Flows for Sampling: Mean-Field Models, Gaussian Approximations and Affine Invariance [10.153270126742369]
We study gradient flows in both probability density space and Gaussian space.
The flow in the Gaussian space may be understood as a Gaussian approximation of the flow.
arXiv Detail & Related papers (2023-02-21T21:44:08Z) - Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to reconstructing the thermodynamic functions and phase boundaries of two-parameter statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
arXiv Detail & Related papers (2022-05-18T17:11:23Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Density Ratio Estimation via Infinitesimal Classification [85.08255198145304]
We propose DRE-infty, a divide-and-conquer approach that reduces density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
arXiv Detail & Related papers (2021-11-22T06:26:29Z) - Continuous normalizing flows on manifolds [0.342658286826597]
We describe how the recently introduced Neural ODEs and continuous normalizing flows can be extended to arbitrary smooth manifolds.
We propose a general methodology for parameterizing vector fields on these spaces and demonstrate how gradient-based learning can be performed.
arXiv Detail & Related papers (2021-03-14T15:35:19Z) - The Connection between Discrete- and Continuous-Time Descriptions of
Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of invariance under coarse-graining.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time-series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z) - Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.