Integrable Nonparametric Flows
- URL: http://arxiv.org/abs/2012.02035v1
- Date: Thu, 3 Dec 2020 16:19:52 GMT
- Title: Integrable Nonparametric Flows
- Authors: David Pfau, Danilo Rezende
- Abstract summary: We introduce a method for reconstructing an infinitesimal normalizing flow given only an infinitesimal change to a probability distribution.
This reverses the conventional task of normalizing flows.
We discuss potential applications to problems in quantum Monte Carlo and machine learning.
- Score: 5.9774834479750805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a method for reconstructing an infinitesimal normalizing flow
given only an infinitesimal change to a (possibly unnormalized) probability
distribution. This reverses the conventional task of normalizing flows --
rather than being given samples from an unknown target distribution and learning
a flow that approximates the distribution, we are given a perturbation to an
initial distribution and aim to reconstruct a flow that would generate samples
from the known perturbed distribution. While this is an underdetermined
problem, we find that choosing the flow to be an integrable vector field yields
a solution closely related to electrostatics, and a solution can be computed by
the method of Green's functions. Unlike conventional normalizing flows, this
flow can be represented in an entirely nonparametric manner. We validate this
derivation on low-dimensional problems, and discuss potential applications to
problems in quantum Monte Carlo and machine learning.
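A sketch of the construction (notation ours; the abstract supplies the ingredients but not these exact symbols): write the evolving density as $\rho_t$ and the flow's velocity field as $v$. The continuity equation couples an infinitesimal change in the density to the flow,

$$ \partial_t \rho = -\nabla \cdot (\rho v), $$

and a given perturbation $\partial_t \rho$ leaves $v$ underdetermined. Choosing an integrable field $v = \nabla\phi$ reduces this to the weighted Poisson equation

$$ \nabla \cdot (\rho \nabla \phi) = -\,\partial_t \rho, $$

which is formally an electrostatics problem with $-\partial_t\rho$ playing the role of a charge density, so $\phi$, and hence the flow, can be written as an integral of the perturbation against the corresponding Green's function.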
Related papers
- Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories [6.222204646855336]
We study the consequences of mode-collapse of normalizing flows in the context of lattice field theory.
We propose a metric to quantify the degree of mode-collapse and derive a bound on the resulting bias.
arXiv Detail & Related papers (2023-02-27T19:00:22Z)
- Gradient Flows for Sampling: Mean-Field Models, Gaussian Approximations and Affine Invariance [10.153270126742369]
We study gradient flows in both probability density space and Gaussian space.
The flow in the Gaussian space may be understood as a Gaussian approximation of the flow.
arXiv Detail & Related papers (2023-02-21T21:44:08Z)
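For orientation on the gradient-flow picture above (a standard instance, not necessarily this paper's exact setting): the Wasserstein gradient flow of $\mathrm{KL}(\rho \,\|\, \pi)$ toward a target $\pi$ is the Fokker-Planck equation

$$ \partial_t \rho_t = \nabla \cdot \big( \rho_t \, \nabla \log(\rho_t / \pi) \big), $$

and constraining $\rho_t$ to a Gaussian family turns this PDE into coupled ODEs for the mean and covariance, which is one concrete sense in which the flow in Gaussian space serves as a Gaussian approximation of the density-space flow.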
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
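The diffeomorphism property gives an exact identity to build on (generic change of variables, not the paper's specific estimator): with flow $x = f(z)$, base density $p_Z$, and a closed region $\mathcal{R}$,

$$ \Pr[X \in \mathcal{R}] = \int_{\mathcal{R}} p_X(x)\, dx = \int_{f^{-1}(\mathcal{R})} p_Z(z)\, dz, $$

so the probability mass of $\mathcal{R}$ can be estimated in whichever of the two spaces admits the cheaper or lower-variance estimator.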
- An application of the splitting-up method for the computation of a neural network representation for the solution for the filtering equations [68.8204255655161]
Filtering equations play a central role in many real-life applications, including numerical weather prediction, finance and engineering.
One of the classical approaches to approximating the solution of the filtering equations is to use a PDE-inspired method called the splitting-up method.
We combine this method with a neural network representation to produce an approximation of the unnormalised conditional distribution of the signal process.
arXiv Detail & Related papers (2022-01-10T11:01:36Z)
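A minimal sketch of the splitting-up scheme in its standard form (the paper's exact discretization may differ): over an observation interval of length $\delta$, the unnormalised conditional density $p_k$ is advanced in two sub-steps, prediction with the signal's forward Kolmogorov operator $\mathcal{L}^*$ followed by a Bayes-type correction with the observation likelihood $g$:

$$ \tilde p_{k+1} = e^{\delta \mathcal{L}^*} p_k, \qquad p_{k+1}(x) = g(y_{k+1} \mid x)\, \tilde p_{k+1}(x). $$

The neural network presumably enters as the representation of these densities, in place of a conventional grid or basis expansion.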
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and minimizing the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
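A minimal sketch of a base distribution defined by learned rejection sampling, assuming a Gaussian proposal and a Monte Carlo estimate of the normalizer; the class and network below are illustrative, not the paper's implementation:

```python
import math
import torch

class ResampledGaussianBase(torch.nn.Module):
    """Base density q(z) = pi(z) * a(z) / Z with proposal pi = N(0, I) and
    a learned acceptance probability a(z) in [0, 1] (illustrative sketch)."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.dim = dim
        self.accept = torch.nn.Sequential(
            torch.nn.Linear(dim, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1), torch.nn.Sigmoid(),
        )

    @torch.no_grad()
    def sample(self, n: int, max_rounds: int = 100) -> torch.Tensor:
        chunks, total = [], 0
        for _ in range(max_rounds):
            z = torch.randn(n, self.dim)              # propose from pi
            keep = torch.rand(n, 1) < self.accept(z)  # accept w.p. a(z)
            chunks.append(z[keep.squeeze(-1)])
            total += chunks[-1].shape[0]
            if total >= n:
                break
        return torch.cat(chunks)[:n]

    def log_prob(self, z: torch.Tensor, z_mc: torch.Tensor) -> torch.Tensor:
        """log q(z), with Z = E_pi[a(z)] estimated from z_mc ~ pi."""
        log_pi = -0.5 * (z ** 2).sum(-1) - 0.5 * self.dim * math.log(2 * math.pi)
        log_z_hat = torch.log(self.accept(z_mc).mean())
        return log_pi + torch.log(self.accept(z)).squeeze(-1) - log_z_hat
```

Both objectives in the summary then apply: maximize log_prob on data pulled back through the flow (log-likelihood), or minimize the reverse Kullback-Leibler divergence using samples drawn via sample.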
- Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint [0.45119235878273]
We consider normalizing flows from a Markov chain point of view.
We replace transition densities by general Markov kernels and establish proofs via Radon-Nikodym derivatives.
The performance of the proposed conditional normalizing flow is demonstrated by numerical examples.
arXiv Detail & Related papers (2021-09-23T13:44:36Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective functional satisfies a functional version of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) along with smoothness conditions, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z)
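Schematically, Wasserstein gradient descent with particles (our generic reading of "particle-based"; the paper's way of estimating the update direction is its own contribution): represent $\rho$ by particles $\{x_i\}$ and move each along the negative gradient of the objective's first variation,

$$ x_i \leftarrow x_i - \eta\, \nabla_x \frac{\delta F}{\delta \rho}(\rho)(x_i), $$

with the functional PL condition playing the role the ordinary PL inequality plays in linear-convergence proofs for Euclidean gradient descent.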
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Owing to their guaranteed expressivity, these models can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
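The iterative-Gaussianization construction underlying this model family can be sketched as follows; the learned marginal CDFs and trained rotations of the actual model are replaced here by empirical CDFs and random rotations (illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

def marginal_gaussianize(x: np.ndarray) -> np.ndarray:
    """Push each coordinate through its empirical CDF (a stand-in for the
    model's learned CDFs), then through the inverse standard-normal CDF,
    making every marginal approximately N(0, 1)."""
    n = x.shape[0]
    u = (x.argsort(axis=0).argsort(axis=0) + 0.5) / n  # empirical CDF in (0, 1)
    return norm.ppf(u)

def gaussianization_step(x: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """One layer: marginal Gaussianization followed by a rotation that
    mixes the dimensions so later layers can fix joint structure."""
    z = marginal_gaussianize(x)
    q, _ = np.linalg.qr(rng.standard_normal((x.shape[1], x.shape[1])))
    return z @ q

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 2)) ** 3  # heavy-tailed, non-Gaussian data
for _ in range(5):
    x = gaussianization_step(x, rng)     # x drifts toward N(0, I)
```

Each layer is invertible (a monotone map per dimension composed with an orthogonal rotation), which is the structural property behind tractable likelihoods and cheap inversion.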
- Composing Normalizing Flows for Inverse Problems [89.06155049265641]
We propose a framework for approximate inference that estimates the target conditional as a composition of two flow models.
Our method is evaluated on a variety of inverse problems and is shown to produce high-quality samples with uncertainty.
arXiv Detail & Related papers (2020-02-26T19:01:11Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that stochastic normalizing flows (SNFs), which interleave deterministic flow transformations with stochastic sampling steps, can be used to learn the transformation from a simple prior distribution to a target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
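A minimal sketch of the layer pattern behind stochastic normalizing flows, interleaving deterministic flow layers with stochastic sampling layers (here overdamped Langevin steps); the layer choices and step sizes are illustrative, and the path weights needed for exact reweighting and training are omitted:

```python
import torch

def langevin_layer(x, log_prob, eps=1e-2):
    """One overdamped Langevin step targeting exp(log_prob): the
    stochastic layer sandwiched between deterministic flow layers."""
    x = x.detach().requires_grad_(True)
    grad = torch.autograd.grad(log_prob(x).sum(), x)[0]
    return (x + eps * grad + (2 * eps) ** 0.5 * torch.randn_like(x)).detach()

def snf_sample(n, flow_layers, log_prob, dim=2):
    """Push prior samples through alternating deterministic and
    stochastic layers (sketch of the SNF sampling pass only)."""
    x = torch.randn(n, dim)                 # simple prior N(0, I)
    for flow in flow_layers:
        x = flow(x)                         # deterministic flow layer
        x = langevin_layer(x, log_prob)     # stochastic layer
    return x

# Example: a bimodal target and two toy affine "flow" layers.
log_prob = lambda x: torch.logsumexp(torch.stack(
    [-0.5 * ((x - 2) ** 2).sum(-1), -0.5 * ((x + 2) ** 2).sum(-1)]), dim=0)
samples = snf_sample(512, [lambda x: 1.1 * x, lambda x: x + 0.1], log_prob)
```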
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.