Efficient sampling generation from explicit densities via Normalizing Flows
- URL: http://arxiv.org/abs/2003.10200v1
- Date: Mon, 23 Mar 2020 12:03:18 GMT
- Title: Efficient sampling generation from explicit densities via Normalizing Flows
- Authors: Sebastian Pina-Otey, Thorsten Lux, Federico Sánchez and Vicens Gaitan
- Abstract summary: We will present a method based on normalizing flows, proposing a solution for the common problem of exploding reverse Kullback-Leibler divergence.
The performance of the method will be demonstrated using a multi-mode complex density function.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: For many applications, such as computing the expected value of different
magnitudes, sampling from a known probability density function, the target
density, is crucial but often challenging via the inverse transform method. In these
cases, rejection and importance sampling require suitable proposal densities,
which can be evaluated and sampled from efficiently. We will present a method
based on normalizing flows, proposing a solution for the common problem of the
reverse Kullback-Leibler divergence exploding due to the target density having
values of 0 in regions covered by the flow transformation. The performance of the
method will be demonstrated using a multi-mode complex density function.
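
The abstract describes training a normalizing-flow proposal by minimizing the reverse Kullback-Leibler divergence against a known target density, guarding against the divergence exploding where the target density is zero, and then using the flow as a proposal for rejection or importance sampling. The following is a minimal sketch of such a training loop, not the authors' code: the toy two-mode target, the single affine transform standing in for a full flow, and the clamp threshold are all illustrative assumptions.

import math
import torch

def target_log_prob(x):
    # Toy two-mode target: equal mixture of two unit-variance Gaussians in 2D
    # (illustrative only; the paper's multi-mode density is not reproduced here).
    comp = torch.stack([
        -0.5 * ((x - 2.0) ** 2).sum(-1),
        -0.5 * ((x + 2.0) ** 2).sum(-1),
    ], dim=-1)
    return torch.logsumexp(comp, dim=-1) - math.log(2.0) - math.log(2.0 * math.pi)

class AffineFlow(torch.nn.Module):
    """Stand-in for a real normalizing flow (e.g. coupling or spline layers):
    an invertible element-wise affine map x = mu + exp(log_scale) * z."""
    def __init__(self, dim):
        super().__init__()
        self.mu = torch.nn.Parameter(torch.zeros(dim))
        self.log_scale = torch.nn.Parameter(torch.zeros(dim))

    def sample_and_log_prob(self, n):
        z = torch.randn(n, self.mu.shape[0])           # base N(0, I) samples
        x = self.mu + torch.exp(self.log_scale) * z    # push through the flow
        # Change of variables: log q(x) = log N(z; 0, I) - sum(log_scale)
        log_q = (-0.5 * z ** 2 - 0.5 * math.log(2.0 * math.pi)).sum(-1) - self.log_scale.sum()
        return x, log_q

flow = AffineFlow(dim=2)
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)

for step in range(2000):
    x, log_q = flow.sample_and_log_prob(512)
    log_p = target_log_prob(x)
    # Clamp log p(x) from below so samples landing where p(x) ~ 0 contribute a
    # large but finite penalty rather than an infinite reverse-KL term.
    log_p = torch.clamp(log_p, min=-1e3)
    loss = (log_q - log_p).mean()                      # Monte Carlo estimate of KL(q || p)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The trained flow can then serve as a proposal for importance sampling:
with torch.no_grad():
    x, log_q = flow.sample_and_log_prob(10_000)
    weights = torch.exp(target_log_prob(x) - log_q)    # unnormalized importance weights

In practice the affine map would be replaced by an expressive flow architecture, and clamping is only one possible way of keeping the reverse-KL objective finite where the target density vanishes.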
Related papers
- Diffusion Density Estimators [0.0]
We introduce a new, highly parallelizable method that computes log densities without the need to solve a flow.
Our approach is based on estimating a path integral by Monte Carlo, in a manner identical to the simulation-free training of diffusion models.
arXiv Detail & Related papers (2024-10-09T15:21:53Z) - Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z) - Conditional Pseudo-Reversible Normalizing Flow for Surrogate Modeling in Quantifying Uncertainty Propagation [11.874729463016227]
We introduce a conditional pseudo-reversible normalizing flow for constructing surrogate models of a physical model polluted by additive noise.
The training process utilizes a dataset consisting of input-output pairs, without requiring prior knowledge about the noise or the function.
Our model, once trained, can generate samples from any conditional probability density functions whose high probability regions are covered by the training set.
arXiv Detail & Related papers (2024-03-31T00:09:58Z) - Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
This method is statistically consistent and makes the inductive bias of the model clear and interpretable.
arXiv Detail & Related papers (2023-07-25T18:47:53Z) - High-dimensional and Permutation Invariant Anomaly Detection [0.1450405446885067]
We introduce a permutation-invariant density estimator for particle physics data based on diffusion models.
We demonstrate the efficacy of our methodology by utilizing the learned density as a permutation-invariant anomaly detection score.
arXiv Detail & Related papers (2023-06-06T18:01:03Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and optimizing the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z) - Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z) - Density Deconvolution with Normalizing Flows [26.395910367497592]
We exploit the superior density estimation performance of normalizing flows and allow for arbitrary noise distributions.
Experiments on real data demonstrate that flows can already out-perform Gaussian mixtures for density deconvolution.
arXiv Detail & Related papers (2020-06-16T18:00:04Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the mapping from the base density to the output space is conditioned on an input x, to model conditional densities p(y|x); a minimal sketch of this conditioning idea is given after this list.
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
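
The last entry above describes modeling conditional densities p(y|x) by conditioning the flow's base-to-output mapping on the input x. Below is a minimal sketch of that conditioning idea, not code from the cited paper: the single conditional affine layer, the network sizes, and the placeholder data are illustrative assumptions standing in for a deeper conditional flow.

import math
import torch

class ConditionalAffineFlow(torch.nn.Module):
    """A single conditional affine layer: y = shift(x) + exp(log_scale(x)) * z,
    with z ~ N(0, I). Real conditional flows stack many such invertible layers."""
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        # The conditioner maps x to the transform parameters (shift, log-scale).
        self.net = torch.nn.Sequential(
            torch.nn.Linear(x_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, 2 * y_dim),
        )

    def _params(self, x):
        shift, log_scale = self.net(x).chunk(2, dim=-1)
        return shift, log_scale

    def log_prob(self, y, x):
        # Inverse transform z = (y - shift(x)) * exp(-log_scale(x)); the change of
        # variables gives log p(y|x) = log N(z; 0, I) - sum(log_scale(x)).
        shift, log_scale = self._params(x)
        z = (y - shift) * torch.exp(-log_scale)
        log_base = (-0.5 * z ** 2 - 0.5 * math.log(2.0 * math.pi)).sum(-1)
        return log_base - log_scale.sum(-1)

    def sample(self, x):
        # Forward transform: draw z from the base and push it through the flow.
        shift, log_scale = self._params(x)
        z = torch.randn_like(shift)
        return shift + torch.exp(log_scale) * z

# Training maximizes the conditional log-likelihood on (x, y) pairs:
flow = ConditionalAffineFlow(x_dim=3, y_dim=2)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
x, y = torch.randn(256, 3), torch.randn(256, 2)    # placeholder data
for _ in range(200):
    loss = -flow.log_prob(y, x).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()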