Density Deconvolution with Normalizing Flows
- URL: http://arxiv.org/abs/2006.09396v2
- Date: Mon, 13 Jul 2020 10:58:53 GMT
- Title: Density Deconvolution with Normalizing Flows
- Authors: Tim Dockhorn, James A. Ritchie, Yaoliang Yu, Iain Murray
- Abstract summary: We exploit the superior density estimation performance of normalizing flows and allow for arbitrary noise distributions.
Experiments on real data demonstrate that flows can already outperform Gaussian mixtures for density deconvolution.
- Score: 26.395910367497592
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Density deconvolution is the task of estimating a probability density
function given only noise-corrupted samples. We can fit a Gaussian mixture
model to the underlying density by maximum likelihood if the noise is normally
distributed, but would like to exploit the superior density estimation
performance of normalizing flows and allow for arbitrary noise distributions.
Since both adjustments lead to an intractable likelihood, we resort to
amortized variational inference. We demonstrate some problems involved in this
approach; however, experiments on real data demonstrate that flows can already
outperform Gaussian mixtures for density deconvolution.
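To make the approach concrete, here is a minimal sketch (not the authors' code) of the amortized variational bound for deconvolution. It assumes observations y = x + eps with known Gaussian noise, uses a single affine transform as a toy stand-in for a normalizing-flow prior, and a diagonal-Gaussian amortized posterior q(x|y); all names and hyperparameters are illustrative.

```python
# Hypothetical sketch of amortized variational density deconvolution:
# maximize a Monte Carlo evidence lower bound (ELBO) on log p(y), where
# y = x + eps, p(x) is a "flow" prior, and q(x|y) is an amortized posterior.
import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Toy stand-in for a normalizing flow: an invertible affine map of a
    standard Gaussian base (a real model would stack coupling layers)."""
    def __init__(self, dim=1):
        super().__init__()
        self.shift = nn.Parameter(torch.zeros(dim))
        self.log_scale = nn.Parameter(torch.zeros(dim))

    def log_prob(self, x):
        z = (x - self.shift) * torch.exp(-self.log_scale)   # inverse map
        log_base = -0.5 * (z ** 2 + math.log(2 * math.pi))
        return (log_base - self.log_scale).sum(-1)          # change of variables

class AmortizedPosterior(nn.Module):
    """q(x | y): a diagonal Gaussian whose parameters are predicted from y."""
    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 2 * dim))

    def forward(self, y):
        mu, log_sigma = self.net(y).chunk(2, dim=-1)
        return mu, log_sigma

def elbo(y, flow, posterior, noise_std=0.5, n_samples=8):
    """E_q[log p_flow(x) + log p_noise(y - x) - log q(x|y)], reparameterized."""
    mu, log_sigma = posterior(y)
    eps = torch.randn(n_samples, *mu.shape)
    x = mu + torch.exp(log_sigma) * eps                     # x ~ q(x | y)
    log_prior = flow.log_prob(x)
    log_noise = (-0.5 * ((y - x) / noise_std) ** 2
                 - math.log(noise_std) - 0.5 * math.log(2 * math.pi)).sum(-1)
    log_q = (-0.5 * eps ** 2 - log_sigma - 0.5 * math.log(2 * math.pi)).sum(-1)
    return (log_prior + log_noise - log_q).mean()
```

Maximizing this bound jointly over the flow and posterior parameters (e.g., with Adam on the negative ELBO) fits the underlying density from noisy samples alone; the abstract cautions that this variational route brings problems of its own.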
Related papers
- Estimating Probability Densities with Transformer and Denoising Diffusion [0.0]
We show that training a probabilistic model using a denoising diffusion head on top of the Transformer provides reasonable probability density estimation.
We illustrate our Transformer+Denoising Diffusion model by training it on a large dataset of astronomical observations and measured labels of stars within our Galaxy.
arXiv Detail & Related papers (2024-07-22T15:10:41Z) - Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
This method is statistically consistent and makes the model's inductive bias clear and interpretable.
arXiv Detail & Related papers (2023-07-25T18:47:53Z) - Anomaly Detection with Variance Stabilized Density Estimation [49.46356430493534]
We formulate a variance-stabilized density estimation problem that maximizes the likelihood of the observed samples.
To obtain a reliable anomaly detector, we introduce a spectral ensemble of autoregressive models for learning the variance-stabilized distribution.
We have conducted an extensive benchmark with 52 datasets, demonstrating that our method leads to state-of-the-art results.
arXiv Detail & Related papers (2023-06-01T11:52:58Z) - Optimizing the Noise in Self-Supervised Learning: from Importance
Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation, which grounds this self-supervised task as an estimation problem of an energy-based model of the data (a toy NCE objective is sketched after this list).
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
arXiv Detail & Related papers (2023-01-23T19:57:58Z) - DensePure: Understanding Diffusion Models towards Adversarial Robustness [110.84015494617528]
We analyze the properties of diffusion models and establish the conditions under which they can enhance certified robustness.
We propose a new method, DensePure, designed to improve the certified robustness of a pretrained model (i.e., a classifier).
We show that this robust region is a union of multiple convex sets, and is potentially much larger than the robust regions identified in previous works.
arXiv Detail & Related papers (2022-11-01T08:18:07Z) - Robust Inference of Manifold Density and Geometry by Doubly Stochastic
Scaling [8.271859911016719]
We develop tools for robust inference under high-dimensional noise.
We show that our approach is robust to variability in technical noise levels across cell types.
arXiv Detail & Related papers (2022-09-16T15:39:11Z) - ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z) - Joint Manifold Learning and Density Estimation Using Normalizing Flows [4.939777212813711]
We introduce two approaches, namely per-pixel penalized log-likelihood and hierarchical training, to address this problem.
We propose a single-step method for joint manifold learning and density estimation by disentangling the transformed space.
Results validate the superiority of the proposed methods in simultaneous manifold learning and density estimation.
arXiv Detail & Related papers (2022-06-07T13:35:14Z) - Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z) - Conditional Density Estimation via Weighted Logistic Regressions [0.30458514384586394]
We propose a novel parametric conditional density estimation method by showing the connection between the general density and the likelihood function of inhomogeneous process models.
The maximum likelihood estimates can be obtained via weighted logistic regressions, and the computational cost can be significantly reduced by combining a block-wise alternating scheme with local case-control sampling.
arXiv Detail & Related papers (2020-10-21T11:08:25Z) - Efficient sampling generation from explicit densities via Normalizing
Flows [0.0]
We present a method based on normalizing flows that proposes a solution to the common problem of exploding reverse Kullback-Leibler divergence (a minimal sketch of this objective appears after this list).
The performance of the method is demonstrated on a multi-mode complex density function.
arXiv Detail & Related papers (2020-03-23T12:03:18Z)