Roundtrip: A Deep Generative Neural Density Estimator
- URL: http://arxiv.org/abs/2004.09017v4
- Date: Fri, 4 Sep 2020 07:17:33 GMT
- Title: Roundtrip: A Deep Generative Neural Density Estimator
- Authors: Qiao Liu, Jiaze Xu, Rui Jiang, Wing Hung Wong
- Abstract summary: We propose Roundtrip as a general-purpose neural density estimator based on deep generative models.
In a series of experiments, Roundtrip achieves state-of-the-art performance in a diverse range of density estimation tasks.
- Score: 6.704101978104295
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Density estimation is a fundamental problem in both statistics and machine
learning. In this study, we propose Roundtrip as a general-purpose neural
density estimator based on deep generative models. Roundtrip retains the
generative power of generative adversarial networks (GANs) but also provides
estimates of density values. Unlike previous neural density estimators that put
stringent conditions on the transformation from the latent space to the data
space, Roundtrip enables the use of much more general mappings. In a series of
experiments, Roundtrip achieves state-of-the-art performance in a diverse range
of density estimation tasks.
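The abstract's central idea, pairing a forward generator G: z → x with a backward mapping H: x → z so that density values can be estimated by importance sampling around the roundtrip point, can be illustrated on a toy model. The sketch below is an assumption-laden stand-in, not the paper's implementation: G is a known linear map, H its exact inverse, and the observation noise is Gaussian with fixed variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the learned networks:
# forward generator G: z -> x and backward mapping H: x -> z.
G = lambda z: 2.0 * z + 1.0
H = lambda x: (x - 1.0) / 2.0

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def roundtrip_density(x, sigma2=0.25, n=20_000, prop_var=1.0):
    """Importance-sampling estimate of p(x) = E_{z~N(0,1)}[ N(x; G(z), sigma2) ],
    using a proposal q(z) centred at the backward point H(x)."""
    z = rng.normal(H(x), np.sqrt(prop_var), size=n)              # z ~ q
    w = normal_pdf(z, 0.0, 1.0) / normal_pdf(z, H(x), prop_var)  # p(z) / q(z)
    return np.mean(w * normal_pdf(x, G(z), sigma2))

# For this linear toy model the exact marginal is N(1, 4 + sigma2),
# so the estimate can be checked directly.
x0 = 0.5
print(roundtrip_density(x0), normal_pdf(x0, 1.0, 4.25))
```

Because the proposal is centred where the generator maps close to the query point, the importance weights stay well behaved; this is the role the backward mapping plays in the estimator.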
Related papers
- Towards the Uncharted: Density-Descending Feature Perturbation for Semi-supervised Semantic Segmentation [51.66997548477913]
We propose a novel feature-level consistency learning framework named Density-Descending Feature Perturbation (DDFP).
Inspired by the low-density separation assumption in semi-supervised learning, our key insight is that feature density can shed light on the most promising direction for the segmentation classifier to explore.
The proposed DDFP outperforms other feature-level perturbation designs and achieves state-of-the-art performance on both the Pascal VOC and Cityscapes datasets.
arXiv Detail & Related papers (2024-03-11T06:59:05Z)
- Quantum Adaptive Fourier Features for Neural Density Estimation [0.0]
This paper presents a method for neural density estimation that can be seen as a type of kernel density estimation.
The method is based on density matrices, a formalism used in quantum mechanics, and adaptive Fourier features.
The method was evaluated on various synthetic and real datasets, and its performance was compared against state-of-the-art neural density estimation methods.
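As a concrete classical reference point for this entry: the standard random-Fourier-feature trick already turns a Gaussian kernel density estimate into a constant-time-per-query dot product. The sketch below assumes a fixed bandwidth and plain random features, not the adaptive, density-matrix-based features the paper describes.

```python
import numpy as np

rng = np.random.default_rng(1)
h, D = 0.5, 2000                              # bandwidth and number of features

# Random Fourier features approximating the Gaussian kernel exp(-(x-y)^2 / (2 h^2))
W = rng.normal(0.0, 1.0 / h, size=D)
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(x):
    return np.sqrt(2.0 / D) * np.cos(np.multiply.outer(x, W) + b)

data = rng.normal(0.0, 1.0, size=5000)        # samples from N(0, 1)
mean_feat = features(data).mean(axis=0)       # one vector summarises the dataset

def kde_rff(x):
    # phi(x) . mean_phi approximates (1/n) sum_i k(x, x_i);
    # dividing by h * sqrt(2*pi) normalises the Gaussian kernel to a density.
    return (features(x) @ mean_feat) / (h * np.sqrt(2.0 * np.pi))

print(kde_rff(0.0))  # close to the smoothed density N(0; 0, 1 + h^2)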
arXiv Detail & Related papers (2022-08-01T01:39:11Z)
- Rethinking Spatial Invariance of Convolutional Networks for Object Counting [119.83017534355842]
We use locally connected Gaussian kernels in place of the original convolution filter to estimate spatial positions in the density map.
Inspired by previous work, we propose a low-rank approximation accompanied by translation invariance to favorably implement the approximation of massive Gaussian convolution.
Our methods significantly outperform other state-of-the-art methods and achieve promising learning of the spatial position of objects.
arXiv Detail & Related papers (2022-06-10T17:51:25Z)
- TAKDE: Temporal Adaptive Kernel Density Estimator for Real-Time Dynamic Density Estimation [16.45003200150227]
We propose the temporal adaptive kernel density estimator (TAKDE) for real-time dynamic density estimation.
TAKDE is theoretically optimal in terms of the worst-case AMISE.
We provide numerical experiments using synthetic and real-world datasets, showing that TAKDE outperforms other state-of-the-art dynamic density estimators.
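TAKDE's AMISE-optimal weighting scheme is not reproduced here, but the general shape of a temporal-adaptive estimator, a KDE whose sample weights decay so the estimate tracks a drifting stream, can be sketched as follows. The bandwidth, decay rate, and window size are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

class StreamingKDE:
    """Illustrative exponentially-weighted KDE for streaming data
    (a much simpler scheme than TAKDE's worst-case-AMISE-optimal weighting)."""
    def __init__(self, bandwidth=0.3, decay=0.95, max_points=500):
        self.h, self.decay, self.max_points = bandwidth, decay, max_points
        self.points, self.weights = [], []

    def update(self, x):
        self.weights = [w * self.decay for w in self.weights]  # age old samples
        self.points.append(x)
        self.weights.append(1.0)
        self.points = self.points[-self.max_points:]           # bounded memory
        self.weights = self.weights[-self.max_points:]

    def density(self, x):
        pts, ws = np.array(self.points), np.array(self.weights)
        kern = np.exp(-0.5 * ((x - pts) / self.h) ** 2) / (self.h * np.sqrt(2.0 * np.pi))
        return float(np.dot(ws, kern) / ws.sum())

rng = np.random.default_rng(2)
kde = StreamingKDE()
for t in range(2000):
    mu = 0.0 if t < 1000 else 3.0            # the distribution drifts at t = 1000
    kde.update(rng.normal(mu, 0.5))
print(kde.density(3.0), kde.density(0.0))    # mass has tracked the drift
```

The exponential decay is what makes the estimator "temporal": old observations are never consulted at full weight once the stream has moved on.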
arXiv Detail & Related papers (2022-03-15T23:38:32Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
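The invariance claim in the last sentence follows from the change-of-variables formula: the invertible map's Jacobian appears in both the numerator and the denominator density, so it cancels in the ratio. A one-dimensional check, using hypothetical Gaussian densities and the invertible map f(x) = exp(x):

```python
import numpy as np

def normal_pdf(x, mu=0.0, sd=1.0):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

# Invertible feature map f(x) = exp(x), with inverse log and Jacobian df/dx = exp(x).
f, f_inv = np.exp, np.log

def pushforward(pdf, y):
    # change of variables: density of f(X) at y, when X ~ pdf
    x = f_inv(y)
    return pdf(x) / np.abs(f(x))   # |df/dx| evaluated at x (here exp(x) = y)

p = lambda x: normal_pdf(x, 0.0, 1.0)   # "numerator" distribution
q = lambda x: normal_pdf(x, 1.0, 1.0)   # "denominator" distribution

x0 = 0.7
ratio_input = p(x0) / q(x0)
y0 = f(x0)
ratio_feature = pushforward(p, y0) / pushforward(q, y0)
print(ratio_input, ratio_feature)       # equal: the Jacobian factors cancel
```

The same cancellation holds for any invertible map, which is why ratios learned in the feature space of a normalizing flow transfer back to input space.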
arXiv Detail & Related papers (2021-07-05T18:30:26Z)
- Deep Data Density Estimation through Donsker-Varadhan Representation [5.276937617129594]
We present a simple yet effective method for estimating the data density using a deep neural network and the Donsker-Varadhan variational lower bound on the KL divergence.
We show that the optimal critic function associated with the Donsker-Varadhan representation on the divergence between the data and the uniform distribution can estimate the data density.
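The two sentences above can be checked numerically with known densities standing in for the learned networks. The sketch below uses hypothetical choices (data distribution P = N(0,1), reference Q = Uniform(-5,5) rather than the paper's uniform-on-data-domain setup): it evaluates the Donsker-Varadhan bound E_P[T] - log E_Q[e^T] for the optimal critic T* = log(p/q) and for a weaker critic, and recovers the density as q(x)·exp(T*(x)).

```python
import numpy as np

rng = np.random.default_rng(3)

# P = N(0, 1); reference Q = Uniform(-5, 5), which has density 0.1 on its support.
def log_p(x):
    return -0.5 * x ** 2 - 0.5 * np.log(2.0 * np.pi)
log_q = np.log(0.1)

def dv_bound(T, n=200_000):
    """Monte Carlo Donsker-Varadhan lower bound on KL(P || Q)."""
    xp = rng.normal(0.0, 1.0, n)             # samples from P
    xq = rng.uniform(-5.0, 5.0, n)           # samples from Q
    return T(xp).mean() - np.log(np.mean(np.exp(T(xq))))

T_opt = lambda x: log_p(x) - log_q           # optimal critic: log density ratio
T_bad = lambda x: -np.abs(x)                 # an arbitrary suboptimal critic

kl_true = np.log(10.0) - 0.5 * np.log(2.0 * np.pi * np.e)  # KL(N(0,1) || U(-5,5))
print(dv_bound(T_opt), dv_bound(T_bad), kl_true)

# Density recovery: with the optimal critic, q(x) * exp(T*(x)) equals p(x).
x0 = 0.3
print(0.1 * np.exp(T_opt(x0)), np.exp(log_p(x0)))
```

Only the optimal critic attains the true KL value; any other critic gives a strictly smaller bound, which is exactly why maximizing the bound over a neural critic yields a density estimate.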
arXiv Detail & Related papers (2021-04-14T03:38:32Z)
- Nonparametric Density Estimation from Markov Chains [68.8204255655161]
We introduce a new nonparametric density estimator inspired by Markov chains, generalizing the well-known Kernel Density Estimator.
Our estimator presents several benefits with respect to the usual ones and can be used straightforwardly as a foundation in all density-based algorithms.
arXiv Detail & Related papers (2020-09-08T18:33:42Z)
- Variable Skipping for Autoregressive Range Density Estimation [84.60428050170687]
We present variable skipping, a technique for accelerating range density estimation over deep autoregressive models.
We show that variable skipping provides 10-100$\times$ efficiency improvements when targeting challenging high-quantile error metrics.
arXiv Detail & Related papers (2020-07-10T19:01:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.