Nonparametric Density Estimation from Markov Chains
- URL: http://arxiv.org/abs/2009.03937v1
- Date: Tue, 8 Sep 2020 18:33:42 GMT
- Title: Nonparametric Density Estimation from Markov Chains
- Authors: Andrea De Simone, Alessandro Morandini
- Abstract summary: We introduce a new nonparametric density estimator inspired by Markov Chains and generalizing the well-known Kernel Density Estimator (KDE).
Our estimator presents several benefits with respect to the usual ones and can be used straightforwardly as a foundation in all density-based algorithms.
- Score: 68.8204255655161
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a new nonparametric density estimator inspired by Markov Chains,
and generalizing the well-known Kernel Density Estimator (KDE). Our estimator
presents several benefits with respect to the usual ones and can be used
straightforwardly as a foundation in all density-based algorithms. We prove the
consistency of our estimator and we find it typically outperforms KDE in
situations of large sample size and high dimensionality. We also employ our
density estimator to build a local outlier detector, showing very promising
results when applied to some realistic datasets.
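The abstract does not spell out the construction of the Markov-chain-inspired estimator, so the sketch below only illustrates the setting it generalizes: a plain Gaussian KDE together with a simple density-based local outlier score. The function names, bandwidth, and neighbour-ratio score are illustrative assumptions, not the paper's method.

```python
# Minimal sketch, NOT the paper's estimator: a plain Gaussian KDE (the baseline
# the proposed estimator generalizes) and an illustrative density-based local
# outlier score. Bandwidth, k, and the neighbour-ratio score are assumptions.
import numpy as np

def gaussian_kde(queries, samples, bandwidth=1.0):
    """Average of Gaussian kernels centred on the samples, evaluated at queries."""
    n, d = samples.shape
    sq_dist = np.sum((queries[:, None, :] - samples[None, :, :]) ** 2, axis=-1)
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return np.exp(-0.5 * sq_dist / bandwidth ** 2).mean(axis=1) / norm

def local_outlier_scores(samples, bandwidth=1.0, k=10):
    """Score points by the average density of their k nearest neighbours
    divided by their own density; isolated points get large scores."""
    dens = gaussian_kde(samples, samples, bandwidth)
    dists = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
    nbrs = np.argsort(dists, axis=1)[:, 1:k + 1]   # skip the point itself
    return dens[nbrs].mean(axis=1) / (dens + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bulk = rng.normal(0.0, 1.0, size=(200, 2))
    outliers = np.array([[8.0, 8.0], [-8.0, 8.0], [8.0, -8.0]])
    X = np.vstack([bulk, outliers])
    scores = local_outlier_scores(X)
    print(np.argsort(scores)[-3:])   # the appended points (indices 200-202) should rank highest
```

Swapping `gaussian_kde` for the paper's estimator would leave the scoring step unchanged, which is the sense in which a density estimator can serve "as a foundation in all density-based algorithms."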
Related papers
- Variational Weighting for Kernel Density Ratios [11.555375654882525]
Kernel density estimation (KDE) is integral to a range of generative and discriminative tasks in machine learning.
We derive an optimal weight function that reduces bias in standard kernel density estimates for density ratios, leading to improved estimates of prediction posteriors and information-theoretic measures.
arXiv Detail & Related papers (2023-11-06T10:12:19Z)
- Anomaly Detection with Variance Stabilized Density Estimation [49.46356430493534]
We present a variance-stabilized density estimation problem for maximizing the likelihood of the observed samples.
To obtain a reliable anomaly detector, we introduce a spectral ensemble of autoregressive models for learning the variance-stabilized distribution.
We have conducted an extensive benchmark with 52 datasets, demonstrating that our method leads to state-of-the-art results.
arXiv Detail & Related papers (2023-06-01T11:52:58Z)
- Disentangling Learning Representations with Density Estimation [9.244163477446799]
We present a method which achieves reliable disentanglement via flexible density estimation of the latent space.
GCAE achieves highly competitive and reliable disentanglement scores compared with state-of-the-art baselines.
arXiv Detail & Related papers (2023-02-08T22:37:33Z)
- Fast Kernel Density Estimation with Density Matrices and Random Fourier Features [0.0]
Kernel density estimation (KDE) is one of the most widely used nonparametric density estimation methods.
DMKDE uses density matrices, a quantum mechanical formalism, and random Fourier features, an explicit kernel approximation, to produce density estimates.
DMKDE is on par with its competitors at computing density estimates and shows advantages on high-dimensional data (a rough sketch of the random-Fourier-feature ingredient appears after this list).
arXiv Detail & Related papers (2022-08-02T02:11:10Z)
- TAKDE: Temporal Adaptive Kernel Density Estimator for Real-Time Dynamic Density Estimation [16.45003200150227]
We introduce the temporal adaptive kernel density estimator (TAKDE).
TAKDE is theoretically optimal in terms of the worst-case asymptotic mean integrated squared error (AMISE).
We provide numerical experiments using synthetic and real-world datasets, showing that TAKDE outperforms other state-of-the-art dynamic density estimators.
arXiv Detail & Related papers (2022-03-15T23:38:32Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms (this naive density is sketched after this list).
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space, since the change-of-variables Jacobian factors cancel in the ratio.
arXiv Detail & Related papers (2021-07-05T18:30:26Z)
- Meta-Learning for Relative Density-Ratio Estimation [59.75321498170363]
Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities.
We propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge in related datasets.
We empirically demonstrate the effectiveness of the proposed method by using three problems: relative DRE, dataset comparison, and outlier detection.
arXiv Detail & Related papers (2021-07-02T02:13:45Z)
- Imitation with Neural Density Models [98.34503611309256]
We propose a new framework for Imitation Learning (IL) via density estimation of the expert's occupancy measure followed by Imitation Occupancy Entropy Reinforcement Learning (RL) using the density as a reward.
Our approach maximizes a non-adversarial model-free RL objective that provably lower bounds reverse Kullback-Leibler divergence between occupancy measures of the expert and imitator.
arXiv Detail & Related papers (2020-10-19T19:38:36Z)
- Roundtrip: A Deep Generative Neural Density Estimator [6.704101978104295]
We propose Roundtrip as a general-purpose neural density estimator based on deep generative models.
In a series of experiments, Roundtrip achieves state-of-the-art performance in a diverse range of density estimation tasks.
arXiv Detail & Related papers (2020-04-20T01:47:00Z)
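The "Fast Kernel Density Estimation with Density Matrices and Random Fourier Features" entry above mentions random Fourier features as an explicit kernel approximation. The density-matrix part of DMKDE is not reproduced here; the sketch below only shows, under assumed names and normalization, how random Fourier features can turn the Gaussian-kernel sum inside KDE into a query cost independent of the sample size.

```python
# Hedged sketch of the random-Fourier-feature ingredient only (not DMKDE itself):
# approximate exp(-||x - y||^2 / (2 h^2)) by phi(x) . phi(y) with random features,
# so the KDE sum over n samples collapses into a single precomputed mean vector.
import numpy as np

def rff_map(X, W, b):
    """phi(x) = sqrt(2/D) * cos(W x + b), with W ~ N(0, 1/h^2) and b ~ U[0, 2*pi)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def fit_rff_kde(samples, bandwidth=1.0, n_features=512, seed=0):
    rng = np.random.default_rng(seed)
    d = samples.shape[1]
    W = rng.normal(scale=1.0 / bandwidth, size=(n_features, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    mean_phi = rff_map(samples, W, b).mean(axis=0)        # one pass over the data
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)    # Gaussian kernel normaliser
    return W, b, mean_phi, norm

def rff_kde(queries, W, b, mean_phi, norm):
    # phi(q) . mean_phi ~= (1/n) sum_i exp(-||q - x_i||^2 / (2 h^2));
    # the approximation is random and can dip slightly below zero, hence the clip.
    return np.maximum(rff_map(queries, W, b) @ mean_phi, 0.0) / norm
```

Once `mean_phi` is computed, each new query costs O(D*d) regardless of the number of training samples, which is the speed-up such explicit feature maps aim for.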
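The "Density-Based Clustering with Kernel Diffusion" entry contrasts its adaptive kernel with the naive density given by the indicator of a Euclidean ball. A minimal version of that naive density (function names are illustrative; DBSCAN-style algorithms effectively threshold this neighbour count) is:

```python
# Naive ball-indicator density from the kernel-diffusion entry: the fraction of
# samples inside a radius-h ball around the query, divided by the ball volume.
import math
import numpy as np

def ball_volume(d, radius):
    """Volume of a d-dimensional Euclidean ball of the given radius."""
    return math.pi ** (d / 2.0) / math.gamma(d / 2.0 + 1.0) * radius ** d

def naive_ball_density(queries, samples, radius=1.0):
    d = samples.shape[1]
    dists = np.linalg.norm(queries[:, None, :] - samples[None, :, :], axis=-1)
    counts = (dists <= radius).sum(axis=1)
    return counts / (samples.shape[0] * ball_volume(d, radius))
```

Because the indicator weights every neighbour equally and ignores local smoothness, the estimate is piecewise constant, which is the limitation the kernel diffusion density is designed to address.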
This list is automatically generated from the titles and abstracts of the papers on this site.