TAKDE: Temporal Adaptive Kernel Density Estimator for Real-Time Dynamic
Density Estimation
- URL: http://arxiv.org/abs/2203.08317v2
- Date: Wed, 8 Nov 2023 19:25:25 GMT
- Title: TAKDE: Temporal Adaptive Kernel Density Estimator for Real-Time Dynamic
Density Estimation
- Authors: Yinsong Wang, Yu Ding, Shahin Shahrampour
- Abstract summary: We propose the temporal adaptive kernel density estimator (TAKDE).
TAKDE is theoretically optimal in terms of the worst-case AMISE.
We provide numerical experiments using synthetic and real-world datasets, showing that TAKDE outperforms other state-of-the-art dynamic density estimators.
- Score: 16.45003200150227
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-time density estimation is ubiquitous in many applications, including
computer vision and signal processing. Kernel density estimation is arguably
one of the most commonly used density estimation techniques, and the use of
a "sliding window" mechanism adapts kernel density estimators to dynamic
processes. In this paper, we derive the asymptotic mean integrated squared
error (AMISE) upper bound for the "sliding window" kernel density estimator.
This upper bound provides a principled guide to devise a novel estimator, which
we name the temporal adaptive kernel density estimator (TAKDE). Compared to
heuristic approaches for "sliding window" kernel density estimators, TAKDE is
theoretically optimal in terms of the worst-case AMISE. We provide numerical
experiments using synthetic and real-world datasets, showing that TAKDE
outperforms other state-of-the-art dynamic density estimators (including those
outside of kernel family). In particular, TAKDE achieves a superior test
log-likelihood with a smaller runtime.
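The "sliding window" mechanism described in the abstract can be sketched as a plain Gaussian KDE refit to the most recent samples of a stream. This is an illustrative baseline only, not the TAKDE weighting scheme; the function names, the uniform within-window weighting, and the Silverman rule-of-thumb bandwidth are assumptions made for the sketch.

```python
import numpy as np

def gaussian_kde(samples, x, bandwidth):
    """Evaluate a Gaussian kernel density estimate at the grid points x."""
    diffs = (x[:, None] - samples[None, :]) / bandwidth
    norm = len(samples) * bandwidth * np.sqrt(2 * np.pi)
    return np.exp(-0.5 * diffs ** 2).sum(axis=1) / norm

def sliding_window_kde(stream, x, window=200):
    """At each time step, fit a KDE to the most recent `window` samples."""
    estimates = []
    for t in range(window, len(stream) + 1):
        recent = stream[t - window:t]
        # Silverman's rule-of-thumb bandwidth for the current window.
        h = 1.06 * recent.std() * window ** (-1 / 5)
        estimates.append(gaussian_kde(recent, x, h))
    return np.array(estimates)

# A stream whose underlying density drifts over time.
rng = np.random.default_rng(0)
stream = rng.normal(loc=np.linspace(0.0, 3.0, 1000), scale=1.0)
grid = np.linspace(-4.0, 8.0, 100)
densities = sliding_window_kde(stream, grid, window=200)
```

TAKDE replaces the uniform within-window weighting above with weights and bandwidths chosen against the worst-case AMISE bound derived in the paper.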
Related papers
- Tracking Dynamic Gaussian Density with a Theoretically Optimal Sliding
Window Approach [16.45003200150227]
We study the exact mean integrated squared error (MISE) of "sliding window" Kernel Density Estimators for evolving densities.
We present empirical evidence with synthetic datasets to show that our weighting scheme improves the tracking performance.
arXiv Detail & Related papers (2024-03-11T23:21:26Z)
- Fast Kernel Density Estimation with Density Matrices and Random Fourier
Features [0.0]
Kernel density estimation (KDE) is one of the most widely used nonparametric density estimation methods.
DMKDE uses density matrices, a quantum mechanical formalism, and random Fourier features, an explicit kernel approximation, to produce density estimates.
DMKDE is on par with its competitors for computing density estimates and advantages are shown when performed on high-dimensional data.
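The random Fourier feature ingredient mentioned above can be sketched in isolation: RFF give an explicit finite-dimensional map z with z(x)·z(y) ≈ k(x, y) for a Gaussian kernel, so the kernel mean over n samples collapses to a single dot product with a cached mean feature vector. This shows only the RFF approximation, not DMKDE's density-matrix formalism or its normalization; all names and parameter values are assumptions.

```python
import numpy as np

def rff_features(x, W, b):
    """Random Fourier feature map approximating a Gaussian (RBF) kernel.

    x: (n,) 1-D data points; W: (D, 1) frequencies; b: (D,) phases.
    Returns an (n, D) feature matrix z with z(x) . z(y) ~= k(x, y).
    """
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(W @ np.atleast_2d(x) + b[:, None]).T

rng = np.random.default_rng(1)
gamma = 0.5          # k(x, y) = exp(-gamma * (x - y)^2)
D = 2000             # number of random features
# For this kernel, frequencies are drawn from N(0, 2 * gamma).
W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, 1))
b = rng.uniform(0.0, 2 * np.pi, size=D)

samples = rng.normal(size=500)
# Cache the mean feature vector once; each query is then O(D), not O(n).
mean_phi = rff_features(samples, W, b).mean(axis=0)

def approx_kernel_mean(q):
    """~ (1/n) sum_i k(q, x_i), via one dot product with the cached mean."""
    return (rff_features(np.array([q]), W, b) @ mean_phi)[0]

def exact_kernel_mean(q):
    return np.exp(-gamma * (q - samples) ** 2).mean()
```

Turning the approximate kernel mean into a properly normalized density requires the kernel's normalization constant, which is omitted here.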
arXiv Detail & Related papers (2022-08-02T02:11:10Z)
- Learning Transfer Operators by Kernel Density Estimation [0.0]
We recast the problem within the framework of statistical density estimation.
We demonstrate the validity and effectiveness of this approach in estimating the eigenvectors of the Frobenius-Perron operator.
We suggest the possibility of incorporating other density estimation methods into this field.
arXiv Detail & Related papers (2022-08-01T14:28:10Z)
- Quantum Adaptive Fourier Features for Neural Density Estimation [0.0]
This paper presents a method for neural density estimation that can be seen as a type of kernel density estimation.
The method is based on density matrices, a formalism used in quantum mechanics, and adaptive Fourier features.
The method was evaluated in different synthetic and real datasets, and its performance compared against state-of-the-art neural density estimation methods.
arXiv Detail & Related papers (2022-08-01T01:39:11Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Tensor-Train Density Estimation [16.414910030716555]
We propose a new efficient tensor-train-based model for density estimation (TTDE).
Such density parametrization allows exact sampling, calculation of cumulative and marginal density functions, and partition function.
We show that TTDE significantly outperforms competitors in training speed.
arXiv Detail & Related papers (2021-07-30T21:51:12Z)
- Meta-Learning for Relative Density-Ratio Estimation [59.75321498170363]
Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities.
We propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge in related datasets.
We empirically demonstrate the effectiveness of the proposed method by using three problems: relative DRE, dataset comparison, and outlier detection.
arXiv Detail & Related papers (2021-07-02T02:13:45Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- GBHT: Gradient Boosting Histogram Transform for Density Estimation [73.94900378709023]
We propose a density estimation algorithm called Gradient Boosting Histogram Transform (GBHT).
We make the first attempt to theoretically explain why boosting can enhance the performance of its base learners for density estimation problems.
arXiv Detail & Related papers (2021-06-10T13:40:28Z)
- Imitation with Neural Density Models [98.34503611309256]
We propose a new framework for Imitation Learning (IL) via density estimation of the expert's occupancy measure followed by Imitation Occupancy Entropy Reinforcement Learning (RL) using the density as a reward.
Our approach maximizes a non-adversarial model-free RL objective that provably lower bounds reverse Kullback-Leibler divergence between occupancy measures of the expert and imitator.
arXiv Detail & Related papers (2020-10-19T19:38:36Z)
- Nonparametric Density Estimation from Markov Chains [68.8204255655161]
We introduce a new nonparametric density estimator inspired by Markov Chains, generalizing the well-known Kernel Density Estimator.
Our estimator presents several benefits with respect to the usual ones and can be used straightforwardly as a foundation in all density-based algorithms.
arXiv Detail & Related papers (2020-09-08T18:33:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the summaries (including all information) and is not responsible for any consequences of their use.