A Non-Classical Parameterization for Density Estimation Using Sample
Moments
- URL: http://arxiv.org/abs/2201.04786v5
- Date: Tue, 4 Jul 2023 14:42:44 GMT
- Title: A Non-Classical Parameterization for Density Estimation Using Sample
Moments
- Authors: Guangyu Wu, Anders Lindquist
- Abstract summary: We propose a non-classical parametrization for density estimation using sample moments.
The proposed estimator is the first one in the literature for which the power moments up to an arbitrary even order exactly match the sample moments.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Probability density estimation is a core problem of statistics and signal
processing. Moment methods are an important means of density estimation, but
they are generally strongly dependent on the choice of feasible functions,
which severely affects the performance. In this paper, we propose a
non-classical parametrization for density estimation using sample moments,
which does not require the choice of such functions. The parametrization is
induced by the squared Hellinger distance, and its solution, which is proved
to exist and to be unique subject to a simple prior that does not depend on the
data, can be obtained by convex optimization. Statistical properties of the
density estimator, together with an asymptotic upper bound on the estimation
error, are established for the estimator by power moments. Applications of the proposed density
estimator in signal processing tasks are given. Simulation results validate the
performance of the estimator by a comparison to several prevailing methods. To
the best of our knowledge, the proposed estimator is the first one in the
literature for which the power moments up to an arbitrary even order exactly
match the sample moments, while the true density is not assumed to fall within
specific function classes.
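To make the moment-matching idea concrete, the following is a minimal sketch of a classical maximum-entropy fit whose power moments match the sample moments, obtained by convex optimization of the dual problem in the natural parameters. It is an illustrative baseline only, not the Hellinger-distance-induced parametrization proposed in the paper; the grid support, moment order, and toy data are assumptions of this example.

```python
# Illustrative maximum-entropy moment matching (not the paper's Hellinger-based
# estimator): fit p(x) proportional to exp(sum_k lam_k * x**k) on a bounded grid
# so that its power moments equal the sample moments, via the convex dual.
import numpy as np
from scipy.optimize import minimize

def sample_moments(x, order):
    """Empirical power moments m_k = mean(x**k) for k = 1..order."""
    return np.array([np.mean(x ** k) for k in range(1, order + 1)])

def maxent_density(moments, grid):
    """Solve min_lam log Z(lam) - lam . m; the gradient E_p[x**k] - m_k vanishes
    exactly when the model moments match the sample moments."""
    order = len(moments)
    powers = np.vstack([grid ** k for k in range(1, order + 1)])  # shape (order, G)
    dx = grid[1] - grid[0]

    def density(lam):
        logits = lam @ powers
        w = np.exp(logits - logits.max())          # numerically stable weights
        return w / (w.sum() * dx)                  # normalized density on the grid

    def dual(lam):
        logits = lam @ powers
        m = logits.max()
        log_z = m + np.log(np.sum(np.exp(logits - m)) * dx)
        return log_z - lam @ moments

    def grad(lam):
        return powers @ density(lam) * dx - moments  # model moments minus targets

    lam_hat = minimize(dual, np.zeros(order), jac=grad, method="BFGS").x
    return density(lam_hat)

# Toy usage: match the first four sample moments of a two-component Gaussian mixture.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2.0, 0.7, 2000), rng.normal(1.5, 1.0, 2000)])
grid = np.linspace(-6.0, 6.0, 801)
p_hat = maxent_density(sample_moments(samples, 4), grid)
dx = grid[1] - grid[0]
print([round(float(np.sum(grid ** k * p_hat) * dx), 3) for k in range(1, 5)])
```

In this toy run the printed model moments agree with the sample moments up to numerical-integration error, which illustrates the exact moment-matching property the abstract refers to, here achieved by a different, classical parametrization.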
Related papers
- A quasi-Bayesian sequential approach to deconvolution density estimation [7.10052009802944]
Density deconvolution addresses the estimation of the unknown density function $f$ of a random signal from data.
We consider the problem of density deconvolution in a streaming or online setting where noisy data arrive progressively.
By relying on a quasi-Bayesian sequential approach, we obtain estimates of $f$ that are of easy evaluation.
arXiv Detail & Related papers (2024-08-26T16:40:04Z)
- Score Matching-based Pseudolikelihood Estimation of Neural Marked Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes (STPPs) with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked STPPs through score-matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z)
- Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
This method is statistically consistent, and makes the inductive bias of the model clear and interpretable.
arXiv Detail & Related papers (2023-07-25T18:47:53Z)
- Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss between the real data and artificial noise (a generic NCE sketch appears after this list).
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
arXiv Detail & Related papers (2023-06-13T01:18:16Z)
- MESSY Estimation: Maximum-Entropy based Stochastic and Symbolic densitY Estimation [4.014524824655106]
MESSY estimation is a Maximum-Entropy based Stochastic and Symbolic densitY estimation method.
We construct a gradient-based drift-diffusion process that connects samples of the unknown distribution function to a guess symbolic expression.
We find that the addition of a symbolic search for basis functions improves the accuracy of the estimation at a reasonable additional computational cost.
arXiv Detail & Related papers (2023-06-07T03:28:47Z)
- Anomaly Detection with Variance Stabilized Density Estimation [49.46356430493534]
We present a variance-stabilized density estimation problem for maximizing the likelihood of the observed samples.
To obtain a reliable anomaly detector, we introduce a spectral ensemble of autoregressive models for learning the variance-stabilized distribution.
We have conducted an extensive benchmark with 52 datasets, demonstrating that our method leads to state-of-the-art results.
arXiv Detail & Related papers (2023-06-01T11:52:58Z)
- Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results both in the sample regime and in the finite regime.
arXiv Detail & Related papers (2022-10-03T06:09:01Z)
- Tensor-Train Density Estimation [16.414910030716555]
We propose a new efficient tensor-train-based model for density estimation (TTDE).
Such a density parametrization allows exact sampling and calculation of the cumulative and marginal density functions, as well as of the partition function.
We show that TTDE significantly outperforms competitors in training speed.
arXiv Detail & Related papers (2021-07-30T21:51:12Z)
- Efficient Interpolation of Density Estimators [23.154249845820306]
We study the problem of space and time efficient evaluation of a nonparametric estimator that approximates an unknown density.
Our result gives a new statistical perspective on the problem of fast evaluation of kernel density estimators in the presence of underlying smoothness.
arXiv Detail & Related papers (2020-11-10T06:05:00Z)
- Low-rank Characteristic Tensor Density Estimation Part I: Foundations [38.05393186002834]
We propose a novel approach that builds upon tensor factorization tools.
In order to circumvent the curse of dimensionality, we introduce a low-rank model of this characteristic tensor.
We demonstrate the very promising performance of the proposed method using several measured datasets.
arXiv Detail & Related papers (2020-08-27T18:06:19Z)
- Instability, Computational Efficiency and Statistical Accuracy [101.32305022521024]
We develop a framework that yields statistical accuracy based on the interplay between the deterministic convergence rate of the algorithm at the population level and its degree of instability when applied to an empirical object based on $n$ samples.
We provide applications of our general results to several concrete classes of models, including Gaussian mixture estimation, non-linear regression models, and informative non-response models.
arXiv Detail & Related papers (2020-05-22T22:30:52Z)
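As background for the "Learning Unnormalized Statistical Models via Compositional Optimization" entry above, here is a generic noise-contrastive estimation (NCE) sketch. It is not the compositional-optimization method of that paper: an unnormalized 1-D Gaussian model with a learned log-normalizer is fitted with the logistic loss that separates real samples from artificial noise of known density; all parameter values are illustrative.

```python
# Generic NCE sketch (not the cited paper's method): fit an unnormalized
# 1-D Gaussian model exp(-0.5*((x - mu)/sigma)**2 + c) by logistic regression
# of real samples against noise samples drawn from a known density.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
x_data = rng.normal(2.0, 1.5, 5000)           # real samples (parameters unknown to the model)
x_noise = rng.normal(0.0, 4.0, 5000)          # artificial noise with known density
log_pn = lambda x: norm.logpdf(x, 0.0, 4.0)   # log-density of the noise distribution

def log_pm(x, theta):
    """Unnormalized model log-density; theta = (mu, log_sigma, c), where c is
    the log-normalizer that NCE learns alongside the shape parameters."""
    mu, log_sigma, c = theta
    return -0.5 * ((x - mu) / np.exp(log_sigma)) ** 2 + c

def nce_loss(theta):
    # Logistic loss for classifying data vs. noise via g(x) = log p_m(x) - log p_n(x):
    # -log sigmoid(g) = logaddexp(0, -g) and -log sigmoid(-g) = logaddexp(0, g).
    g_data = log_pm(x_data, theta) - log_pn(x_data)
    g_noise = log_pm(x_noise, theta) - log_pn(x_noise)
    return np.mean(np.logaddexp(0.0, -g_data)) + np.mean(np.logaddexp(0.0, g_noise))

theta_hat = minimize(nce_loss, x0=np.zeros(3), method="BFGS").x
print(f"mu ~ {theta_hat[0]:.2f}, sigma ~ {np.exp(theta_hat[1]):.2f}")  # should be near (2.0, 1.5)
```

The learned offset c plays the role of the log-normalizing constant, which is what lets NCE fit unnormalized models without ever computing the partition function.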