Fundamental bounds for parameter estimation with few measurements
- URL: http://arxiv.org/abs/2402.14495v1
- Date: Thu, 22 Feb 2024 12:40:08 GMT
- Title: Fundamental bounds for parameter estimation with few measurements
- Authors: Valentin Gebhart, Manuel Gessner, Augusto Smerzi
- Abstract summary: We discuss different linear (Barankin-like) conditions that can be imposed on estimators and analyze when these conditions admit an optimal estimator with finite variance.
We show that, if the number of imposed conditions is larger than the number of measurement outcomes, there generally does not exist a corresponding estimator with finite variance.
We derive an extended Cramér-Rao bound that is compatible with a finite variance in situations where the Barankin bound is undefined.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bounding the optimal precision in parameter estimation tasks is of central
importance for technological applications. In the regime of a small number of
measurements, or that of low signal-to-noise ratios, the meaning of common
frequentist bounds such as the Cramér-Rao bound (CRB) becomes questionable.
Here, we discuss different linear (Barankin-like) conditions that can be
imposed on estimators and analyze when these conditions admit an optimal
estimator with finite variance, for any number of measurement repetitions. We
show that, if the number of imposed conditions is larger than the number of
measurement outcomes, there generally does not exist a corresponding estimator
with finite variance. We analyze this result from different viewpoints and through examples, and elaborate on connections to the shot-noise limit and the Kitaev
phase estimation algorithm. We then derive an extended Cramér-Rao bound that
is compatible with a finite variance in situations where the Barankin bound is
undefined. Finally, we present an illustrative numerical comparison between frequentist and Bayesian approaches to parameter estimation.
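To make the few-measurement regime concrete, here is a minimal sketch (a hypothetical toy model, not the authors' code): a binary phase measurement with $p(0|\theta)=\cos^2(\theta/2)$, for which the single-shot Fisher information equals 1, compared against the empirical error of the maximum-likelihood estimator over few repetitions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (not the authors' code): a binary phase measurement
# with p(0|theta) = cos^2(theta/2) and p(1|theta) = sin^2(theta/2).
theta = 1.0                                  # true phase
p0 = np.cos(theta / 2.0) ** 2

# Single-shot Fisher information F = sum_x (dp_x/dtheta)^2 / p_x (equals 1 here).
dp0 = -np.sin(theta) / 2.0
F = dp0 ** 2 / p0 + dp0 ** 2 / (1.0 - p0)

for m in (3, 10, 100):                       # number of measurement repetitions
    crb = 1.0 / (m * F)                      # Cramer-Rao bound on the variance
    # Maximum-likelihood estimator: invert the empirical frequency of outcome 0.
    n0 = rng.binomial(m, p0, size=20000)
    phat = np.clip(n0 / m, 1e-12, 1.0 - 1e-12)
    mle = 2.0 * np.arccos(np.sqrt(phat))
    mse = np.mean((mle - theta) ** 2)
    print(f"m={m:4d}  CRB={crb:.4f}  empirical MSE of MLE={mse:.4f}")
```

For small m the empirical error lies well above the CRB (the estimator is biased and quantized to a few values), while for large m the two approximately agree, illustrating why the abstract calls the bound's meaning questionable in the few-measurement regime.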
Related papers
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
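For illustration, a minimal sketch of plain one-nearest-neighbor matching on synthetic data (a hypothetical example; the paper's estimators are modified, bias-corrected versions of this idea):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Synthetic observational data (hypothetical): covariates X, treatment T, outcome Y.
n = 2000
X = rng.normal(size=(n, 2))
T = rng.binomial(1, 1.0 / (1.0 + np.exp(-X[:, 0])))       # confounded treatment
Y = X.sum(axis=1) + 2.0 * T + rng.normal(size=n)          # true effect = 2

# Plain 1-NN matching: impute each unit's unobserved potential outcome with the
# outcome of its nearest covariate neighbor in the opposite treatment arm.
tree1, tree0 = cKDTree(X[T == 1]), cKDTree(X[T == 0])
Y1_hat = np.where(T == 1, Y, Y[T == 1][tree1.query(X)[1]])
Y0_hat = np.where(T == 0, Y, Y[T == 0][tree0.query(X)[1]])
print("matched estimate of the average treatment effect:", np.mean(Y1_hat - Y0_hat))
```

Plain 1-NN matching of this kind carries a matching bias in higher dimensions; removing it while keeping the $\sqrt{n}$-rate is what the paper's modifications address.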
- On the existence of unbiased resilient estimators in discrete quantum systems [0.0]
We compare the performance of Cramér-Rao and Bhattacharyya bounds when faced with less-than-ideal prior knowledge of the parameter.
For a given system dimension, one can construct estimators in quantum systems that exhibit increased robustness to prior ignorance.
arXiv Detail & Related papers (2024-02-23T10:12:35Z)
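A minimal numerical sketch of the comparison above (a hypothetical three-outcome model, not from the paper): the order-2 Bhattacharyya matrix $K_{mn} = \sum_x \partial_\theta^m p(x|\theta)\,\partial_\theta^n p(x|\theta)/p(x|\theta)$ yields the bound $[K^{-1}]_{11}$, which is never smaller than the CRB $1/K_{11}$.

```python
import numpy as np

def probs(theta):
    # Hypothetical three-outcome model: two independent qubit shots with
    # q = sin^2(theta/2); the outcome is the number of '1' results (0, 1, 2).
    q = np.sin(theta / 2.0) ** 2
    return np.array([(1 - q) ** 2, 2 * q * (1 - q), q ** 2])

theta, h = 1.0, 1e-4
p = probs(theta)
d1 = (probs(theta + h) - probs(theta - h)) / (2 * h)            # dp/dtheta
d2 = (probs(theta + h) - 2 * p + probs(theta - h)) / h ** 2     # d^2p/dtheta^2

# Bhattacharyya matrix K_mn = sum_x p^(m)(x) p^(n)(x) / p(x), m, n in {1, 2}.
D = np.vstack([d1, d2])
K = D @ np.diag(1.0 / p) @ D.T

crb = 1.0 / K[0, 0]                     # first-order bound (Cramer-Rao)
bhatt = np.linalg.inv(K)[0, 0]          # second-order Bhattacharyya bound
print(f"CRB = {crb:.4f}   Bhattacharyya bound = {bhatt:.4f}")
```

For a binary measurement the matrix K in this sketch would be singular (both derivative vectors lie in the one-dimensional tangent space of the probability simplex), echoing the main paper's observation that too many imposed conditions relative to the number of outcomes render such bounds undefined.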
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to be estimated lies on a smooth manifold.
It induces a geometry on the parameter manifold, as well as an intrinsic measure of the estimation error.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Overlapping Batch Confidence Intervals on Statistical Functionals Constructed from Time Series: Application to Quantiles, Optimization, and Estimation [5.068678962285631]
We propose a confidence interval procedure for statistical functionals constructed using data from a stationary time series.
The OBx limits, certain functionals of the Wiener process parameterized by the batch size and the extent of batch overlap, form the essential machinery for characterizing dependence.
arXiv Detail & Related papers (2023-07-17T16:21:48Z)
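A minimal sketch of the classical overlapping-batch-means special case for the mean of a stationary series (assumed AR(1) data; the paper's OBx procedures generalize this machinery to quantiles and other functionals):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stationary AR(1) series as a stand-in for the paper's time-series data (assumed).
n, phi = 10000, 0.7
y = np.empty(n)
y[0] = rng.normal()
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

# Overlapping batch means: all n-b+1 batch means of size b, whose spread
# estimates the long-run variance of the series.
b = int(n ** 0.5)
means = np.convolve(y, np.ones(b) / b, "valid")
var_obm = n * b / ((n - b) * (n - b + 1)) * np.sum((means - y.mean()) ** 2)

half = 1.96 * np.sqrt(var_obm / n)       # normal-approximation confidence interval
print(f"mean = {y.mean():.4f} +/- {half:.4f}")
```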
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- Nonparametric Quantile Regression: Non-Crossing Constraints and Conformal Prediction [2.654399717608053]
We propose a nonparametric quantile regression method using deep neural networks with a rectified linear unit penalty function to avoid quantile crossing.
We establish non-asymptotic upper bounds for the excess risk of the proposed nonparametric quantile regression function estimators.
Numerical experiments including simulation studies and a real data example are conducted to demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-10-18T20:59:48Z)
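A minimal sketch of the non-crossing idea above, with linear quantile models standing in for the paper's deep networks (a hypothetical example): two pinball losses plus a ReLU penalty on $q_{0.1}(x) - q_{0.9}(x)$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Heteroscedastic synthetic data (hypothetical example).
n = 2000
x = rng.uniform(-1.0, 1.0, size=n)
y = x + (0.5 + 0.5 * x) * rng.normal(size=n)

X = np.column_stack([np.ones(n), x])      # linear features in place of a deep net
taus = (0.1, 0.9)
W = np.zeros((2, 2))                      # one weight row per quantile level
lam, lr = 10.0, 0.05

def pinball_grad(u, tau):
    # Subgradient of the pinball (check) loss rho_tau(u) = u * (tau - 1[u < 0]).
    return np.where(u >= 0, tau, tau - 1.0)

for _ in range(2000):
    q = X @ W.T                           # predicted quantiles, shape (n, 2)
    g = np.zeros_like(W)
    for k, tau in enumerate(taus):
        g[k] = -(pinball_grad(y - q[:, k], tau) @ X) / n
    # ReLU non-crossing penalty: active wherever the lower quantile
    # exceeds the upper one.
    cross = (q[:, 0] > q[:, 1]).astype(float)
    g[0] += lam * (cross @ X) / n
    g[1] -= lam * (cross @ X) / n
    W -= lr * g

print("crossings on the training inputs:", int(np.sum(X @ W[0] > X @ W[1])))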
- Off-policy estimation of linear functionals: Non-asymptotic theory for semi-parametric efficiency [59.48096489854697]
The problem of estimating a linear functional based on observational data is canonical in both the causal inference and bandit literatures.
We prove non-asymptotic upper bounds on the mean-squared error of such procedures.
We establish its instance-dependent optimality in finite samples via matching non-asymptotic local minimax lower bounds.
arXiv Detail & Related papers (2022-09-26T23:50:55Z)
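A minimal sketch of the canonical example, the mean outcome under treatment, via inverse-propensity weighting and its augmented variant (a hypothetical setup with known propensity score; the paper analyzes the finite-sample optimality of more refined procedures):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical observational data; target functional: E[Y(1)], the mean
# outcome under treatment, with a known propensity score e(X) = P(T=1|X).
n = 5000
X = rng.normal(size=n)
e = 1.0 / (1.0 + np.exp(-X))
T = rng.binomial(1, e)
Y = 1.0 + X + T + rng.normal(size=n)      # so E[Y(1)] = 2

ipw = np.mean(T * Y / e)                  # inverse-propensity weighting
mu1 = 2.0 + X                             # outcome model E[Y | T=1, X] (here exact)
aipw = np.mean(mu1 + T * (Y - mu1) / e)   # augmented IPW: lower variance
print(f"IPW = {ipw:.3f}   AIPW = {aipw:.3f}   (truth = 2)")
```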
- Divergence Frontiers for Generative Models: Sample Complexity, Quantization Level, and Frontier Integral [58.434753643798224]
Divergence frontiers have been proposed as an evaluation framework for generative models.
We establish non-asymptotic bounds on the sample complexity of the plug-in estimator of divergence frontiers.
We also augment the divergence frontier framework by investigating the statistical performance of smoothed distribution estimators.
arXiv Detail & Related papers (2021-06-15T06:26:25Z)
- Bayesian parameter estimation using Gaussian states and measurements [0.0]
We consider three paradigmatic estimation schemes in continuous-variable quantum metrology.
We investigate the precision achievable with single-mode Gaussian states under homodyne and heterodyne detection.
This allows us to identify Bayesian estimation strategies that combine good performance with the potential for straightforward experimental realization.
arXiv Detail & Related papers (2020-09-08T12:54:12Z)
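A minimal classical sketch of the Bayesian side of the scheme above (assumptions: each homodyne outcome is Gaussian around the displacement parameter with vacuum variance 1/2, and the prior is Gaussian; this is not the paper's quantum calculation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumptions (not the paper's calculation): each homodyne outcome is Gaussian
# around the displacement theta with vacuum variance 1/2; Gaussian prior.
theta_true, var_meas = 0.7, 0.5
mu, var = 0.0, 4.0                         # prior N(mu, var)

for shot in rng.normal(theta_true, np.sqrt(var_meas), size=50):
    # Conjugate Gaussian update: precisions add, means combine by precision.
    prec = 1.0 / var + 1.0 / var_meas
    mu = (mu / var + shot / var_meas) / prec
    var = 1.0 / prec

print(f"posterior mean = {mu:.3f}, posterior standard deviation = {np.sqrt(var):.3f}")
```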
- Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on the k nearest neighbor distances between these samples.
arXiv Detail & Related papers (2020-02-26T16:37:37Z)
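A minimal sketch of the k-nearest-neighbor estimator mentioned above, in the style of Wang, Kulkarni, and Verdú (assumed form; variants differ in their bias corrections):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(6)

def knn_kl(x, y, k=5):
    """k-NN estimate of KL(p||q) from samples x ~ p and y ~ q
    (assumed Wang-Kulkarni-Verdu form; variants differ in bias corrections)."""
    n, d = x.shape
    m = y.shape[0]
    # k-th nearest-neighbor distance within x, excluding each point itself.
    rho = cKDTree(x).query(x, k + 1)[0][:, -1]
    # k-th nearest-neighbor distance from each x-sample to the y samples.
    nu = cKDTree(y).query(x, k)[0]
    nu = nu[:, -1] if nu.ndim == 2 else nu
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Two unit-variance Gaussians: KL(N(0,1) || N(1,1)) = 0.5 exactly.
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(1.0, 1.0, size=(5000, 1))
print(f"kNN estimate = {knn_kl(x, y):.3f}   (true value 0.5)")
```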
This list is automatically generated from the titles and abstracts of the papers on this site.