Parameter estimation with uncertainty quantification from continuous measurement data using neural network ensembles
- URL: http://arxiv.org/abs/2509.10756v1
- Date: Fri, 12 Sep 2025 23:58:44 GMT
- Title: Parameter estimation with uncertainty quantification from continuous measurement data using neural network ensembles
- Authors: Amanuel Anteneh
- Abstract summary: Ensembles of deep neural networks, called deep ensembles, can be used to perform quantum parameter estimation. We show that much less data is needed to achieve comparable performance to Bayesian inference based estimation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We show that ensembles of deep neural networks, called deep ensembles, can be used to perform quantum parameter estimation while also providing a means for quantifying uncertainty in parameter estimates, which is a key advantage of using Bayesian inference for parameter estimation. These models are shown to be more robust to noise in the measurement results used to perform the parameter estimation, as well as to noise in the data used to train them. We also show that comparable performance to Bayesian inference based estimation, which is known to reach the ultimate precision limit as more data is collected, can be achieved with much less data than was used in previous proposals.
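The core idea of the abstract can be illustrated with a minimal sketch of how a deep ensemble turns several independently trained estimators into a point estimate with an uncertainty. The models below are hypothetical stand-ins (simple linear estimators with randomly perturbed weights, not the paper's networks); only the aggregation step, taking the ensemble mean as the estimate and the spread as the uncertainty proxy, reflects the technique described.

```python
import random
import statistics

# Hypothetical stand-ins for M independently trained networks: each "model"
# maps a measurement summary to a parameter estimate. The weights w, b differ
# per member, mimicking training from different random initializations.
random.seed(0)
M = 10
true_theta = 0.7
models = [
    lambda x, w=random.gauss(1.0, 0.05), b=random.gauss(0.0, 0.02): w * x + b
    for _ in range(M)
]

measurement = true_theta  # an idealized noiseless measurement summary

# Deep-ensemble aggregation: each member gives a point estimate; the
# ensemble mean is the parameter estimate and the member-to-member spread
# serves as an uncertainty quantifier.
estimates = [m(measurement) for m in models]
theta_hat = statistics.mean(estimates)   # ensemble point estimate
theta_std = statistics.stdev(estimates)  # uncertainty proxy
```

In practice each member would be a trained neural network and the members may also output per-model variances, but the mean-and-spread aggregation shown here is the basic mechanism by which an ensemble supplies uncertainty alongside the estimate.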
Related papers
- Fast Likelihood-Free Parameter Estimation for Lévy Processes [0.0]
We propose a fast and accurate method for Lévy parameter estimation using the neural Bayes estimation framework. We show that NBE results in consistent estimators whose risk converges to the Bayes estimator under mild conditions. We also investigate nearly a decade of high-frequency Bitcoin returns, requiring less than one minute to estimate parameters under the proposed approach.
arXiv Detail & Related papers (2025-05-03T00:37:58Z) - In-Context Parametric Inference: Point or Distribution Estimators? [66.22308335324239]
Our experiments indicate that amortized point estimators generally outperform posterior inference, though the latter remains competitive in some low-dimensional problems.
arXiv Detail & Related papers (2025-02-17T10:00:24Z) - Fundamental bounds for parameter estimation with few measurements [0.0]
We discuss different linear (Barankin-like) conditions that can be imposed on estimators and analyze when these conditions admit an optimal estimator with finite variance.
We show that, if the number of imposed conditions is larger than the number of measurement outcomes, there generally does not exist a corresponding estimator with finite variance.
We derive an extended Cramér-Rao bound that is compatible with a finite variance in situations where the Barankin bound is undefined.
arXiv Detail & Related papers (2024-02-22T12:40:08Z) - Tractable Function-Space Variational Inference in Bayesian Neural
Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z) - Fisher information susceptibility for multiparameter quantum estimation [0.23436632098950458]
Noise affects the performance of quantum technologies, hence the importance of elaborating operative figures of merit.
In quantum metrology, the introduction of the Fisher information measurement noise susceptibility now makes it possible to quantify the robustness of a measurement.
We provide its mathematical definition in the form of a semidefinite program.
arXiv Detail & Related papers (2023-12-04T16:54:01Z) - Parameter estimation by learning quantum correlations in continuous
photon-counting data using neural networks [0.21990652930491852]
We present an inference method utilizing artificial neural networks for parameter estimation of a quantum probe monitored through a single measurement.
We benchmark the precision of this method against Bayesian inference, which is optimal in the sense of information retrieval.
This approach offers a promising and computationally efficient tool for quantum parameter estimation with photon-counting data, relevant for applications such as quantum sensing or quantum imaging.
arXiv Detail & Related papers (2023-10-03T18:00:02Z) - Homodyned K-distribution: parameter estimation and uncertainty
quantification using Bayesian neural networks [2.599882743586164]
The parameters of the homodyned K-distribution (HK-distribution) are the speckle statistics that can model the envelope data in diverse scattering conditions.
We propose a Bayesian Neural Network (BNN) to estimate the parameters of HK-distribution and quantify the uncertainty of the estimator.
arXiv Detail & Related papers (2022-10-31T22:38:33Z) - Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble based methods and generative model based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z) - Machine Learning Based Parameter Estimation of Gaussian Quantum States [14.85374185122389]
We propose a machine learning framework for parameter estimation of single mode Gaussian quantum states.
Under a Bayesian framework, our approach estimates parameters of suitable prior distributions from measured data.
arXiv Detail & Related papers (2021-08-13T04:59:16Z) - Real-time gravitational-wave science with neural posterior estimation [64.67121167063696]
We demonstrate unprecedented accuracy for rapid gravitational-wave parameter estimation with deep learning.
We analyze eight gravitational-wave events from the first LIGO-Virgo Gravitational-Wave Transient Catalog.
We find very close quantitative agreement with standard inference codes, but with inference times reduced from O(day) to a minute per event.
arXiv Detail & Related papers (2021-06-23T18:00:05Z) - Sampling-free Variational Inference for Neural Networks with
Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z) - Orthogonal Statistical Learning [49.55515683387805]
We provide non-asymptotic excess risk guarantees for statistical learning in a setting where the population risk depends on an unknown nuisance parameter.
We show that if the population risk satisfies a condition called Neyman orthogonality, the impact of the nuisance estimation error on the excess risk bound achieved by the meta-algorithm is of second order.
arXiv Detail & Related papers (2019-01-25T02:21:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.