Machine Learning Based Parameter Estimation of Gaussian Quantum States
- URL: http://arxiv.org/abs/2108.06061v1
- Date: Fri, 13 Aug 2021 04:59:16 GMT
- Title: Machine Learning Based Parameter Estimation of Gaussian Quantum States
- Authors: Neel Kanth Kundu, Matthew R. McKay, and Ranjan K. Mallik
- Abstract summary: We propose a machine learning framework for parameter estimation of single mode Gaussian quantum states.
Under a Bayesian framework, our approach estimates parameters of suitable prior distributions from measured data.
- Score: 14.85374185122389
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a machine learning framework for parameter estimation of single
mode Gaussian quantum states. Under a Bayesian framework, our approach
estimates parameters of suitable prior distributions from measured data. For
phase-space displacement and squeezing parameter estimation, this is achieved
by introducing Expectation-Maximization (EM) based algorithms, while for phase
parameter estimation an empirical Bayes method is applied. The estimated prior
distribution parameters along with the observed data are used for finding the
optimal Bayesian estimate of the unknown displacement, squeezing and phase
parameters. Our simulation results show that the proposed algorithms have
estimation performance that is very close to that of Genie-Aided Bayesian
estimators, which assume perfect knowledge of the prior parameters. Our proposed
methods can be utilized by experimentalists to find the optimal Bayesian
estimate of parameters of Gaussian quantum states by using only the observed
measurements without requiring any knowledge about the prior distribution
parameters.
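The EM idea in the abstract can be illustrated on a toy Gaussian hierarchical model (an illustrative sketch, not the paper's exact algorithm): each unknown parameter theta_i is drawn from a normal prior N(mu, tau^2) and measured with known Gaussian noise; EM alternates a posterior (E) step with a hyperparameter-update (M) step, and the fitted prior then yields the Bayesian (posterior-mean) estimate from the observations alone.

```python
import numpy as np

def em_prior_estimation(x, sigma2, iters=200):
    """EM for prior hyperparameters (mu, tau2) in the model
    theta_i ~ N(mu, tau2),  x_i | theta_i ~ N(theta_i, sigma2)."""
    mu, tau2 = np.mean(x), max(np.var(x) - sigma2, 1e-6)
    for _ in range(iters):
        # E-step: posterior of each theta_i given x_i
        v = tau2 * sigma2 / (tau2 + sigma2)
        m = (tau2 * x + sigma2 * mu) / (tau2 + sigma2)
        # M-step: re-estimate the prior hyperparameters
        mu = m.mean()
        tau2 = max(np.mean((m - mu) ** 2) + v, 1e-12)
    return mu, tau2

rng = np.random.default_rng(0)
theta = rng.normal(1.0, 0.5, size=5000)       # true prior N(1, 0.25)
x = theta + rng.normal(0.0, 0.3, size=5000)   # noisy measurements
mu_hat, tau2_hat = em_prior_estimation(x, sigma2=0.3**2)
# Bayesian (posterior-mean) estimate of each theta_i using the fitted prior
theta_hat = (tau2_hat * x + 0.3**2 * mu_hat) / (tau2_hat + 0.3**2)
```

With no knowledge of the prior, the recovered hyperparameters converge to the values (mu = 1, tau^2 = 0.25) that generated the data, mirroring how the proposed estimators approach Genie-Aided performance.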
Related papers
- Efficient Learning of POMDPs with Known Observation Model in Average-Reward Setting [56.92178753201331]
We propose the Observation-Aware Spectral (OAS) estimation technique, which enables the POMDP parameters to be learned from samples collected using a belief-based policy.
We show the consistency of the OAS procedure, and we prove a regret guarantee of order $\mathcal{O}(\sqrt{T}\log(T))$ for the proposed OAS-UCRL algorithm.
arXiv Detail & Related papers (2024-10-02T08:46:34Z) - Scaling Exponents Across Parameterizations and Optimizers [94.54718325264218]
We propose a new perspective on parameterization by investigating a key assumption in prior work.
Our empirical investigation includes tens of thousands of models trained with all combinations of the parameterizations and optimizers under study.
We find that the best learning rate scaling prescription would often have been excluded by the assumptions in prior work.
arXiv Detail & Related papers (2024-07-08T12:32:51Z) - Parameter Estimation in Quantum Metrology Technique for Time Series Prediction [0.0]
The paper investigates the techniques of quantum computation in metrological predictions.
It focuses on enhancing prediction potential through variational parameter estimation.
The impacts of various parameter distributions and learning rates on predictive accuracy are investigated.
arXiv Detail & Related papers (2024-06-12T05:55:45Z) - QestOptPOVM: An iterative algorithm to find optimal measurements for quantum parameter estimation [17.305295658536828]
We introduce an algorithm, termed QestOptPOVM, designed to directly identify optimal positive operator-valued measures (POVMs).
Through rigorous testing on several examples for multiple copies of qubit states (up to six copies), we demonstrate the efficiency and accuracy of our proposed algorithm.
Our algorithm functions as a tool for elucidating the explicit forms of optimal POVMs, thereby enhancing our understanding of quantum parameter estimation methodologies.
arXiv Detail & Related papers (2024-03-29T11:46:09Z) - Finding the optimal probe state for multiparameter quantum metrology using conic programming [61.98670278625053]
We present a conic programming framework that allows us to determine the optimal probe state for the corresponding precision bounds.
We also apply our theory to analyze the canonical field sensing problem using entangled quantum probe states.
arXiv Detail & Related papers (2024-01-11T12:47:29Z) - Prediction-Oriented Bayesian Active Learning [51.426960808684655]
Expected predictive information gain (EPIG) is an acquisition function that measures information gain in the space of predictions rather than parameters.
EPIG leads to stronger predictive performance compared with BALD across a range of datasets and models.
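A minimal sketch of the EPIG quantity for a discrete parameter grid with binary labels: for one candidate input x and one target input x*, EPIG is the mutual information between the label y at x and the prediction y* at x*. (Full EPIG additionally averages over a distribution of target inputs; the function and argument names here are illustrative.)

```python
import numpy as np

def epig(p_theta, py_x, py_xstar):
    """EPIG for one candidate and one target input on a discrete
    parameter grid: I(y; y*) under the current belief p_theta.
    py_x[k]     = p(y=1  | x,  theta_k)
    py_xstar[k] = p(y*=1 | x*, theta_k)"""
    def H(p):  # binary entropy in nats
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(p * np.log(p) + (1 - p) * np.log(1 - p))
    # Joint predictive p(y, y*), marginalizing over theta
    joint = np.zeros((2, 2))
    for y in (0, 1):
        for ys in (0, 1):
            py = py_x if y else 1 - py_x
            pys = py_xstar if ys else 1 - py_xstar
            joint[y, ys] = np.sum(p_theta * py * pys)
    p_ystar = joint.sum(axis=0)[1]   # marginal p(y* = 1)
    p_y = joint.sum(axis=1)[1]       # marginal p(y  = 1)
    # I(y; y*) = H(y*) - E_y[H(y* | y)]
    cond1 = joint[1, 1] / joint[1].sum()   # p(y*=1 | y=1)
    cond0 = joint[0, 1] / joint[0].sum()   # p(y*=1 | y=0)
    return H(p_ystar) - (p_y * H(cond1) + (1 - p_y) * H(cond0))

# Two equally likely hypotheses; the candidate label identifies the
# hypothesis exactly, which is highly informative about the prediction.
mi = epig(np.array([0.5, 0.5]), np.array([1.0, 0.0]), np.array([0.9, 0.1]))
```

A candidate whose label is uninformative about the prediction (e.g. `py_x = [0.5, 0.5]`) scores zero, even if it would carry information about the parameters, which is exactly how EPIG differs from BALD.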
arXiv Detail & Related papers (2023-04-17T10:59:57Z) - Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediate distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
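For context, plain AIS with a fixed geometric path and Metropolis transitions can be sketched as below (a minimal numpy sketch on a toy one-dimensional target; the paper's contribution is to parameterize and optimize these bridging distributions rather than fixing them as done here):

```python
import numpy as np

def ais_logZ(log_f, n_chains=2000, n_steps=50, step=0.5, seed=0):
    """AIS estimate of log Z = log ∫ f(x) dx, annealing from a standard
    normal base to the unnormalized target f along a geometric path."""
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.normal(size=n_chains)  # exact samples from the base N(0, 1)
    log_base = lambda z: -0.5 * z**2 - 0.5 * np.log(2 * np.pi)
    log_w = np.zeros(n_chains)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Importance-weight increment for moving from level b0 to b1
        log_w += (b1 - b0) * (log_f(x) - log_base(x))
        # One Metropolis step targeting p_b1 ∝ base^(1-b1) * f^b1
        logp = lambda z: (1 - b1) * log_base(z) + b1 * log_f(z)
        prop = x + step * rng.normal(size=n_chains)
        accept = np.log(rng.uniform(size=n_chains)) < logp(prop) - logp(x)
        x = np.where(accept, prop, x)
    # Numerically stable log-mean-exp of the chain weights
    return np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()

# Toy target: unnormalized N(2, 1) density, true log Z = log sqrt(2*pi)
logZ = ais_logZ(lambda z: -0.5 * (z - 2.0) ** 2)
```

Shrinking `n_steps` widens the gaps between bridging distributions and inflates the variance of `log_w`, which is the trade-off the optimized intermediate distributions are designed to mitigate.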
arXiv Detail & Related papers (2022-09-27T07:58:25Z) - Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are imposed through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - Bayesian parameter estimation using Gaussian states and measurements [0.0]
We consider three paradigmatic estimation schemes in continuous-variable quantum metrology.
We investigate the precision achievable with single-mode Gaussian states under homodyne and heterodyne detection.
This allows us to identify Bayesian estimation strategies that combine good performance with the potential for straightforward experimental realization.
arXiv Detail & Related papers (2020-09-08T12:54:12Z) - A machine learning approach to Bayesian parameter estimation [0.0]
We formulate parameter estimation as a classification task and use artificial neural networks to efficiently perform Bayesian estimation.
We show that the network's posterior distribution is centered at the true (unknown) value of the parameter within an uncertainty given by the inverse Fisher information.
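The claim that the posterior width matches the inverse Fisher information can be checked on a toy Gaussian-mean model (illustrative only, not the paper's neural-network setup): for n samples from N(theta, sigma^2), the per-sample Fisher information is I(theta) = 1/sigma^2, so the spread of the estimates should approach 1/sqrt(n * I(theta)).

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n, theta_true, trials = 0.5, 200, 0.7, 2000
# Per-sample Fisher information for the mean of N(theta, sigma^2)
fisher = 1.0 / sigma**2

# Repeat the experiment: each trial's estimator is the sample mean,
# which is also the posterior mean under a flat prior
estimates = rng.normal(theta_true, sigma, size=(trials, n)).mean(axis=1)

empirical_std = estimates.std()                # observed spread
fisher_bound = 1.0 / np.sqrt(n * fisher)       # 1 / sqrt(n * I(theta))
```

The empirical spread of the estimates matches the inverse-Fisher prediction, the same scaling reported for the network's posterior in the paper above.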
arXiv Detail & Related papers (2020-06-03T16:33:21Z) - Estimating Basis Functions in Massive Fields under the Spatial Mixed Effects Model [8.528384027684194]
For massive datasets, fixed rank kriging using the Expectation-Maximization (EM) algorithm for estimation has been proposed as an alternative to the usual but computationally prohibitive kriging method.
We develop an alternative method that utilizes the Spatial Mixed Effects (SME) model, but allows for additional flexibility by estimating the range of the spatial dependence between the observations and the knots via an Alternating Expectation Conditional Maximization (AECM) algorithm.
Experiments show that our methodology improves estimation without sacrificing prediction accuracy while also minimizing the additional computational burden of extra parameter estimation.
arXiv Detail & Related papers (2020-03-12T19:36:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.