Probabilistic Graybox Characterization of Quantum Devices with Bayesian Neural Networks
- URL: http://arxiv.org/abs/2509.24232v1
- Date: Mon, 29 Sep 2025 03:16:50 GMT
- Title: Probabilistic Graybox Characterization of Quantum Devices with Bayesian Neural Networks
- Authors: Poramet Pathumsoot, Michal Hajdušek, Rodney Van Meter
- Abstract summary: While the Graybox characterization method allows for implicit noise models and is platform-agnostic, it lacks uncertainty quantification. We develop a probabilistic Graybox characterization model using probabilistic machine learning, specifically Bayesian Neural Networks, and utilize binary measurement outcomes directly for inference. Our proposed probabilistic Graybox model outperforms the original model by up to 1.9 times in capturing the distribution of observed data.
- Score: 0.15293427903448018
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Characterization of quantum devices is a crucial process that enables researchers to gain insight from experimental settings. Graybox characterization combines known system dynamics with unknown transformations, where the latter are modeled using machine learning. While the Graybox method allows for implicit noise models and is platform-agnostic, it lacks uncertainty quantification. Prediction uncertainty helps researchers make informed decisions and allows valuable insights to be drawn from the devices without overconfidence. We therefore develop a probabilistic Graybox characterization model using probabilistic machine learning, specifically Bayesian Neural Networks, and utilize binary measurement outcomes directly for inference. With stochastic noise in a quantum device, we analyze statistical properties of the measurement data. Our results show that the model's prediction performance depends solely on its ability to capture the expected value of the true expectation value. Our proposed probabilistic Graybox model outperforms the original model by up to 1.9 times in capturing the distribution of observed data. We expect that our results will serve as an additional tool for characterizing quantum devices with uncertainty estimation, as they provide a flexible choice that can be utilized even without extensive prior knowledge of the noise model of the devices.
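A minimal NumPy sketch of the kind of uncertainty-aware prediction the abstract describes, using Monte Carlo dropout as a common stand-in for a full Bayesian Neural Network. The network sizes, dropout rate, and the `forward`/`predict_with_uncertainty` names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "graybox" blackbox component: a tiny network that maps a control
# parameter to a Pauli expectation value in [-1, 1]. Weights are random
# placeholders; a real model would be trained on measurement outcomes.
W1 = rng.normal(scale=0.5, size=(8, 1))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(1, 8))
DROP_P = 0.2  # dropout kept ON at inference -> Monte Carlo posterior samples

def forward(theta, rng):
    h = np.tanh(W1 @ np.atleast_1d(theta) + b1)
    mask = rng.random(h.shape) > DROP_P          # stochastic dropout mask
    h = h * mask / (1.0 - DROP_P)                # inverted-dropout scaling
    return float(np.tanh(W2 @ h))                # expectation value in [-1, 1]

def predict_with_uncertainty(theta, n_samples=200, seed=1):
    rng = np.random.default_rng(seed)
    samples = np.array([forward(theta, rng) for _ in range(n_samples)])
    return samples.mean(), samples.std()         # predictive mean and spread

mean, std = predict_with_uncertainty(0.3)
```

Each stochastic forward pass plays the role of one posterior weight sample; the spread across passes is the model's predictive uncertainty.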
Related papers
- Uncertainty Quantification in Probabilistic Machine Learning Models: Theory, Methods, and Insights [24.70625174929573]
Uncertainty Quantification (UQ) is essential for assessing the reliability of predictions. We present a systematic framework for estimating both epistemic and aleatoric uncertainty in probabilistic models. We derive a theoretical formulation for UQ, propose a Monte Carlo sampling-based estimation method, and conduct experiments to evaluate the impact of uncertainty estimation.
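The epistemic/aleatoric split via Monte Carlo sampling can be illustrated with the standard entropy decomposition (total = aleatoric + epistemic); this is a generic sketch of the technique, not the paper's exact estimator:

```python
import numpy as np

def entropy(p, axis=-1):
    # Shannon entropy in nats; the clip avoids log(0).
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=axis)

def decompose_uncertainty(probs):
    """probs: (n_samples, n_classes) predictive distributions, one per
    posterior sample (e.g. MC dropout passes or ensemble members)."""
    total = entropy(probs.mean(axis=0))          # H[E_w p(y|x,w)]
    aleatoric = entropy(probs, axis=-1).mean()   # E_w H[p(y|x,w)]
    epistemic = total - aleatoric                # mutual information >= 0
    return total, aleatoric, epistemic

# Posterior samples that disagree -> large epistemic uncertainty.
disagree = np.array([[0.9, 0.1], [0.1, 0.9]])
t, a, e = decompose_uncertainty(disagree)
```

When the sampled distributions agree, the epistemic term vanishes and only aleatoric (data) uncertainty remains.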
arXiv Detail & Related papers (2025-09-07T00:38:33Z)
- Graybox characterization and calibration with finite-shot estimation on superconducting-qubit experiments [0.44998333629984877]
We describe an explicit (whitebox) model describing the known dynamics and an implicit (blackbox) model describing the noisy dynamics in the form of a deep neural network. By sending a set of selected pulses to the devices and measuring Pauli expectation values, the Graybox approach can train the implicit model and optimize gates. We benchmark our optimized gates on the devices and cross-test predictive models with two types of loss functions.
arXiv Detail & Related papers (2025-08-18T11:04:48Z)
- Interpretable representation learning of quantum data enabled by probabilistic variational autoencoders [0.5999777817331317]
Variational autoencoders (VAEs) have shown promise in extracting the hidden physical features of some input data. When dealing with quantum data, VAEs must account for its intrinsic randomness and complex correlations. Here, we demonstrate that two key modifications enable VAEs to learn physically meaningful latent representations.
arXiv Detail & Related papers (2025-06-13T17:39:41Z)
- Uncertainty Quantification for Transformer Models for Dark-Pattern Detection [0.21427777919040417]
This study focuses on the detection of dark patterns: deceptive design choices that manipulate user decisions, undermining autonomy and consent. We propose a differential fine-tuning approach implemented at the final classification head via uncertainty quantification with transformer-based pre-trained models.
arXiv Detail & Related papers (2024-12-06T18:31:51Z)
- Gaussian Mixture Models for Affordance Learning using Bayesian Networks [50.18477618198277]
Affordances are fundamental descriptors of relationships between actions, objects and effects.
This paper approaches the problem of an embodied agent exploring the world and learning these affordances autonomously from its sensory experiences.
arXiv Detail & Related papers (2024-02-08T22:05:45Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Quantum Conformal Prediction for Reliable Uncertainty Quantification in Quantum Machine Learning [47.991114317813555]
Quantum models implement implicit probabilistic predictors that produce multiple random decisions for each input through measurement shots.
This paper proposes to leverage such randomness to define prediction sets for both classification and regression that provably capture the uncertainty of the model.
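Split conformal prediction, the generic recipe behind such provably calibrated prediction sets, can be sketched as follows; the toy Dirichlet "probabilities", labels, and function names are illustrative assumptions, not this paper's quantum-specific construction:

```python
import numpy as np

def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    """Split conformal: score = 1 - model probability of the true class.
    Returns the finite-sample-corrected (1 - alpha) quantile of scores."""
    scores = 1.0 - cal_probs[np.arange(len(cal_labels)), cal_labels]
    n = len(scores)
    q = np.ceil((n + 1) * (1 - alpha)) / n       # finite-sample correction
    return np.quantile(scores, min(q, 1.0))

def prediction_set(test_probs, qhat):
    # Keep every class whose score falls below the calibrated threshold.
    return [np.where(1.0 - p <= qhat)[0].tolist() for p in test_probs]

rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(3), size=100)   # stand-in model outputs
cal_labels = cal_probs.argmax(axis=1)             # toy labels for the sketch
qhat = conformal_threshold(cal_probs, cal_labels)
sets = prediction_set(rng.dirichlet(np.ones(3), size=5), qhat)
```

The coverage guarantee holds regardless of the underlying model, which is what makes the recipe attractive for noisy quantum predictors.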
arXiv Detail & Related papers (2023-04-06T22:05:21Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show the principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
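The Nadaraya-Watson estimate of the conditional label distribution at the core of this approach can be sketched in a few lines; the Gaussian kernel, bandwidth, and toy data are illustrative choices, not the paper's exact setup:

```python
import numpy as np

def nw_label_distribution(x, X_train, y_onehot, bandwidth=0.5):
    """Nadaraya-Watson estimate of p(y | x): a Gaussian-kernel weighted
    average of the one-hot training labels around the query point."""
    d2 = ((X_train - x) ** 2).sum(axis=1)        # squared distances to x
    w = np.exp(-d2 / (2 * bandwidth ** 2))       # kernel weights
    w = w / w.sum()                              # normalize to a distribution
    return w @ y_onehot                          # estimated class probabilities

X = np.array([[0.0], [0.1], [2.0]])
Y = np.array([[1, 0], [1, 0], [0, 1]])           # two classes, one-hot labels
p = nw_label_distribution(np.array([0.05]), X, Y)
```

Spread in the estimated distribution (and sparse kernel mass far from the training data) is what such nonparametric methods read off as uncertainty.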
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Hessian-based toolbox for reliable and interpretable machine learning in physics [58.720142291102135]
We present a toolbox for interpretability and reliability that is agnostic of the model architecture.
It provides a notion of the influence of the input data on the prediction at a given test point, an estimation of the uncertainty of the model predictions, and an agnostic score for the model predictions.
Our work opens the road to the systematic use of interpretability and reliability methods in ML applied to physics and, more generally, science.
arXiv Detail & Related papers (2021-08-04T16:32:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.