QMetro++ -- Python optimization package for large scale quantum metrology with customized strategy structures
- URL: http://arxiv.org/abs/2506.16524v1
- Date: Thu, 19 Jun 2025 18:13:22 GMT
- Title: QMetro++ -- Python optimization package for large scale quantum metrology with customized strategy structures
- Authors: Piotr Dulian, Stanisław Kurdziałek, Rafał Demkowicz-Dobrzański,
- Abstract summary: QMetro++ is a Python package dedicated to identifying optimal estimation protocols. The package comes with an implementation of the recently developed methods for computing fundamental upper bounds on QFI.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: QMetro++ is a Python package containing a set of tools dedicated to identifying optimal estimation protocols that maximize quantum Fisher information (QFI). Optimization can be performed for an arbitrary arrangement of input states, parameter encoding channels, noise correlations, control operations and measurements. The use of tensor networks and an iterative see-saw algorithm allows for efficient optimization even in the regime of a large number of channel uses ($N\approx100$). Additionally, the package comes with an implementation of the recently developed methods for computing fundamental upper bounds on QFI, which serve as benchmarks of optimality for the outcomes of numerical optimization. All functionalities are wrapped up in a user-friendly interface which enables defining strategies at various levels of detail.
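The quantity QMetro++ maximizes has a simple closed form for pure probe states under unitary parameter encoding $e^{-i\theta H}$: the QFI equals four times the variance of the generator, $F = 4\,\mathrm{Var}_\psi(H)$. The following is a minimal, package-independent sketch in plain NumPy (not the QMetro++ API) comparing a product probe with a GHZ probe for $N$-qubit phase estimation, which recovers the standard quantum limit $F=N$ and the Heisenberg limit $F=N^2$:

```python
import numpy as np
from functools import reduce

def qfi_pure(state, H):
    """QFI of a pure state |psi> under the encoding exp(-i*theta*H):
    F = 4 * (<H^2> - <H>^2)."""
    mean = np.vdot(state, H @ state).real
    mean_sq = np.vdot(state, H @ (H @ state)).real
    return 4.0 * (mean_sq - mean ** 2)

N = 4
sz = np.diag([0.5, -0.5])          # per-qubit generator: sigma_z / 2
I2 = np.eye(2)

def embed(op, site, n):
    """Place a single-qubit operator at position `site` in an n-qubit register."""
    return reduce(np.kron, [op if k == site else I2 for k in range(n)])

H = sum(embed(sz, j, N) for j in range(N))   # total phase generator

plus = np.array([1.0, 1.0]) / np.sqrt(2)
product = reduce(np.kron, [plus] * N)        # separable probe |+>^N
ghz = np.zeros(2 ** N)
ghz[0] = ghz[-1] = 1.0 / np.sqrt(2)          # (|0...0> + |1...1>)/sqrt(2)

print(qfi_pure(product, H))  # N   -> standard quantum limit
print(qfi_pure(ghz, H))      # N^2 -> Heisenberg limit
```

This covers only noiseless unitary encoding of a pure probe; the package's actual scope (noisy channels, adaptive controls, $N\approx100$ channel uses via tensor networks) goes well beyond this formula.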
Related papers
- Global Optimization of Gaussian Process Acquisition Functions Using a Piecewise-Linear Kernel Approximation [2.3342885570554652]
We introduce a piecewise-linear approximation for Gaussian process kernels and a corresponding MIQP representation for acquisition functions.
We empirically demonstrate the framework on synthetic functions, constrained benchmarks, and hyperparameter tuning tasks.
arXiv Detail & Related papers (2024-10-22T10:56:52Z) - Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes algorithms for distributed federated learning.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - NUBO: A Transparent Python Package for Bayesian Optimization [0.0]
NUBO is a framework for optimizing black-box functions, such as physical experiments and computer simulators.
It focuses on transparency and user experience to make Bayesian optimization accessible to researchers from all disciplines.
NUBO is written in Python but does not require expert knowledge of Python to optimize simulators and experiments.
arXiv Detail & Related papers (2023-05-11T10:34:27Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
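Zeroth-order methods like ZO-signGD query only function values, never analytic gradients. A minimal illustrative sketch (not the paper's code): estimate each gradient coordinate with a central finite difference, then step using only the sign of the estimate. The objective here is a hypothetical stand-in for a molecular score:

```python
import numpy as np

def zo_sign_gd(f, x0, step=0.05, mu=1e-3, iters=200):
    """Zeroth-order sign-based gradient descent (illustrative sketch).

    Estimates each gradient coordinate with a central finite difference
    of width mu, then updates with the sign of the estimate only.
    """
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = mu
            g[i] = (f(x + e) - f(x - e)) / (2 * mu)
        x -= step * np.sign(g)
    return x

# Toy black-box objective standing in for a molecular score:
# squared distance to an (assumed) target point.
target = np.array([1.0, -2.0, 0.5])
f = lambda x: np.sum((x - target) ** 2)
x_opt = zo_sign_gd(f, np.zeros(3))
print(x_opt)  # lands within one sign-step of the target
```

Because the update uses only the sign, the iterate oscillates within one step size of the minimizer rather than converging exactly; in practice the step size is decayed, and random-direction gradient estimators replace the per-coordinate differences when queries are expensive.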
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - QuanEstimation: An open-source toolkit for quantum parameter estimation [4.648493096183626]
We present a Python-Julia-based open-source toolkit for quantum parameter estimation.
It includes many well-used mathematical bounds and optimization methods.
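The bounds such toolkits implement revolve around the Cramér-Rao inequality, $\mathrm{Var}(\hat\theta) \ge 1/(\nu F(\theta))$ for $\nu$ repetitions. A minimal classical sketch (plain NumPy, not QuanEstimation's API): the Fisher information of a Bernoulli trial is $F(p) = 1/(p(1-p))$, and the maximum-likelihood estimator $\hat p = k/\nu$ saturates the bound:

```python
import numpy as np

def fisher_bernoulli(p):
    """Classical Fisher information of a single Bernoulli(p) sample."""
    return 1.0 / (p * (1.0 - p))

p, nu = 0.3, 1000          # true parameter, samples per experiment
rng = np.random.default_rng(0)
# 200k repeated experiments; MLE is p_hat = (successes) / nu
p_hats = rng.binomial(nu, p, size=200_000) / nu
crb = 1.0 / (nu * fisher_bernoulli(p))     # Cramer-Rao lower bound
print(p_hats.var(), crb)   # both ~ p(1-p)/nu: the MLE saturates the bound
```

The quantum version replaces $F$ with the QFI, which is exactly the quantity QMetro++ optimizes over strategy structures.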
arXiv Detail & Related papers (2022-05-31T08:02:51Z) - Non-Convex Optimization with Certificates and Fast Rates Through Kernel Sums of Squares [68.8204255655161]
We consider potentially non-convex optimization problems.
In this paper, we propose an algorithm that achieves close to optimal a priori computational guarantees.
arXiv Detail & Related papers (2022-04-11T09:37:04Z) - An Optimization Framework for Federated Edge Learning [11.007444733506714]
This paper considers an edge computing system in which the server and workers have possibly different computing and communication capabilities.
We first present a general FL algorithm, namely GenQSGD, parameterized by the numbers of global and local iterations, mini-batch size, and step size sequence.
arXiv Detail & Related papers (2021-11-26T14:47:32Z) - Evolutionary Ensemble Learning for Multivariate Time Series Prediction [6.736731623634526]
A typical pipeline of building an MTS prediction model (PM) consists of selecting a subset of channels among all available ones.
We propose a novel evolutionary ensemble learning framework to optimize the entire pipeline in a holistic manner.
We implement the proposed framework and evaluate our implementation on two real-world applications, i.e., electricity consumption prediction and air quality prediction.
arXiv Detail & Related papers (2021-08-22T07:36:25Z) - Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools for maximizing the use of Noisy Intermediate-Scale Quantum (NISQ) devices.
We propose a strategy for optimizing such ansatze in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.