Estimating Trotter Approximation Errors to Optimize Hamiltonian Partitioning for Lower Eigenvalue Errors
- URL: http://arxiv.org/abs/2312.13282v2
- Date: Mon, 1 Jan 2024 20:48:32 GMT
- Title: Estimating Trotter Approximation Errors to Optimize Hamiltonian Partitioning for Lower Eigenvalue Errors
- Authors: Luis A. Martínez-Martínez, Prathami Divakar Kamath and Artur F. Izmaylov
- Abstract summary: Trotter approximation error estimation based on perturbation theory up to a second order in the time-step for eigenvalues provides estimates with very good correlations with the Trotter approximation errors.
The developed perturbative estimates can be used for practical time-step and Hamiltonian partitioning selection protocols.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: One of the ways to encode many-body Hamiltonians on a quantum computer to obtain their eigen-energies through Quantum Phase Estimation is by means of the Trotter approximation. Several approaches have been proposed to assess the quality of this approximation based on estimating the norm of the difference between the exact and approximate evolution operators. Here, we explore how these different error estimates are correlated with each other and whether they can be good predictors of the true Trotter approximation error in finding eigenvalues. For a set of small molecular systems, we calculated the exact Trotter approximation errors of the first-order Trotter formula for the ground-state electronic energies. Comparison of these errors with previously used upper bounds shows almost no correlation over the systems and various Hamiltonian partitionings. On the other hand, a Trotter approximation error estimate based on perturbation theory up to second order in the time-step for eigenvalues correlates very well with the true Trotter approximation errors. The developed perturbative estimates can be used in practical time-step and Hamiltonian partitioning selection protocols, which are paramount for an accurate assessment of the resources needed to estimate energy eigenvalues to a target accuracy.
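The quantity studied in the abstract can be computed directly for a small model system: the effective Hamiltonian of one first-order Trotter step, H_eff = (i/dt) log(exp(-i H_A dt) exp(-i H_B dt)), is diagonalized and its lowest eigenvalue is compared with that of H = H_A + H_B. The sketch below uses a hypothetical two-fragment random Hermitian Hamiltonian as a stand-in for the molecular Hamiltonians and partitionings treated in the paper; it is an illustration of the error definition, not the paper's estimation protocol.

```python
import numpy as np
from scipy.linalg import expm, logm

def rand_herm(n, rng):
    """Random Hermitian matrix (stand-in for a Hamiltonian fragment)."""
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (m + m.conj().T) / 2

def trotter_eigenvalue_error(HA, HB, dt):
    """Exact first-order Trotter error in the lowest eigenvalue.

    Compares the ground eigenvalue of H = HA + HB with that of the
    effective Hamiltonian H_eff = (i/dt) log(exp(-i HA dt) exp(-i HB dt)).
    """
    H = HA + HB
    U = expm(-1j * dt * HA) @ expm(-1j * dt * HB)
    H_eff = 1j * logm(U) / dt
    H_eff = (H_eff + H_eff.conj().T) / 2  # symmetrize away numerical noise
    e_exact = np.linalg.eigvalsh(H)[0]
    e_trotter = np.linalg.eigvalsh(H_eff)[0]
    return abs(e_trotter - e_exact)

rng = np.random.default_rng(7)
HA, HB = rand_herm(4, rng), rand_herm(4, rng)
err_coarse = trotter_eigenvalue_error(HA, HB, 0.2)
err_fine = trotter_eigenvalue_error(HA, HB, 0.1)
```

For a first-order formula the leading eigenvalue error comes from the (dt/2)(-i)[H_A, H_B] term of the Baker-Campbell-Hausdorff expansion, so halving the time-step should roughly halve the error; the second-order perturbative estimates discussed in the abstract refine exactly this picture.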
Related papers
- Semiparametric conformal prediction [79.6147286161434]
Risk-sensitive applications require well-calibrated prediction sets over multiple, potentially correlated target variables.
We treat the scores as random vectors and aim to construct the prediction set accounting for their joint correlation structure.
We report desired coverage and competitive efficiency on a range of real-world regression problems.
arXiv Detail & Related papers (2024-11-04T14:29:02Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - A Parameter-Free Two-Bit Covariance Estimator with Improved Operator
Norm Error Rate [27.308933056578212]
We propose a new 2-bit covariance matrix estimator that simultaneously addresses both issues.
By employing dithering scales varying across entries, our estimator enjoys an improved operator norm error rate.
Our proposed method eliminates the need for any tuning parameter, as the dithering scales are entirely determined by the data.
arXiv Detail & Related papers (2023-08-30T14:31:24Z) - Leveraging Variational Autoencoders for Parameterized MMSE Estimation [10.141454378473972]
We propose a variational autoencoder-based framework for parameterizing a conditional linear minimum mean squared error estimator.
The derived estimator is shown to approximate the minimum mean squared error estimator by utilizing the variational autoencoder as a generative prior for the estimation problem.
We conduct a rigorous analysis by bounding the difference between the proposed and the minimum mean squared error estimator.
arXiv Detail & Related papers (2023-07-11T15:41:34Z) - How Good are Low-Rank Approximations in Gaussian Process Regression? [28.392890577684657]
We provide guarantees for approximate Gaussian Process (GP) regression resulting from two common low-rank kernel approximations.
We provide experiments on both simulated data and standard benchmarks to evaluate the effectiveness of our theoretical bounds.
arXiv Detail & Related papers (2021-12-13T04:04:08Z) - A Partially Random Trotter Algorithm for Quantum Hamiltonian Simulations [31.761854762513337]
Given the Hamiltonian, the evaluation of unitary operators has been at the heart of many quantum algorithms.
Motivated by existing deterministic and random methods, we present a hybrid approach.
arXiv Detail & Related papers (2021-09-16T13:53:12Z) - Rao-Blackwellizing the Straight-Through Gumbel-Softmax Gradient
Estimator [93.05919133288161]
We show that the variance of the straight-through variant of the popular Gumbel-Softmax estimator can be reduced through Rao-Blackwellization.
This provably reduces the mean squared error.
We empirically demonstrate that this leads to variance reduction, faster convergence, and generally improved performance in two unsupervised latent variable models.
arXiv Detail & Related papers (2020-10-09T22:54:38Z) - Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
arXiv Detail & Related papers (2020-05-20T15:01:03Z) - Machine learning for causal inference: on the use of cross-fit
estimators [77.34726150561087]
Doubly-robust cross-fit estimators have been proposed to yield better statistical properties.
We conducted a simulation study to assess the performance of several estimators for the average causal effect (ACE).
When used with machine learning, the doubly-robust cross-fit estimators substantially outperformed all of the other estimators in terms of bias, variance, and confidence interval coverage.
arXiv Detail & Related papers (2020-04-21T23:09:55Z) - How Good are Low-Rank Approximations in Gaussian Process Regression? [24.09582049403961]
We provide guarantees for approximate Gaussian Process (GP) regression resulting from two common low-rank kernel approximations.
We provide experiments on both simulated data and standard benchmarks to evaluate the effectiveness of our theoretical bounds.
arXiv Detail & Related papers (2020-04-03T14:15:10Z) - Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on the k nearest neighbor between these samples.
arXiv Detail & Related papers (2020-02-26T16:37:37Z)
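The k-nearest-neighbor KL estimator mentioned in the last entry can be sketched in a few lines. The function below is a generic illustration of the standard k-NN construction (compare each sample's k-th neighbor distance within its own sample set to that in the other set); it is not the exact estimator analyzed in that paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=5):
    """k-NN estimate of KL(P||Q) from samples x ~ P and y ~ Q."""
    n, d = x.shape
    m = y.shape[0]
    # rho: distance from each x_i to its k-th nearest neighbor among the
    # other x samples (query k+1 because x_i itself is its own 0-th neighbor)
    rho = cKDTree(x).query(x, k + 1)[0][:, -1]
    # nu: distance from each x_i to its k-th nearest neighbor among the y samples
    nu = cKDTree(y).query(x, k)[0]
    if k > 1:
        nu = nu[:, -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, size=(2000, 1))
q = rng.normal(1.0, 1.0, size=(2000, 1))
est = knn_kl_divergence(p, q)  # true KL(N(0,1) || N(1,1)) = 0.5
```

With a few thousand samples the estimate lands close to the analytic value of 0.5 for these two Gaussians; the minimax analysis in the cited paper concerns exactly how fast such estimators can converge.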
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.