Learning the structure of any Hamiltonian from minimal assumptions
- URL: http://arxiv.org/abs/2410.21635v1
- Date: Tue, 29 Oct 2024 00:43:33 GMT
- Title: Learning the structure of any Hamiltonian from minimal assumptions
- Authors: Andrew Zhao
- Abstract summary: We study the problem of learning an unknown quantum many-body Hamiltonian $H$ from black-box queries to its time evolution.
We present efficient algorithms to learn any $n$-qubit Hamiltonian, assuming only a bound on the number of Hamiltonian terms.
- Score: 2.810160553339817
- License:
- Abstract: We study the problem of learning an unknown quantum many-body Hamiltonian $H$ from black-box queries to its time evolution $e^{-\mathrm{i} H t}$. Prior proposals for solving this task either impose some assumptions on $H$, such as its interaction structure or locality, or otherwise use an exponential amount of computational postprocessing. In this paper, we present efficient algorithms to learn any $n$-qubit Hamiltonian, assuming only a bound on the number of Hamiltonian terms, $m \leq \mathrm{poly}(n)$. Our algorithms do not need to know the terms in advance, nor are they restricted to local interactions. We consider two models of control over the time evolution: the first has access to time reversal ($t < 0$), enabling an algorithm that outputs an $\epsilon$-accurate classical description of $H$ after querying its dynamics for a total of $\widetilde{O}(m/\epsilon)$ evolution time. The second access model is more conventional, allowing only forward-time evolutions; our algorithm requires $\widetilde{O}(\|H\|^3/\epsilon^4)$ evolution time in this setting. Central to our results is the recently introduced concept of a pseudo-Choi state of $H$. We extend the utility of this learning resource by showing how to use it to learn the Fourier spectrum of $H$, how to achieve nearly Heisenberg-limited scaling with it, and how to prepare it even under our more restricted access models.
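To make the access model concrete, below is a minimal sketch (not the paper's algorithm, which relies on pseudo-Choi states): a small-$n$ classical simulation in which the unknown Hamiltonian is a sum of $m$ Pauli terms, $H = \sum_{a=1}^{m} \lambda_a P_a$, and the learner may only query the black-box time evolution $e^{-\mathrm{i} H t}$. The helper names (`random_pauli_hamiltonian`, `evolution_oracle`) are hypothetical and exist only for this illustration.

```python
# Toy illustration of the query access model described in the abstract.
# The learner never sees H directly, only the unitary e^{-iHt} for chosen t.
import numpy as np
from functools import reduce
from scipy.linalg import expm

PAULIS = {
    "I": np.eye(2, dtype=complex),
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]], dtype=complex),
    "Z": np.array([[1, 0], [0, -1]], dtype=complex),
}

def pauli_string(label):
    """Tensor product of single-qubit Paulis, e.g. 'XZI' on 3 qubits."""
    return reduce(np.kron, (PAULIS[c] for c in label))

def random_pauli_hamiltonian(n, m, rng):
    """H = sum of m random (not necessarily local) Pauli terms on n qubits."""
    terms = []
    while len(terms) < m:
        label = "".join(rng.choice(list("IXYZ"), size=n))
        if label != "I" * n:                  # skip the identity term
            terms.append((rng.uniform(-1, 1), label))
    H = sum(c * pauli_string(p) for c, p in terms)
    return H, terms

def evolution_oracle(H, allow_time_reversal=False):
    """Black-box query model: the learner only gets e^{-iHt}, never H itself."""
    def query(t):
        if t < 0 and not allow_time_reversal:
            raise ValueError("forward-time access model: t must be >= 0")
        return expm(-1j * H * t)
    return query

rng = np.random.default_rng(0)
H, terms = random_pauli_hamiltonian(n=3, m=4, rng=rng)   # hidden from the learner
query = evolution_oracle(H, allow_time_reversal=True)
U = query(0.1)                                           # one dynamics query
print("query is unitary:", np.allclose(U @ U.conj().T, np.eye(8)))
```

The `allow_time_reversal` flag mirrors the two access models in the abstract: with it, queries at $t < 0$ are permitted (the $\widetilde{O}(m/\epsilon)$ regime); without it, only forward-time evolutions are available.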
Related papers
- Learning $k$-body Hamiltonians via compressed sensing [0.5867838258848337]
We study the problem of learning a $k$-body Hamiltonian with $M$ unknown Pauli terms that are not necessarily geometrically local.
We propose a protocol that learns the Hamiltonian to precision $\epsilon$ with a bounded total evolution time.
arXiv Detail & Related papers (2024-10-24T17:16:19Z) - Predicting Ground State Properties: Constant Sample Complexity and Deep Learning Algorithms [48.869199703062606]
A fundamental problem in quantum many-body physics is that of finding ground states of local Hamiltonians.
We introduce two approaches that achieve a constant sample complexity, independent of system size $n$, for learning ground state properties.
arXiv Detail & Related papers (2024-05-28T18:00:32Z) - Structure learning of Hamiltonians from real-time evolution [22.397920564324973]
We present a new, general approach to Hamiltonian learning that not only solves the challenging structure learning variant, but also resolves other open questions in the area.
Our algorithm recovers the Hamiltonian to $\varepsilon$ error with total evolution time $O(\log(n)/\varepsilon)$, and it has a number of other appealing properties.
As an application, we can also learn Hamiltonians exhibiting power-law decay up to accuracy $\varepsilon$ with total evolution time beating the standard limit of $1/\varepsilon^2$.
arXiv Detail & Related papers (2024-04-30T18:00:00Z) - A polynomial-time dissipation-based quantum algorithm for solving the ground states of a class of classically hard Hamiltonians [4.500918096201963]
We give a quantum algorithm for solving the ground states of a class of Hamiltonians.
The mechanism of the exponential speedup that appeared in our algorithm comes from dissipation in open quantum systems.
arXiv Detail & Related papers (2024-01-25T05:01:02Z) - Learning many-body Hamiltonians with Heisenberg-limited scaling [3.460138063155115]
We propose the first algorithm to achieve the Heisenberg limit for learning an interacting $N$-qubit local Hamiltonian.
After a total evolution time of $\mathcal{O}(\epsilon^{-1})$, the proposed algorithm can efficiently estimate any parameter in the $N$-qubit Hamiltonian to $\epsilon$-error with high probability.
arXiv Detail & Related papers (2022-10-06T16:30:51Z) - Cryptographic Hardness of Learning Halfspaces with Massart Noise [59.8587499110224]
We study the complexity of PAC learning halfspaces in the presence of Massart noise.
We show that no polynomial-time Massart halfspace learners can achieve error better than $\Omega(\eta)$, even if the optimal 0-1 error is small.
arXiv Detail & Related papers (2022-07-28T17:50:53Z) - Quantum Resources Required to Block-Encode a Matrix of Classical Data [56.508135743727934]
We provide circuit-level implementations and resource estimates for several methods of block-encoding a dense $N \times N$ matrix of classical data to precision $\epsilon$.
We examine resource tradeoffs between the different approaches and explore implementations of two separate models of quantum random access memory (QRAM).
Our results go beyond simple query complexity and provide a clear picture into the resource costs when large amounts of classical data are assumed to be accessible to quantum algorithms.
arXiv Detail & Related papers (2022-06-07T18:00:01Z) - Threshold Phenomena in Learning Halfspaces with Massart Noise [56.01192577666607]
We study the problem of PAC learning halfspaces on $\mathbb{R}^d$ with Massart noise under Gaussian marginals.
Our results qualitatively characterize the complexity of learning halfspaces in the Massart model.
arXiv Detail & Related papers (2021-08-19T16:16:48Z) - Optimal learning of quantum Hamiltonians from high-temperature Gibbs states [0.9453554184019105]
We show how to learn the coefficients of a Hamiltonian to error $\varepsilon$ with sample complexity $S = O(\log N/(\beta\varepsilon)^2)$ and time linear in the sample size, $O(S N)$.
In the appendix, we show that virtually the same algorithm can be used to learn $H$ from a real-time evolution unitary $e^{-\mathrm{i} t H}$ in a small-$t$ regime with similar sample and time complexity.
arXiv Detail & Related papers (2021-08-10T18:00:49Z) - Hardness of Learning Halfspaces with Massart Noise [56.98280399449707]
We study the complexity of PAC learning halfspaces in the presence of Massart (bounded) noise.
We show that there is an exponential gap between the information-theoretically optimal error and the best error that can be achieved by an SQ algorithm.
arXiv Detail & Related papers (2020-12-17T16:43:11Z) - Model-Free Reinforcement Learning: from Clipped Pseudo-Regret to Sample Complexity [59.34067736545355]
Given an MDP with $S$ states, $A$ actions, the discount factor $\gamma \in (0,1)$, and an approximation threshold $\epsilon > 0$, we provide a model-free algorithm to learn an $\epsilon$-optimal policy.
For small enough $\epsilon$, we show an algorithm with improved sample complexity.
arXiv Detail & Related papers (2020-06-06T13:34:41Z)