Memory-Sample Lower Bounds for Learning with Classical-Quantum Hybrid Memory
- URL: http://arxiv.org/abs/2303.00209v1
- Date: Wed, 1 Mar 2023 03:22:26 GMT
- Title: Memory-Sample Lower Bounds for Learning with Classical-Quantum Hybrid Memory
- Authors: Qipeng Liu, Ran Raz, Wei Zhan
- Abstract summary: We prove that any quantum algorithm with both classical and quantum memory requires either $\Omega(n^2)$ bits of classical memory or $\Omega(n)$ bits of quantum memory or an exponential number of samples.
Our results refute the possibility that a small amount of quantum memory significantly reduces the size of classical memory needed for efficient learning on these problems.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In a work by Raz (J. ACM and FOCS 16), it was proved that any algorithm for
parity learning on $n$ bits requires either $\Omega(n^2)$ bits of classical
memory or an exponential number (in~$n$) of random samples. A line of recent
works continued that research direction and showed that for a large collection
of classical learning tasks, either super-linear classical memory size or
super-polynomially many samples are needed. However, these results do not
capture all physical computational models, notably quantum computers and the
use of quantum memory. This leaves open the possibility that a small amount of
quantum memory could significantly reduce the need for classical memory or
samples and thus completely change the nature of the classical learning task.
In this work, we prove that any quantum algorithm with both classical memory
and quantum memory, for parity learning on $n$ bits, requires either
$\Omega(n^2)$ bits of classical memory or $\Omega(n)$ bits of quantum memory or
an exponential number of samples. In other words, the memory-sample lower bound
for parity learning remains qualitatively the same, even if the learning
algorithm can use, in addition to the classical memory, a quantum memory of
size $c n$ (for some constant $c>0$).
Our results refute the possibility that a small amount of quantum memory
significantly reduces the size of classical memory needed for efficient
learning on these problems. Our results also imply improved security of several
existing cryptographic protocols in the bounded-storage model (protocols that
are based on parity learning on $n$ bits), proving that security holds even in
the presence of a quantum adversary with at most $c n^2$ bits of classical
memory and $c n$ bits of quantum memory (for some constant $c>0$).
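The parity-learning task behind these bounds is simple to state: a hidden string $x \in \{0,1\}^n$ must be recovered from a stream of samples $(a, \langle a, x\rangle \bmod 2)$ with $a$ uniformly random. As a point of reference (this sketch is not from the paper), a learner that stores full equations and runs Gaussian elimination over GF(2) succeeds with $O(n)$ samples, but keeping $\sim n$ length-$n$ equations is exactly the $\Theta(n^2)$-bits-of-classical-memory regime the lower bound says is unavoidable:

```python
import random


def parity_samples(x, rng):
    """Yield parity-learning samples (a, <a, x> mod 2) with uniformly random a."""
    n = len(x)
    while True:
        a = [rng.randrange(2) for _ in range(n)]
        b = sum(ai & xi for ai, xi in zip(a, x)) % 2
        yield a, b


def learn_parity(samples, n):
    """Recover the hidden x by online Gaussian elimination over GF(2).

    Storing ~n independent length-n equations takes Theta(n^2) bits, i.e.
    this learner sits in the large-classical-memory regime of the bound.
    """
    basis = {}  # pivot bit index -> (row mask, parity bit)
    while len(basis) < n:
        a, b = next(samples)
        mask = sum(ai << i for i, ai in enumerate(a))
        # Eliminate existing pivots, highest first, so each row with pivot
        # p can only introduce bits below p.
        for piv in sorted(basis, reverse=True):
            if (mask >> piv) & 1:
                pmask, pb = basis[piv]
                mask ^= pmask
                b ^= pb
        if mask:  # linearly independent: record with its highest bit as pivot
            basis[mask.bit_length() - 1] = (mask, b)
    # Back-substitution, lowest pivot first (each row only touches lower bits).
    x = [0] * n
    for piv in sorted(basis):
        pmask, pb = basis[piv]
        acc = pb
        for j in range(piv):
            if (pmask >> j) & 1:
                acc ^= x[j]
        x[piv] = acc
    return x
```

The lower bound says this memory cost is not an artifact of this particular algorithm: any learner using $o(n^2)$ classical bits, even with $cn$ qubits of quantum memory on the side, needs exponentially many samples.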
Related papers
- Dimension Independent and Computationally Efficient Shadow Tomography [0.0]
We describe a new shadow tomography algorithm that uses $n=\Theta(\sqrt{m}\log m/\epsilon^2)$ samples, for $m$ measurements and additive error $\epsilon$.
This stands in contrast to all previously known algorithms that improve upon the naive approach.
arXiv Detail & Related papers (2024-11-03T03:07:35Z) - Exponential learning advantages with conjugate states and minimal
quantum memory [0.0]
We investigate a new learning resource which could be available to quantum computers in the future.
For a certain shadow tomography task, we find that measurements on only copies of $\rho \otimes \rho^\ast$ can be exponentially more powerful than measurements on $\rho^{\otimes K}$.
We believe the advantage may find applications in improving quantum simulation, learning from quantum sensors, and uncovering new physical phenomena.
arXiv Detail & Related papers (2024-03-06T05:04:45Z) - Towards large-scale quantum optimization solvers with few qubits [59.63282173947468]
We introduce a variational quantum solver for optimizations over $m=\mathcal{O}(n^k)$ binary variables using only $n$ qubits, with tunable $k>1$.
We analytically prove that the specific qubit-efficient encoding brings in a super-polynomial mitigation of barren plateaus as a built-in feature.
arXiv Detail & Related papers (2024-01-17T18:59:38Z) - Cumulative Memory Lower Bounds for Randomized and Quantum Computation [1.52292571922932]
Cumulative memory is a measure of time-space complexity.
We prove the first lower bounds on cumulative memory complexity for both sequential classical computation and quantum circuits.
arXiv Detail & Related papers (2023-01-13T17:57:02Z) - Quantum Resources Required to Block-Encode a Matrix of Classical Data [56.508135743727934]
We provide circuit-level implementations and resource estimates for several methods of block-encoding a dense $N\times N$ matrix of classical data to precision $\epsilon$.
We examine resource tradeoffs between the different approaches and explore implementations of two separate models of quantum random access memory (QRAM).
Our results go beyond simple query complexity and provide a clear picture into the resource costs when large amounts of classical data are assumed to be accessible to quantum algorithms.
arXiv Detail & Related papers (2022-06-07T18:00:01Z) - Entanglement and coherence in Bernstein-Vazirani algorithm [58.720142291102135]
The Bernstein-Vazirani algorithm allows one to determine a bit string encoded in an oracle.
We analyze in detail the quantum resources in the Bernstein-Vazirani algorithm.
We show that in the absence of entanglement, the performance of the algorithm is directly related to the amount of quantum coherence in the initial state.
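For context (a minimal sketch, not taken from the paper above): Bernstein-Vazirani recovers a secret string $s$ from a single query to the phase oracle $|a\rangle \mapsto (-1)^{s\cdot a}|a\rangle$, by sandwiching the oracle between two layers of Hadamards. A pure-Python statevector simulation makes the deterministic outcome explicit:

```python
def walsh_hadamard(v):
    """In-place fast Walsh-Hadamard transform; equals H applied to every qubit."""
    v = list(v)
    h = 1
    while h < len(v):
        for i in range(0, len(v), h * 2):
            for j in range(i, i + h):
                x, y = v[j], v[j + h]
                v[j], v[j + h] = (x + y) / 2 ** 0.5, (x - y) / 2 ** 0.5
        h *= 2
    return v


def bernstein_vazirani(s_bits):
    """Simulate Bernstein-Vazirani for secret s; returns s as an integer."""
    n = len(s_bits)
    N = 1 << n
    s = sum(bit << i for i, bit in enumerate(s_bits))
    # H^n on |0...0> yields the uniform superposition.
    state = [1.0 / (N ** 0.5)] * N
    # One phase-oracle query: |a> -> (-1)^{s.a} |a>.
    for a in range(N):
        if bin(s & a).count("1") % 2:
            state[a] = -state[a]
    # Final H^n layer concentrates all amplitude on |s>.
    state = walsh_hadamard(state)
    return max(range(N), key=lambda a: abs(state[a]))
```

After the final Hadamard layer the state is exactly $|s\rangle$, so measurement returns $s$ with certainty; the entanglement/coherence analysis in the paper above asks which quantum resources actually account for this one-query advantage.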
arXiv Detail & Related papers (2022-05-26T20:32:36Z) - Memory Compression with Quantum Random-Access Gates [0.0]
We show an analogous result for quantum algorithms equipped with quantum random-access gates.
It is often possible to construct a space-inefficient, yet sparse, quantum data structure.
arXiv Detail & Related papers (2022-03-10T19:27:53Z) - Exponential separations between learning with and without quantum memory [17.763817187554096]
We study the power of quantum memory for learning properties of quantum systems and dynamics.
Many state-of-the-art learning algorithms require access to an additional external quantum memory.
We show that this trade-off is inherent in a wide range of learning problems.
arXiv Detail & Related papers (2021-11-10T19:03:49Z) - Kanerva++: extending The Kanerva Machine with differentiable, locally
block allocated latent memory [75.65949969000596]
Episodic and semantic memory are critical components of the human memory model.
We develop a new principled Bayesian memory allocation scheme that bridges the gap between episodic and semantic memory.
We demonstrate that this allocation scheme improves performance in memory conditional image generation.
arXiv Detail & Related papers (2021-02-20T18:40:40Z) - Quantum Gram-Schmidt Processes and Their Application to Efficient State
Read-out for Quantum Algorithms [87.04438831673063]
We present an efficient read-out protocol that yields the classical vector form of the generated state.
Our protocol suits the case that the output state lies in the row space of the input matrix.
One of our technical tools is an efficient quantum algorithm for performing the Gram-Schmidt orthonormal procedure.
arXiv Detail & Related papers (2020-04-14T11:05:26Z) - Quantum Coupon Collector [62.58209964224025]
We study how efficiently a $k$-element set $S\subseteq[n]$ can be learned from a uniform superposition $|S\rangle$ of its elements.
We give tight bounds on the number of quantum samples needed for every $k$ and $n$, and we give efficient quantum learning algorithms.
arXiv Detail & Related papers (2020-02-18T16:14:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.