Quantum statistical query learning
- URL: http://arxiv.org/abs/2002.08240v2
- Date: Tue, 24 Nov 2020 21:26:33 GMT
- Title: Quantum statistical query learning
- Authors: Srinivasan Arunachalam, Alex B. Grilo, Henry Yuen
- Abstract summary: We propose a learning model called the quantum statistical query (QSQ) model.
Our model can be seen as a restriction of the quantum PAC learning model.
We show that learnability in the QSQ model implies learnability in the quantum private PAC model.
- Score: 6.434862450258459
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a learning model called the quantum statistical query (QSQ)
model, which extends the SQ learning model introduced by Kearns to the quantum
setting. Our model can also be seen as a restriction of the quantum PAC
learning model: here, the learner does not have direct access to quantum
examples, but can only obtain estimates of measurement statistics on them.
Theoretically, this model provides a simple yet expressive setting to explore
the power of quantum examples in machine learning. From a practical
perspective, since simpler operations are required, learning algorithms in the
QSQ model are more feasible for implementation on near-term quantum devices. We
prove a number of results about the QSQ learning model. We first show that
parity functions, (log n)-juntas and polynomial-sized DNF formulas are
efficiently learnable in the QSQ model, in contrast to the classical setting
where these problems are provably hard. This implies that many of the
advantages of quantum PAC learning can be realized even in the more restricted
quantum SQ learning model. It is well-known that weak statistical query
dimension, denoted by WSQDIM(C), characterizes the complexity of learning a
concept class C in the classical SQ model. We show that log(WSQDIM(C)) is a
lower bound on the complexity of QSQ learning, and furthermore it is tight for
certain concept classes C. Additionally, we show that this quantity provides
strong lower bounds for the small-bias quantum communication model under
product distributions. Finally, we introduce the notion of private quantum PAC
learning, in which a quantum PAC learner is required to be differentially
private. We show that learnability in the QSQ model implies learnability in the
quantum private PAC model. Additionally, we show that in the private PAC
learning setting, the classical and quantum sample complexities are equal, up
to constant factors.
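The abstract's starting point is Kearns' classical SQ oracle, which the QSQ model generalizes: instead of seeing labeled examples, the learner submits bounded query functions and receives estimates of their expectations up to a tolerance. A minimal classical simulation of this oracle is sketched below; the helper names and the toy one-relevant-bit concept are illustrative assumptions, not constructions from the paper.

```python
import random

def make_sq_oracle(concept, dist_sampler, tau, n_samples=10000):
    """Simulate the classical oracle STAT(c, D): given a query
    phi: (x, label) -> [0, 1] and tolerance tau, return an estimate
    of E_{x ~ D}[phi(x, c(x))] accurate to within roughly tau."""
    def oracle(phi):
        xs = [dist_sampler() for _ in range(n_samples)]
        est = sum(phi(x, concept(x)) for x in xs) / n_samples
        # An adversarial SQ oracle may perturb its answer by up to tau;
        # we model that with uniform noise.
        return est + random.uniform(-tau, tau)
    return oracle

# Toy example: identify the single relevant bit of a 1-junta
# under the uniform distribution on {0, 1}^n.
n = 4
concept = lambda x: x[0]                       # target: first input bit
dist = lambda: tuple(random.randint(0, 1) for _ in range(n))
oracle = make_sq_oracle(concept, dist, tau=0.01)

# Query the agreement of each input bit with the label; the relevant
# bit agrees with probability 1, the others with probability 1/2.
correlations = [oracle(lambda x, y, i=i: 1.0 if x[i] == y else 0.0)
                for i in range(n)]
relevant = max(range(n), key=lambda i: correlations[i])
```

Learning parities this way is provably hard classically (exponentially many such queries are needed), which is exactly the kind of separation the paper revisits: in the QSQ model, where the queries are measurement statistics on quantum examples, parities become efficiently learnable.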
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- A Quantum Leaky Integrate-and-Fire Spiking Neuron and Network [0.0]
We introduce a new software model for quantum neuromorphic computing.
We use these neurons as building blocks in the construction of a quantum spiking neural network (QSNN) and a quantum spiking convolutional neural network (QSCNN).
arXiv Detail & Related papers (2024-06-19T20:17:18Z)
- Single-shot quantum machine learning [0.3277163122167433]
We analyze when quantum learning models can produce predictions in a near-deterministic way.
We show that the degree to which a quantum learning model is near-deterministic is constrained by the distinguishability of the embedded quantum states.
We conclude by showing that quantum learning models cannot be single-shot in a generic way and trainable at the same time.
arXiv Detail & Related papers (2024-06-19T20:17:18Z)
- Separable Power of Classical and Quantum Learning Protocols Through the Lens of No-Free-Lunch Theorem [70.42372213666553]
The No-Free-Lunch (NFL) theorem quantifies problem- and data-independent generalization errors regardless of the optimization process.
We categorize a diverse array of quantum learning algorithms into three learning protocols designed for learning quantum dynamics under a specified observable.
Our derived NFL theorems demonstrate quadratic reductions in sample complexity across CLC-LPs, ReQu-LPs, and Qu-LPs.
We attribute this performance discrepancy to the unique capacity of quantum-related learning protocols to indirectly utilize information concerning the global phases of non-orthogonal quantum states.
arXiv Detail & Related papers (2024-05-12T09:05:13Z)
- Quantum Inception Score [0.39102514525861415]
We propose the quantum inception score (qIS) for quantum generators.
qIS relates the quality to the Holevo information of the quantum channel that classifies a given dataset.
We apply qIS to assess the quality of the one-dimensional spin chain model as a quantum generative model.
arXiv Detail & Related papers (2023-11-20T20:28:30Z)
- Unifying (Quantum) Statistical and Parametrized (Quantum) Algorithms [65.268245109828]
We take inspiration from Kearns' SQ oracle and Valiant's weak evaluation oracle.
We introduce an extensive yet intuitive framework that yields unconditional lower bounds for learning from evaluation queries.
arXiv Detail & Related papers (2023-10-26T18:23:21Z)
- TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z)
- Recent Advances for Quantum Neural Networks in Generative Learning [98.88205308106778]
Quantum generative learning models (QGLMs) may surpass their classical counterparts.
We review the current progress of QGLMs from the perspective of machine learning.
We discuss the potential applications of QGLMs in both conventional machine learning tasks and quantum physics.
arXiv Detail & Related papers (2022-06-07T07:32:57Z)
- Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study the learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs).
We first analyze the generalization ability of QCBMs and identify their superiorities when the quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z)
- Private learning implies quantum stability [5.435793655810987]
Learning an unknown $n$-qubit quantum state $\rho$ is a fundamental challenge in quantum computing.
We prove a sequence of (information-theoretic) implications from differentially-private PAC learning, to communication complexity, to online learning and then to quantum stability.
arXiv Detail & Related papers (2021-02-14T15:13:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.