Interpretable Quantum Advantage in Neural Sequence Learning
- URL: http://arxiv.org/abs/2209.14353v1
- Date: Wed, 28 Sep 2022 18:34:04 GMT
- Title: Interpretable Quantum Advantage in Neural Sequence Learning
- Authors: Eric R. Anschuetz and Hong-Ye Hu and Jin-Long Huang and Xun Gao
- Abstract summary: We study the relative expressive power between a broad class of neural network sequence models and a class of recurrent models based on Gaussian operations with non-Gaussian measurements.
We pinpoint quantum contextuality as the source of an unconditional memory separation in the expressivity of the two model classes.
In doing so, we demonstrate that our introduced quantum models are able to outperform state-of-the-art classical models even in practice.
- Score: 2.575030923243061
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum neural networks have been widely studied in recent years, given their
potential practical utility and recent results regarding their ability to
efficiently express certain classical data. However, analytic results to date
rely on assumptions and arguments from complexity theory. Due to this, there is
little intuition as to the source of the expressive power of quantum neural
networks or for which classes of classical data any advantage can be reasonably
expected to hold. Here, we study the relative expressive power between a broad
class of neural network sequence models and a class of recurrent models based
on Gaussian operations with non-Gaussian measurements. We explicitly show that
quantum contextuality is the source of an unconditional memory separation in
the expressivity of the two model classes. Additionally, as we are able to
pinpoint quantum contextuality as the source of this separation, we use this
intuition to study the relative performance of our introduced model on a
standard translation data set exhibiting linguistic contextuality. In doing so,
we demonstrate that our introduced quantum models are able to outperform
state-of-the-art classical models even in practice.
Related papers
- When can classical neural networks represent quantum states? [0.24749083496491683]
A naive classical representation of an n-qubit state requires specifying exponentially many amplitudes in the computational basis.
Past works have demonstrated that classical neural networks can succinctly express these amplitudes for many physically relevant states.
We show that conditional correlations present in the measurement distribution of quantum states control the performance of their neural representations.
arXiv Detail & Related papers (2024-10-30T16:06:53Z) - Arbitrary Polynomial Separations in Trainable Quantum Machine Learning [1.0080317855851213]
Recent theoretical results in quantum machine learning have demonstrated a general trade-off between the expressive power of quantum neural networks (QNNs) and their trainability.
We here circumvent these negative results by constructing a hierarchy of efficiently trainable QNNs that exhibit unconditionally provable memory separations.
We show that quantum contextuality is the source of the expressivity separation, suggesting that other classical sequence learning problems with long-time correlations may be a regime where practical advantages in quantum machine learning may exist.
arXiv Detail & Related papers (2024-02-13T17:12:01Z) - Exponential Quantum Communication Advantage in Distributed Inference and Learning [19.827903766111987]
We present a framework for distributed computation over a quantum network.
We show that for models within this framework, inference and training using gradient descent can be performed with exponentially less communication.
We also show that models in this class can encode highly nonlinear features of their inputs, and their expressivity increases exponentially with model depth.
arXiv Detail & Related papers (2023-10-11T02:19:50Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from the severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z) - Quantum Self-Supervised Learning [22.953284192004034]
We propose a hybrid quantum-classical neural network architecture for contrastive self-supervised learning.
We apply our best quantum model to classify unseen images on the ibmq_paris quantum computer.
arXiv Detail & Related papers (2021-03-26T18:00:00Z) - Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modeling using samples drawn from the probability distribution constitutes a powerful approach for unsupervised machine learning.
We show theoretically that such quantum correlations provide a powerful resource for generative modeling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.