Contextuality and inductive bias in quantum machine learning
- URL: http://arxiv.org/abs/2302.01365v3
- Date: Tue, 18 Apr 2023 08:15:59 GMT
- Title: Contextuality and inductive bias in quantum machine learning
- Authors: Joseph Bowles, Victoria J Wright, Máté Farkas, Nathan Killoran, Maria Schuld
- Abstract summary: Generalisation in machine learning often relies on the ability to encode structures present in data into an inductive bias of the model class.
We look at quantum contextuality -- a form of nonclassicality with links to computational advantage.
We show how to construct quantum learning models with the associated inductive bias, and show through our toy problem that they outperform their corresponding classical surrogate models.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generalisation in machine learning often relies on the ability to encode
structures present in data into an inductive bias of the model class. To
understand the power of quantum machine learning, it is therefore crucial to
identify the types of data structures that lend themselves naturally to quantum
models. In this work we look to quantum contextuality -- a form of
nonclassicality with links to computational advantage -- for answers to this
question. We introduce a framework for studying contextuality in machine
learning, which leads us to a definition of what it means for a learning model
to be contextual. From this, we connect a central concept of contextuality,
called operational equivalence, to the ability of a model to encode a linearly
conserved quantity in its label space. A consequence of this connection is that
contextuality is tied to expressivity: contextual model classes that encode the
inductive bias are generally more expressive than their noncontextual
counterparts. To demonstrate this, we construct an explicit toy learning
problem -- based on learning the payoff behaviour of a zero-sum game -- for
which this is the case. By leveraging tools from geometric quantum machine
learning, we then describe how to construct quantum learning models with the
associated inductive bias, and show through our toy problem that they
outperform their corresponding classical surrogate models. This suggests that
understanding learning problems of this form may lead to useful insights about
the power of quantum machine learning.
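The inductive bias described in the abstract, a linearly conserved quantity in the model's label space, can be illustrated with a short sketch. This is not the paper's construction; it is a minimal classical analogue in which a zero-sum constraint (as in the payoff vector of a zero-sum game) is built into the model class by construction. The helper names `zero_sum_projector` and `biased_model` are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_sum_projector(k):
    """Projector onto the subspace of R^k whose entries sum to zero."""
    return np.eye(k) - np.ones((k, k)) / k

def biased_model(x, W, k=3):
    """Linear model whose k outputs always sum to zero: the linearly
    conserved quantity sum_i y_i = 0 is encoded into the model class
    itself, rather than learned from data."""
    raw = W @ x                           # unconstrained linear map
    return zero_sum_projector(k) @ raw    # enforce the conserved quantity

x = rng.normal(size=5)
W = rng.normal(size=(3, 5))
y = biased_model(x, W)
print(y.sum())  # numerically zero for any x and W
```

Every model in this restricted class satisfies the constraint exactly, mirroring the abstract's point that encoding such a bias changes the model class rather than the training procedure.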
Related papers
- Hybrid Quantum-Classical Machine Learning with String Diagrams [49.1574468325115]
This paper develops a formal framework for describing hybrid algorithms in terms of string diagrams.
A notable feature of our string diagrams is the use of functor boxes, which correspond to quantum-classical interfaces.
arXiv Detail & Related papers (2024-07-04T06:37:16Z) - Binding Dynamics in Rotating Features [72.80071820194273]
We propose an alternative "cosine binding" mechanism, which explicitly computes the alignment between features and adjusts weights accordingly.
This allows us to draw direct connections to self-attention and biological neural processes, and to shed light on the fundamental dynamics for object-centric representations to emerge in Rotating Features.
arXiv Detail & Related papers (2024-02-08T12:31:08Z) - Synergy of machine learning with quantum computing and communication [0.0]
Machine learning in quantum computing and communication provides opportunities for revolutionizing the fields of physics, mathematics, and computer science.
This paper gives a comprehensive review of state-of-the-art approaches in quantum computing and quantum communication in the context of Artificial Intelligence and machine learning models.
arXiv Detail & Related papers (2023-10-05T10:18:39Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
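The classical-shadows ingredient of the paradigm above can be sketched in a few lines. This is a minimal single-qubit shadow estimate of a known state, without the neural-network component the paper adds; the choice of the state |+> and the snapshot count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Measurement bases for the random Pauli ensemble: rotate X, Y, Z
# eigenbases into the computational basis.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # X basis
Sdg = np.diag([1.0, -1.0j])
HSdg = H @ Sdg                                  # Y basis
I = np.eye(2)
bases = [H, HSdg, I]                            # X, Y, Z

plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())               # state to be estimated

def snapshot(rho, U, rng):
    """One classical-shadow snapshot: measure rho in basis U, then apply
    the inverse of the single-qubit measurement channel."""
    probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, None)
    b = rng.choice(2, p=probs / probs.sum())
    eb = np.zeros(2)
    eb[b] = 1.0
    return 3 * (U.conj().T @ np.outer(eb, eb) @ U) - I

est = sum(snapshot(rho, bases[rng.integers(3)], rng)
          for _ in range(20000)) / 20000
print(np.linalg.norm(est - rho))  # small: the shadow average converges to rho
```

Each snapshot is an unbiased (but noisy) estimator of rho, so averaging many of them recovers the state; the paper's contribution is feeding such shadow data into trainable neural networks.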
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Exponential separations between classical and quantum learners [0.0]
We discuss how subtle differences in definitions can result in significantly different requirements and tasks for the learner to meet and solve.
We present two new learning separations where the classical difficulty primarily lies in identifying the function generating the data.
arXiv Detail & Related papers (2023-06-28T08:55:56Z) - Explainable Representation Learning of Small Quantum States [0.0]
We train a generative model on two-qubit density matrices generated by a parameterized quantum circuit.
We observe that the model learns an interpretable representation which relates the quantum states to their underlying entanglement characteristics.
Our approach offers insight into how machines learn to represent small-scale quantum systems autonomously.
arXiv Detail & Related papers (2023-06-09T06:30:25Z) - The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for
Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating those aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
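The data re-uploading technique mentioned above can be sketched without a quantum SDK. This is a minimal NumPy simulation of a single-qubit re-uploading model (the paper itself uses Qiskit); the layer count, angles, and function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis (real 2x2 matrix)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def reupload_model(x, thetas):
    """Data re-uploading: interleave data rotations RY(x) with trainable
    rotations RY(theta_l) on one qubit, then return the Z expectation."""
    state = np.array([1.0, 0.0])            # start in |0>
    for theta in thetas:
        state = ry(theta) @ ry(x) @ state   # re-encode the data each layer
    return state[0] ** 2 - state[1] ** 2    # <Z> for a real-amplitude state

out = reupload_model(0.7, thetas=[0.1, -0.4, 0.9])
print(out)  # a value in [-1, 1], usable as a binary-classification score
```

Re-encoding the input at every layer is what lets a single qubit represent non-trivial functions of x, which is the central point of the data re-uploading construction.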
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - On establishing learning separations between classical and quantum
machine learning with classical data [0.0]
We discuss the challenges of finding learning problems that quantum learning algorithms can learn much faster than any classical learning algorithm.
We study existing learning problems with a provable quantum speedup to distill sets of more general and sufficient conditions.
These checklists are intended to streamline one's approach to proving quantum speedups for learning problems, or to elucidate bottlenecks.
arXiv Detail & Related papers (2022-08-12T16:00:30Z) - Neuro-symbolic Architectures for Context Understanding [59.899606495602406]
We propose the use of hybrid AI methodology as a framework for combining the strengths of data-driven and knowledge-driven approaches.
Specifically, we inherit the concept of neuro-symbolism as a way of using knowledge bases to guide the learning process of deep neural networks.
arXiv Detail & Related papers (2020-03-09T15:04:07Z) - Quantum Adversarial Machine Learning [0.0]
Adversarial machine learning is an emerging field that studies the vulnerabilities of machine learning approaches in adversarial settings.
In this paper, we explore different adversarial scenarios in the context of quantum machine learning.
We find that a quantum classifier that achieves nearly the state-of-the-art accuracy can be conclusively deceived by adversarial examples.
arXiv Detail & Related papers (2019-12-31T19:00:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.