Quantum Self-Supervised Learning
- URL: http://arxiv.org/abs/2103.14653v1
- Date: Fri, 26 Mar 2021 18:00:00 GMT
- Title: Quantum Self-Supervised Learning
- Authors: Ben Jaderberg, Lewis W. Anderson, Weidi Xie, Samuel Albanie, Martin Kiffner, Dieter Jaksch
- Abstract summary: We propose a hybrid quantum-classical neural network architecture for contrastive self-supervised learning.
We apply our best quantum model to classify unseen images on the ibmq_paris quantum computer.
- Score: 22.953284192004034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The popularisation of neural networks has seen incredible advances in pattern
recognition, driven by the supervised learning of human annotations. However,
this approach is unsustainable given the dramatically increasing size
of real-world datasets. This has led to a resurgence in self-supervised
learning, a paradigm whereby the model generates its own supervisory signal
from the data. Here we propose a hybrid quantum-classical neural network
architecture for contrastive self-supervised learning and test its
effectiveness in proof-of-principle experiments. Interestingly, we observe a
numerical advantage for the learning of visual representations using
small-scale quantum neural networks over equivalently structured classical
networks, even when the quantum circuits are sampled with only 100 shots.
Furthermore, we apply our best quantum model to classify unseen images on the
ibmq_paris quantum computer and find that current noisy devices can already
achieve accuracy equal to that of the equivalent classical model on downstream tasks.
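At the core of this approach is a SimCLR-style contrastive objective. As a minimal illustration (not the paper's code), the NT-Xent loss can be written in a few lines of PyTorch; here z1 and z2 stand for projections of two augmented views of the same image batch, which in the hybrid architecture would come from an encoder whose projection head ends in a small quantum circuit. The function name and temperature are illustrative choices:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over two views of the same batch.
    z1, z2: (B, d) projections, e.g. from an encoder whose projection
    head ends in a small quantum circuit in the hybrid model."""
    b = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2B, d), unit norm
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                   # never match a view to itself
    # The positive for row i is its other view, at index (i + B) mod 2B.
    targets = torch.cat([torch.arange(b) + b, torch.arange(b)])
    return F.cross_entropy(sim, targets)

# Example: random projections for a batch of 8 images.
loss = nt_xent_loss(torch.randn(8, 16), torch.randn(8, 16))
```

Minimising this loss pulls the two views of each image together while pushing apart all other images in the batch; the learned representations are then evaluated on a downstream classification task.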
Related papers
- Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
arXiv Detail & Related papers (2023-11-23T05:06:43Z)
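For the distillation entry above, a minimal sketch of the standard Hinton-style soft-label objective it builds on; the function name, temperature T, and mixing weight alpha are illustrative assumptions, not the paper's settings:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-label knowledge distillation: the student (here a small quantum
    model) mimics the softened outputs of a classical teacher such as LeNet
    or AlexNet, blended with the usual hard-label cross entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```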
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
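For intuition on the classical shadows that the ShadowNet entry above builds on, here is a single-qubit NumPy sketch: each random Pauli-basis measurement is inverted into a snapshot 3 * U^dag |b><b| U - I, and the snapshots average to an unbiased estimate of the state. Names and the snapshot count are illustrative, and this is not the paper's multi-qubit protocol:

```python
import numpy as np

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # rotates the X basis to Z
HSdg = np.array([[1, -1j], [1, 1j]]) / np.sqrt(2)  # rotates the Y basis to Z
BASES = {"X": H, "Y": HSdg, "Z": I2}

def shadow_estimate(rho, n_snapshots=5000, seed=0):
    """Average single-qubit shadow snapshots 3 U^dag |b><b| U - I into an
    unbiased estimate of rho (Huang, Kueng & Preskill)."""
    rng = np.random.default_rng(seed)
    est = np.zeros((2, 2), dtype=complex)
    for _ in range(n_snapshots):
        U = BASES[rng.choice(list(BASES))]
        p = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, 1)  # outcome probs
        b = rng.choice(2, p=p / p.sum())
        ket = np.eye(2)[b]
        est += 3 * U.conj().T @ np.outer(ket, ket) @ U - I2
    return est / n_snapshots

# Example: reconstruct |+><+| from measurement statistics alone.
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
print(np.round(shadow_estimate(plus), 2))
```

Averaging the snapshots reproduces rho up to statistical noise; a network can instead consume the raw snapshots as training data.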
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Deep learning of many-body observables and quantum information scrambling [0.0]
We explore how the capacity of data-driven deep neural networks in learning the dynamics of physical observables is correlated with the scrambling of quantum information.
We train a neural network to find a mapping from the parameters of a model to the evolution of observables in random quantum circuits.
We show that a particular type of recurrent neural network is extremely powerful in generalizing its predictions within the system size and time window that it has been trained on, in both the localized and scrambled regimes.
arXiv Detail & Related papers (2023-02-09T13:14:10Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
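The single-qubit data re-uploading scheme described above alternates data-encoding rotations with trainable ones, so that even one qubit can realise a nonlinear classifier. A minimal sketch using standard Qiskit calls; the function name, layer structure, and angle values are illustrative:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Pauli, Statevector

def reuploading_circuit(x, thetas):
    """Single-qubit data re-uploading: the input x is encoded repeatedly,
    interleaved with trainable rotation blocks (one (a, b, c) triple per layer)."""
    qc = QuantumCircuit(1)
    for a, b, c in thetas:
        qc.ry(x, 0)                                # re-upload the data point
        qc.rz(a, 0); qc.ry(b, 0); qc.rz(c, 0)      # trainable block
    return qc

# Model output: <Z> of the final state for a scalar input and two layers.
qc = reuploading_circuit(0.3, [(0.1, 0.2, 0.3), (0.4, 0.5, 0.6)])
z_exp = Statevector(qc).expectation_value(Pauli("Z")).real
print(z_exp)
```

The expectation value of Z on the final state serves as the model output; a classical optimiser then tunes the (a, b, c) triples against a loss on labelled data.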
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Analysis of Neural Network Predictions for Entanglement Self-Catalysis [0.0]
We investigate whether distinct models of neural networks can learn to detect self-catalysis of entanglement.
We also study whether a trained machine can detect another related phenomenon.
arXiv Detail & Related papers (2021-12-29T14:18:45Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- Quantum neural networks with deep residual learning [29.929891641757273]
In this paper, a novel quantum neural network with deep residual learning (ResQNN) is proposed.
Our ResQNN is able to learn an unknown unitary and achieves remarkable performance.
arXiv Detail & Related papers (2020-12-14T18:11:07Z)
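Learning an unknown unitary, the task ResQNN targets, can be illustrated by a toy gradient-descent fit of a parameterised single-qubit gate to a fixed target in plain PyTorch; this is not the paper's residual architecture, and every name and hyperparameter below is an illustrative assumption:

```python
import torch

def u3(theta):
    """Rz(a) @ Ry(b) @ Rz(c) as a 2x2 complex matrix; a toy one-qubit ansatz."""
    a, b, c = theta
    rz = lambda t: torch.diag(torch.stack([torch.exp(-0.5j * t), torch.exp(0.5j * t)]))
    ry = torch.stack([
        torch.stack([torch.cos(b / 2), -torch.sin(b / 2)]),
        torch.stack([torch.sin(b / 2), torch.cos(b / 2)]),
    ]).to(torch.cfloat)
    return rz(a) @ ry @ rz(c)

target = u3(torch.tensor([0.7, 1.1, -0.4]))   # stands in for the unknown unitary
theta = torch.zeros(3, requires_grad=True)
opt = torch.optim.Adam([theta], lr=0.1)
for step in range(300):
    opt.zero_grad()
    # Gate infidelity 1 - |tr(V^dag U)|/2 vanishes iff U matches V up to phase.
    loss = 1 - torch.abs(torch.trace(target.conj().T @ u3(theta))) / 2
    loss.backward()
    opt.step()
print(f"final infidelity: {loss.item():.4f}")
```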
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)