Realization of a quantum neural network using repeat-until-success
circuits in a superconducting quantum processor
- URL: http://arxiv.org/abs/2212.10742v1
- Date: Wed, 21 Dec 2022 03:26:32 GMT
- Title: Realization of a quantum neural network using repeat-until-success
circuits in a superconducting quantum processor
- Authors: M. S. Moreira, G. G. Guerreschi, W. Vlothuizen, J. F. Marques, J. van
Straten, S. P. Premaratne, X. Zou, H. Ali, N. Muthusubramanian, C.
Zachariadis, J. van Someren, M. Beekman, N. Haider, A. Bruno, C. G.
Almudever, A. Y. Matsuura, and L. DiCarlo
- Abstract summary: In this paper, we use repeat-until-success circuits enabled by real-time control-flow feedback to realize quantum neurons with non-linear activation functions.
As an example, we construct a minimal feedforward quantum neural network capable of learning all 2-to-1-bit Boolean functions.
This model is shown to perform non-linear classification and effectively learns from multiple copies of a single training state consisting of the maximal superposition of all inputs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Artificial neural networks are becoming an integral part of digital solutions
to complex problems. However, employing neural networks on quantum processors
faces challenges related to the implementation of non-linear functions using
quantum circuits. In this paper, we use repeat-until-success circuits enabled
by real-time control-flow feedback to realize quantum neurons with non-linear
activation functions. These neurons constitute elementary building blocks that
can be arranged in a variety of layouts to carry out deep learning tasks
quantum coherently. As an example, we construct a minimal feedforward quantum
neural network capable of learning all 2-to-1-bit Boolean functions by
optimization of network activation parameters within the supervised-learning
paradigm. This model is shown to perform non-linear classification and
effectively learns from multiple copies of a single training state consisting
of the maximal superposition of all inputs.
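To make the central idea concrete, the following minimal NumPy sketch simulates a repeat-until-success (RUS) quantum-neuron gadget of the kind introduced by Cao, Guerreschi, and Aspuru-Guzik, which circuits like the ones realized in this work build on. The specific gate sequence, the recovery rotation, and the resulting activation 2·arctan(tan²θ) are conventions of this sketch, not details taken from the paper: an ancilla is measured each round, and classical feedback decides whether to accept the output qubit or apply a recovery rotation and retry.
```python
# Minimal NumPy sketch of a repeat-until-success (RUS) quantum neuron of the
# kind introduced by Cao, Guerreschi and Aspuru-Guzik.  Gate conventions, the
# recovery rotation and the random seed are choices of this sketch, not taken
# from the paper.
import numpy as np

rng = np.random.default_rng(7)

def ry(phi):
    """Single-qubit rotation R_y(phi)."""
    c, s = np.cos(phi / 2), np.sin(phi / 2)
    return np.array([[c, -s], [s, c]])

def controlled(u):
    """Two-qubit controlled-U; control is the first (ancilla) qubit."""
    cu = np.eye(4, dtype=complex)
    cu[2:, 2:] = u
    return cu

def rus_neuron(theta, max_rounds=100):
    """Repeat the RUS round until the ancilla measurement signals success.

    On success the output qubit ends up (up to a global phase) in
    R_y(2*arctan(tan(theta)**2)) |0>, a non-linear, sigmoid-like activation
    of the input rotation angle theta.
    """
    out = np.array([1.0, 0.0], dtype=complex)           # output qubit in |0>
    for _ in range(max_rounds):
        anc = np.array([1.0, 0.0], dtype=complex)       # fresh ancilla in |0>
        state = np.kron(anc, out)                        # order: |ancilla, output>
        state = np.kron(ry(2 * theta), np.eye(2)) @ state
        state = controlled(ry(np.pi)) @ state            # ancilla-controlled R_y(pi)
        state = np.kron(ry(-2 * theta), np.eye(2)) @ state
        p_success = np.linalg.norm(state[:2]) ** 2       # prob. ancilla reads 0
        if rng.random() < p_success:                     # success: keep the output
            return state[:2] / np.linalg.norm(state[:2])
        out = state[2:] / np.linalg.norm(state[2:])      # failure branch
        out = ry(np.pi / 2) @ out                        # recovery rotation, retry
    raise RuntimeError("no success within max_rounds")

theta = 0.4 * np.pi
out = rus_neuron(theta)
angle = 2 * np.arctan2(np.abs(out[1]), np.abs(out[0]))   # realized rotation angle
print(f"realized angle {angle:.4f}  vs  2*arctan(tan^2) {2*np.arctan(np.tan(theta)**2):.4f}")
```
In the construction of Cao et al., composing the gadget with itself steepens this nonlinearity toward a threshold at θ = π/4, which is the sense in which networks of such neurons can realize Boolean functions.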
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Enhancing the expressivity of quantum neural networks with residual
connections [0.0]
We propose a quantum circuit-based algorithm to implement quantum residual neural networks (QResNets)
Our work lays the foundation for a complete quantum implementation of the classical residual neural networks.
arXiv Detail & Related papers (2024-01-29T04:00:51Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Information-driven Nonlinear Quantum Neuron [0.0]
In this study, a hardware-efficient quantum neural network operating as an open quantum system is proposed.
We show that this dissipative model based on repeated interactions, which allows for easy parametrization of input quantum information, exhibits differentiable, non-linear activation functions.
arXiv Detail & Related papers (2023-07-18T07:12:08Z) - Parametrized constant-depth quantum neuron [56.51261027148046]
We propose a framework that builds quantum neurons based on kernel machines.
We present here a neuron that applies a tensor-product feature mapping to an exponentially larger space.
It turns out that parametrization allows the proposed neuron to optimally fit underlying patterns that the existing neuron cannot fit.
arXiv Detail & Related papers (2022-02-25T04:57:41Z) - Completely Quantum Neural Networks [0.0]
We describe how to embed and train a general neural network in a quantum annealer.
We develop three crucial ingredients: an encoding of the free parameters of the network, an approximation of the activation function, and the reduction of higher-order binary polynomials into quadratic ones (a worked quadratization sketch appears after this list).
We implement this for an elementary network and illustrate the advantages of quantum training: its consistency in finding the global minimum of the loss function and the fact that the network training converges in a single step.
arXiv Detail & Related papers (2022-02-23T19:00:03Z) - Quantum activation functions for quantum neural networks [0.0]
We show how to approximate any analytic function to any required accuracy without the need to measure the states encoding the information.
Our results recast the science of artificial neural networks in the architecture of gate-model quantum computers.
arXiv Detail & Related papers (2022-01-10T23:55:49Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z) - The Hintons in your Neural Network: a Quantum Field Theory View of Deep
Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z) - Variational learning for quantum artificial neural networks [0.0]
We first review a series of recent works describing the implementation of artificial neurons and feed-forward neural networks on quantum processors.
We then present an original realization of efficient individual quantum nodes based on variational unsampling protocols.
While keeping full compatibility with the overall memory-efficient feed-forward architecture, our constructions effectively reduce the quantum circuit depth required to determine the activation probability of single neurons.
arXiv Detail & Related papers (2021-03-03T16:10:15Z)
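The quadratization referenced in the "Completely Quantum Neural Networks" entry above is what keeps a network's training objective in the quadratic (QUBO) form a quantum annealer accepts. The sketch below is not taken from that paper; it only illustrates the standard Rosenberg-style reduction of a single cubic binary term x*y*z using one auxiliary bit w and a penalty weight P chosen here, verified by brute force.
```python
# Standalone illustration of the quadratization step named in the annealer
# entry above: a cubic binary term x*y*z is replaced by w*z plus a
# Rosenberg-style penalty forcing the auxiliary bit w to equal x*y at the
# minimum.  The penalty weight P and the brute-force check are choices of
# this sketch, not details of that paper.
from itertools import product

P = 2  # any P > 1 makes the substitution exact at the minimum

def cubic(x, y, z):
    return x * y * z

def quadratic_with_aux(x, y, z, w):
    # Penalty is >= 0 on binary inputs and equals 0 exactly when w == x*y.
    penalty = x * y - 2 * x * w - 2 * y * w + 3 * w
    return w * z + P * penalty

# Minimizing over the auxiliary bit w reproduces x*y*z for every assignment.
for x, y, z in product((0, 1), repeat=3):
    reduced = min(quadratic_with_aux(x, y, z, w) for w in (0, 1))
    assert reduced == cubic(x, y, z), (x, y, z)
print("quadratized form matches x*y*z on all 8 binary assignments")
```
Each substitution of a product pair by an auxiliary bit lowers the polynomial degree by one at the cost of one extra binary variable, so any higher-order binary objective can be flattened to quadratic form.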