Training embedding quantum kernels with data re-uploading quantum neural
networks
- URL: http://arxiv.org/abs/2401.04642v1
- Date: Tue, 9 Jan 2024 16:08:32 GMT
- Title: Training embedding quantum kernels with data re-uploading quantum neural
networks
- Authors: Pablo Rodriguez-Grasa, Yue Ban, Mikel Sanz
- Abstract summary: Kernel methods play a crucial role in machine learning, and Embedding Quantum Kernels (EQKs) have shown very promising performance.
We propose a $p$-qubit Quantum Neural Network (QNN) based on data re-uploading to identify the optimal $q$-qubit EQK for a task.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kernel methods play a crucial role in machine learning and the Embedding
Quantum Kernels (EQKs), an extension to quantum systems, have shown very
promising performance. However, choosing the right embedding for EQKs is
challenging. We address this by proposing a $p$-qubit Quantum Neural Network
(QNN) based on data re-uploading to identify the optimal $q$-qubit EQK for a
task ($p$-to-$q$). This method requires constructing the kernel matrix only
once, offering improved efficiency. In particular, we focus on two cases:
$n$-to-$n$, where we propose a scalable approach to train an $n$-qubit QNN, and
$1$-to-$n$, demonstrating that the training of a single-qubit QNN can be
leveraged to construct powerful EQKs.
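As a concrete illustration of the $1$-to-$n$ idea, the sketch below builds a fidelity-type embedding quantum kernel, $k(x, x') = |\langle\phi(x')|\phi(x)\rangle|^2$, from a single-qubit data re-uploading circuit in plain numpy. The layer structure, parameter names, and random initialisation are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rz(t):
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]], dtype=complex)

def reupload_state(x, thetas, weights):
    """Apply L data re-uploading layers to |0>: each layer is
    Rz(theta_z) Ry(theta_y + w * x), interleaving data and trainable angles."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for (ty, tz), w in zip(thetas, weights):
        psi = rz(tz) @ (ry(ty + w * x) @ psi)
    return psi

def eqk(x1, x2, thetas, weights):
    """Fidelity embedding kernel k(x, x') = |<phi(x')|phi(x)>|^2."""
    a = reupload_state(x1, thetas, weights)
    b = reupload_state(x2, thetas, weights)
    return abs(np.vdot(b, a)) ** 2

rng = np.random.default_rng(0)
L = 3
thetas = rng.uniform(0, 2 * np.pi, size=(L, 2))
weights = rng.uniform(-1, 1, size=L)

xs = rng.uniform(-1, 1, size=5)
K = np.array([[eqk(xi, xj, thetas, weights) for xj in xs] for xi in xs])
print(np.round(K, 3))  # symmetric, with ones on the diagonal
```

In the paper's scheme the angles would come from a trained QNN rather than random initialisation; the point of the sketch is that, once the circuit parameters are fixed, the kernel matrix only needs to be constructed once.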
Related papers
- Satellite image classification with neural quantum kernels [0.0699049312989311]
We use quantum kernels to classify images that contain solar panels.
We iteratively train an $n$-qubit QNN to ensure scalability, using the resulting architecture to directly form an $n$-qubit EQK.
Results are robust against suboptimal training of the QNN.
arXiv Detail & Related papers (2024-09-30T14:52:00Z)
- Projected Stochastic Gradient Descent with Quantum Annealed Binary Gradients
We present QP-SBGD, a novel layer-wise optimiser tailored towards training neural networks with binary weights.
BNNs reduce the computational requirements and energy consumption of deep learning models with minimal loss in accuracy.
Our algorithm is implemented layer-wise, making it suitable to train larger networks on resource-limited quantum hardware.
arXiv Detail & Related papers (2023-10-23T17:32:38Z)
- Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) with a hope to utilize quantum advantage to speed up IL.
We develop two QIL algorithms, quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z)
- Accelerating the training of single-layer binary neural networks using the HHL quantum algorithm [58.720142291102135]
This paper shows that useful information can be extracted from the quantum-mechanical implementation of the Harrow-Hassidim-Lloyd (HHL) algorithm and used to reduce the complexity of finding the solution on the classical side.
arXiv Detail & Related papers (2022-10-23T11:58:05Z)
- On-chip QNN: Towards Efficient On-Chip Training of Quantum Neural Networks [21.833693982056896]
We present On-chip QNN, the first experimental demonstration of practical on-chip QNN training with parameter shift.
We propose probabilistic gradient pruning, which first identifies gradients with potentially large errors and then removes them.
The results demonstrate that our on-chip training achieves over 90% and 60% accuracy on 2-class and 4-class image classification tasks, respectively.
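The parameter-shift rule behind this kind of on-chip training can be sketched in a few lines: for gates generated by Pauli operators, the exact gradient of an expectation value is obtained from two shifted circuit evaluations. The single-qubit observable below is a toy example, not the paper's circuit:

```python
import numpy as np

def expval_z(theta):
    """<Z> after Ry(theta)|0>; analytically equal to cos(theta)."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi[0] ** 2 - psi[1] ** 2

def parameter_shift_grad(f, theta):
    """Exact gradient for Pauli-generated gates:
    df/dtheta = (f(theta + pi/2) - f(theta - pi/2)) / 2."""
    s = np.pi / 2
    return (f(theta + s) - f(theta - s)) / 2.0

theta = 0.7
g = parameter_shift_grad(expval_z, theta)
print(g, -np.sin(theta))  # the two values agree
```

Because each gradient entry comes from finitely many circuit evaluations, on hardware it carries shot noise, which is what motivates pruning gradient components whose estimates are dominated by error.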
arXiv Detail & Related papers (2022-02-26T22:27:36Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- A Quantum Convolutional Neural Network on NISQ Devices [0.9831489366502298]
We propose a quantum convolutional neural network inspired by classical convolutional neural networks.
Our model is robust to certain noise for image recognition tasks.
It opens up the prospect of exploiting quantum power to process information in the era of big data.
arXiv Detail & Related papers (2021-04-14T15:07:03Z)
- Hybrid quantum-classical classifier based on tensor network and variational quantum circuit [0.0]
We introduce a hybrid model combining the quantum-inspired tensor networks (TN) and the variational quantum circuits (VQC) to perform supervised learning tasks.
We show that a matrix product state (MPS) based TN with low bond dimensions performs better than PCA as a feature extractor to compress data for the input of VQCs in binary classification on the MNIST dataset.
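The MPS compression step can be sketched with plain numpy: a length-$2^n$ vector is decomposed site by site via sequential SVDs, truncating each bond to at most `chi` singular values. This is a generic illustration of bond-dimension truncation, not the paper's architecture:

```python
import numpy as np

def mps_compress(vec, n_sites, chi):
    """Decompose a length-2**n vector into an MPS by sequential SVDs,
    capping each bond dimension at chi (discarding small singular values)."""
    tensors = []
    rest = vec.reshape(1, -1)              # (left bond, remaining sites)
    for _ in range(n_sites - 1):
        bond, rem = rest.shape
        m = rest.reshape(bond * 2, rem // 2)
        u, s, vt = np.linalg.svd(m, full_matrices=False)
        k = min(chi, len(s))               # truncate the bond
        tensors.append(u[:, :k].reshape(bond, 2, k))
        rest = s[:k, None] * vt[:k]
    tensors.append(rest.reshape(rest.shape[0], 2, 1))
    return tensors

def mps_to_vec(tensors):
    """Contract the MPS back into a dense vector."""
    out = tensors[0]
    for t in tensors[1:]:
        out = np.tensordot(out, t, axes=([-1], [0]))
    return out.reshape(-1)

rng = np.random.default_rng(1)
n = 8
v = rng.normal(size=2 ** n)
v /= np.linalg.norm(v)
approx = mps_to_vec(mps_compress(v, n, chi=8))
print(np.linalg.norm(v - approx))  # reconstruction error due to truncation
```

With a low `chi` the MPS stores far fewer parameters than the dense vector, which is what makes it attractive as a feature extractor feeding a small VQC.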
arXiv Detail & Related papers (2020-11-30T09:43:59Z)
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We study QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.