The role of data embedding in equivariant quantum convolutional neural networks
- URL: http://arxiv.org/abs/2312.13250v2
- Date: Sat, 27 Jan 2024 23:15:21 GMT
- Title: The role of data embedding in equivariant quantum convolutional neural networks
- Authors: Sreetama Das, Stefano Martina, Filippo Caruso
- Abstract summary: We investigate the role of classical-to-quantum embedding on the performance of equivariant quantum neural networks (EQNNs).
We numerically compare the classification accuracy of EQCNNs with three different basis-permuted amplitude embeddings to the one obtained from a non-equivariant quantum convolutional neural network (QCNN).
- Score: 2.255961793913651
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Geometric deep learning refers to the scenario in which the symmetries of a dataset are used to constrain the parameter space of a neural network and thus improve its trainability and generalization. Recently, this idea has been incorporated into the field of quantum machine learning, which has given rise to equivariant quantum neural networks (EQNNs). In this work, we investigate
the role of classical-to-quantum embedding on the performance of equivariant
quantum convolutional neural networks (EQCNNs) for the classification of
images. We discuss the connection between the data embedding method and the
resulting representation of a symmetry group and analyze how changing the representation affects the expressibility of an EQCNN. We numerically compare
the classification accuracy of EQCNNs with three different basis-permuted
amplitude embeddings to the one obtained from a non-equivariant quantum
convolutional neural network (QCNN). Our results show a clear dependence of
classification accuracy on the underlying embedding, especially for initial
training iterations. The improvement in classification accuracy of EQCNN over
non-equivariant QCNN may be present or absent depending on the particular
embedding and dataset used. It is expected that the results of this work can be
useful to the community for a better understanding of the importance of data
embedding choice in the context of geometric quantum machine learning.
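To make the role of the embedding concrete, here is a minimal sketch of an amplitude embedding with a permuted computational-basis ordering. PennyLane is an assumed choice for illustration, not necessarily the authors' toolchain, and the left-right-flip permutation `flip_permutation` is a hypothetical stand-in for the three basis permutations studied in the paper.

```python
# Minimal, illustrative sketch (not the authors' code) of a basis-permuted
# amplitude embedding. The permutation mirrors a 4x4 toy image left-right,
# so this spatial symmetry acts on the state as a fixed relabeling of
# computational-basis amplitudes.
import numpy as np
import pennylane as qml

n_qubits = 4                                  # 2^4 = 16 amplitudes -> 4x4 toy image
dev = qml.device("default.qubit", wires=n_qubits)

def flip_permutation(side):
    """Index permutation that mirrors a side x side image left-right."""
    idx = np.arange(side * side).reshape(side, side)
    return idx[:, ::-1].reshape(-1)

perm = flip_permutation(4)                    # hypothetical choice of basis permutation

@qml.qnode(dev)
def embed(image, permute=False):
    """Amplitude-embed a flattened image, optionally permuting the basis order first."""
    amps = np.asarray(image, dtype=float).reshape(-1)
    if permute:
        amps = amps[perm]                     # basis-permuted variant of the embedding
    qml.AmplitudeEmbedding(amps, wires=range(n_qubits), normalize=True)
    return qml.state()

image = np.random.rand(4, 4)

# The permuted embedding of an image coincides with the plain embedding of the
# mirrored image: the symmetry is realized at the level of the data encoding.
assert np.allclose(embed(image, permute=True), embed(image[:, ::-1]))
```

Because the permutation acts on the classical amplitude vector, mirroring the image is equivalent to a fixed relabeling of basis states, which is the kind of symmetry representation an equivariant ansatz can then exploit.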
Related papers
- Resource-efficient equivariant quantum convolutional neural networks [0.0]
This study proposes a resource-efficient model of equivariant quantum convolutional neural networks (QCNNs) called the equivariant split-parallelizing QCNN (sp-QCNN).
Using a group-theoretical approach, we encode general symmetries into our model beyond the translational symmetry addressed by previous sp-QCNNs.
Our results contribute to the advancement of practical quantum machine learning algorithms.
arXiv Detail & Related papers (2024-10-02T05:51:33Z) - Parallel Proportional Fusion of Spiking Quantum Neural Network for Optimizing Image Classification [10.069224006497162]
We introduce a novel architecture termed Parallel Proportional Fusion of Quantum and Spiking Neural Networks (PPF-QSNN).
The proposed PPF-QSNN outperforms both the existing spiking neural network and the serial quantum neural network across metrics such as accuracy, loss, and robustness.
This study lays the groundwork for the advancement and application of quantum advantage in artificial intelligence computations.
arXiv Detail & Related papers (2024-04-01T10:35:35Z) - SO(2) and O(2) Equivariance in Image Recognition with
Bessel-Convolutional Neural Networks [63.24965775030674]
This work presents the development of Bessel-convolutional neural networks (B-CNNs).
B-CNNs exploit a particular decomposition based on Bessel functions to modify the key operation between images and filters.
A study is carried out to assess the performance of B-CNNs compared to other methods.
arXiv Detail & Related papers (2023-04-18T18:06:35Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - Problem-Dependent Power of Quantum Neural Networks on Multi-Class
Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of QNNs on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK (a minimal sketch of the re-uploading idea is given below).
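As a rough illustration of single-qubit data re-uploading, the following Qiskit sketch is my own assumption, not the paper's code; the layer count, the U-gate angle map, and the use of the |0> probability as a class score are all illustrative choices.

```python
# Minimal single-qubit data re-uploading sketch (illustrative only).
# The same feature vector x is encoded in every layer, interleaved with
# trainable rotations, so one qubit can express a non-trivial classifier.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def reuploading_circuit(x, thetas):
    """x: 3 features; thetas: array of shape (layers, 3) with trainable angles."""
    qc = QuantumCircuit(1)
    for layer in thetas:
        qc.u(x[0], x[1], x[2], 0)              # data-encoding rotation (re-uploaded each layer)
        qc.u(layer[0], layer[1], layer[2], 0)  # trainable rotation
    return qc

x = np.array([0.3, 1.1, -0.7])                 # toy data point
thetas = np.random.uniform(-np.pi, np.pi, size=(4, 3))
state = Statevector.from_instruction(reuploading_circuit(x, thetas))
score = np.abs(state.data[0]) ** 2             # probability of |0> used as the class score
```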
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Theory for Equivariant Quantum Neural Networks [0.0]
We present a theoretical framework to design equivariant quantum neural networks (EQNNs) for essentially any relevant symmetry group.
Our framework can be readily applied to virtually all areas of quantum machine learning.
arXiv Detail & Related papers (2022-10-16T15:42:21Z) - Interrelation of equivariant Gaussian processes and convolutional neural
networks [77.34726150561087]
Currently, there exists a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work, we establish a relationship between the many-channel limit for CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes.
arXiv Detail & Related papers (2022-09-17T17:02:35Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Branching Quantum Convolutional Neural Networks [0.0]
Small-scale quantum computers are already showing potential gains in learning tasks on large quantum and very large classical data sets.
We present a generalization of QCNN, the branching quantum convolutional neural network, or bQCNN, with substantially higher expressibility.
arXiv Detail & Related papers (2020-12-28T19:00:03Z) - Recurrent Quantum Neural Networks [7.6146285961466]
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning.
We construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks.
We evaluate the QRNN on MNIST classification, both by feeding the QRNN each image pixel by pixel and by utilising modern data augmentation as a preprocessing step.
arXiv Detail & Related papers (2020-06-25T17:59:44Z)