Theoretical Guarantees for Permutation-Equivariant Quantum Neural
Networks
- URL: http://arxiv.org/abs/2210.09974v3
- Date: Wed, 14 Feb 2024 00:03:32 GMT
- Title: Theoretical Guarantees for Permutation-Equivariant Quantum Neural
Networks
- Authors: Louis Schatzki, Martin Larocca, Quynh T. Nguyen, Frederic Sauvage, M.
Cerezo
- Abstract summary: We show how to build permutation-equivariant quantum neural networks (QNNs).
We prove that they do not suffer from barren plateaus, quickly reach overparametrization, and generalize well from small amounts of data.
Our work provides the first theoretical guarantees for equivariant QNNs, thus indicating the extreme power and potential of GQML.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the great promise of quantum machine learning models, there are
several challenges one must overcome before unlocking their full potential. For
instance, models based on quantum neural networks (QNNs) can suffer from
excessive local minima and barren plateaus in their training landscapes.
Recently, the nascent field of geometric quantum machine learning (GQML) has
emerged as a potential solution to some of those issues. The key insight of
GQML is that one should design architectures, such as equivariant QNNs,
encoding the symmetries of the problem at hand. Here, we focus on problems with
permutation symmetry (i.e., the symmetric group $S_n$), and show how to build
$S_n$-equivariant QNNs. We provide an analytical study of their performance,
proving that they do not suffer from barren plateaus, quickly reach
overparametrization, and generalize well from small amounts of data. To verify
our results, we perform numerical simulations for a graph state classification
task. Our work provides the first theoretical guarantees for equivariant QNNs,
thus indicating the extreme power and potential of GQML.
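To make the construction concrete, below is a minimal NumPy/SciPy sketch (not taken from the paper; the four-qubit size, the angles, and the choice of permutation-invariant generators $\sum_i X_i$ and $\sum_{i<j} Z_i Z_j$ are illustrative assumptions) verifying the defining property of an $S_n$-equivariant layer: a unitary generated by permutation-invariant operators commutes with the representation of every qubit permutation.

```python
import numpy as np
from itertools import combinations
from scipy.linalg import expm

# Single-qubit Pauli matrices
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(paulis, n):
    """Tensor product over n qubits, placing paulis[q] on qubit q and identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, paulis.get(q, I2))
    return out

n = 4  # illustrative system size (an assumption, not taken from the paper)

# Permutation-invariant generators: sum_i X_i and sum_{i<j} Z_i Z_j
G1 = sum(op_on({i: X}, n) for i in range(n))
G2 = sum(op_on({i: Z, j: Z}, n) for i, j in combinations(range(n), 2))

def layer(theta, phi):
    """One candidate S_n-equivariant layer: exp(-i*theta*G1) exp(-i*phi*G2)."""
    return expm(-1j * theta * G1) @ expm(-1j * phi * G2)

def qubit_permutation(perm, n):
    """Permutation matrix sending the state of qubit q to qubit perm[q]."""
    dim = 2 ** n
    P = np.zeros((dim, dim))
    for b in range(dim):
        bits = [(b >> (n - 1 - q)) & 1 for q in range(n)]
        new_bits = [0] * n
        for q in range(n):
            new_bits[perm[q]] = bits[q]
        b_new = sum(bit << (n - 1 - q) for q, bit in enumerate(new_bits))
        P[b_new, b] = 1.0
    return P

# Equivariance check: the layer commutes with the representation of any qubit permutation
U = layer(0.37, 0.91)
for perm in ([1, 0, 2, 3], [1, 2, 3, 0]):  # a transposition and a 4-cycle
    P = qubit_permutation(perm, n)
    assert np.allclose(P @ U, U @ P)
print("S_n-equivariance check passed")
```

Because the generators are symmetric sums over qubits, any stack of such layers, with any parameter values, inherits the same commutation property; this is the sense in which the circuit encodes the $S_n$ symmetry.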
Related papers
- Permutation-equivariant quantum convolutional neural networks [1.7034813545878589]
We propose architectures of equivariant quantum convolutional neural networks (EQCNNs) that are equivariant under $S_n$ and its subgroups.
For subgroups of $S_n$, our numerical results on the MNIST dataset show better classification accuracy than non-equivariant QCNNs.
arXiv Detail & Related papers (2024-04-28T14:34:28Z)
- Projected Stochastic Gradient Descent with Quantum Annealed Binary Gradients [51.82488018573326]
We present QP-SBGD, a novel layer-wise optimiser tailored towards training neural networks with binary weights.
Binary neural networks (BNNs) reduce the computational requirements and energy consumption of deep learning models with minimal loss in accuracy.
Our algorithm is implemented layer-wise, making it suitable for training larger networks on resource-limited quantum hardware (a toy sketch of binary-weight projected SGD follows this entry).
arXiv Detail & Related papers (2023-10-23T17:32:38Z)
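For orientation only, here is a minimal classical sketch of layer-wise projected SGD with binary weights in the spirit of the entry above: latent real-valued weights are updated with gradients evaluated at their {-1, +1} projection, and layers are updated alternately. The sign-based projection, the toy data, and the network sizes are all illustrative assumptions standing in for the quantum-annealed binary update; this is not QP-SBGD itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (purely illustrative; nothing here comes from the paper)
X = rng.normal(size=(128, 8))
true_w = rng.choice([-1.0, 1.0], size=(8, 1))
y = np.tanh(X @ true_w)

def binarize(W):
    """Project real-valued weights onto {-1, +1} (classical stand-in for the annealed binary update)."""
    return np.where(W >= 0, 1.0, -1.0)

# Latent real-valued weights; the forward pass always uses their binary projection
W1 = rng.normal(scale=0.1, size=(8, 16))
W2 = rng.normal(scale=0.1, size=(16, 1))
lr = 0.05

for step in range(200):
    B1, B2 = binarize(W1), binarize(W2)
    # Forward pass with binary weights
    h = np.tanh(X @ B1)
    pred = h @ B2
    err = pred - y
    loss = np.mean(err ** 2)

    # Manual gradients of the mean-squared error w.r.t. the binary weights
    g_out = 2 * err / len(X)
    g2 = h.T @ g_out
    g1 = X.T @ ((g_out @ B2.T) * (1 - h ** 2))

    # Layer-wise update: alternate which layer's latent weights move on this step
    if step % 2 == 0:
        W1 -= lr * g1
    else:
        W2 -= lr * g2

print("final loss with binary weights:", loss)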
- Approximately Equivariant Quantum Neural Network for $p4m$ Group Symmetries in Images [30.01160824817612]
This work proposes equivariant Quantum Convolutional Neural Networks (EquivQCNNs) for image classification under planar $p4m$ symmetry.
We present the results tested in different use cases, such as phase detection of the 2D Ising model and classification of the extended MNIST dataset.
arXiv Detail & Related papers (2023-10-03T18:01:02Z)
- QNEAT: Natural Evolution of Variational Quantum Circuit Architecture [95.29334926638462]
We focus on variational quantum circuits (VQC), which emerged as the most promising candidates for the quantum counterpart of neural networks.
Although they show promising results, VQCs can be hard to train because of several issues, e.g., barren plateaus, periodicity of the weights, or the choice of architecture.
We propose a gradient-free algorithm inspired by natural evolution to optimize both the weights and the architecture of the VQC (a minimal sketch of the weight-optimization step follows this entry).
arXiv Detail & Related papers (2023-04-14T08:03:20Z)
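As a toy illustration of the gradient-free idea in the QNEAT entry above, the sketch below runs a simple (1+lambda) evolution strategy over the rotation angles of a small, fixed variational circuit simulated with dense matrices. The architecture mutations that QNEAT also performs are omitted, and the Ising cost Hamiltonian, the ansatz, and the hyperparameters are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import expm

# Single-qubit Pauli matrices
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(paulis, n):
    """Tensor product over n qubits, placing paulis[q] on qubit q and identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, paulis.get(q, I2))
    return out

n = 3                                                      # illustrative circuit size
H = sum(op_on({i: Z, i + 1: Z}, n) for i in range(n - 1))  # toy Ising cost Hamiltonian
gens = [op_on({i: X}, n) for i in range(n)] + [op_on({i: Z}, n) for i in range(n)]
psi0 = np.zeros(2 ** n, dtype=complex)
psi0[0] = 1.0

def cost(params):
    """<psi(params)|H|psi(params)> for a fixed single-layer rotation ansatz."""
    psi = psi0
    for theta, G in zip(params, gens):
        psi = expm(-1j * theta * G) @ psi
    return np.real(np.vdot(psi, H @ psi))

# (1 + lambda) evolution strategy: mutate the angles, keep the best candidate
rng = np.random.default_rng(1)
parent = rng.uniform(-np.pi, np.pi, size=len(gens))
best = cost(parent)
for it in range(100):
    children = parent + rng.normal(scale=0.2, size=(8, len(gens)))  # lambda = 8 offspring
    scores = [cost(c) for c in children]
    i = int(np.argmin(scores))
    if scores[i] < best:
        parent, best = children[i], scores[i]
print("best cost found:", best)
```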
- Towards Neural Variational Monte Carlo That Scales Linearly with System Size [67.09349921751341]
Quantum many-body problems are central to demystifying exotic quantum phenomena such as high-temperature superconductivity.
The combination of neural networks (NN) for representing quantum states, and the Variational Monte Carlo (VMC) algorithm, has been shown to be a promising method for solving such problems.
We propose a NN architecture called Vector-Quantized Neural Quantum States (VQ-NQS) that utilizes vector-quantization techniques to leverage redundancies in the local-energy calculations of the VMC algorithm (a plain local-energy sketch follows this entry).
arXiv Detail & Related papers (2022-12-21T19:00:04Z)
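For context on the "local-energy calculations" mentioned above, here is a plain VMC sketch with no neural network and no vector quantization: a transverse-field Ising chain, a simple log-linear trial state, single-spin-flip Metropolis sampling, and an average of local energies. All parameters are arbitrary illustrative choices; the local-energy routine is the piece whose redundancies VQ-NQS is designed to exploit.

```python
import numpy as np

rng = np.random.default_rng(0)

L, J, h = 8, 1.0, 0.7                       # chain length and couplings (illustrative choices)
a = rng.normal(scale=0.1, size=L)           # parameters of a toy log-linear trial state
W = np.triu(rng.normal(scale=0.1, size=(L, L)), 1)

def log_psi(s):
    """Log-amplitude of the trial state: sum_i a_i s_i + sum_{i<j} W_ij s_i s_j."""
    return a @ s + s @ W @ s

def local_energy(s):
    """E_loc(s) = <s|H|psi>/<s|psi> for the transverse-field Ising chain."""
    diag = -J * np.sum(s[:-1] * s[1:])                  # diagonal ZZ part
    off = 0.0
    for i in range(L):                                  # one term per single spin flip
        s_flip = s.copy()
        s_flip[i] *= -1
        off += -h * np.exp(log_psi(s_flip) - log_psi(s))
    return diag + off

# Single-spin-flip Metropolis sampling of |psi(s)|^2, then average the local energies
s = rng.choice([-1, 1], size=L)
energies = []
for step in range(5000):
    i = rng.integers(L)
    s_new = s.copy()
    s_new[i] *= -1
    if rng.random() < np.exp(2 * (log_psi(s_new) - log_psi(s))):
        s = s_new
    if step > 1000 and step % 10 == 0:                  # crude burn-in and thinning
        energies.append(local_energy(s))

print("VMC energy estimate:", np.mean(energies))
```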
- Theory for Equivariant Quantum Neural Networks [0.0]
We present a theoretical framework to design equivariant quantum neural networks (EQNNs) for essentially any relevant symmetry group.
Our framework can be readily applied to virtually all areas of quantum machine learning.
arXiv Detail & Related papers (2022-10-16T15:42:21Z)
- GHN-Q: Parameter Prediction for Unseen Quantized Convolutional Architectures via Graph Hypernetworks [80.29667394618625]
We conduct the first-ever study exploring the use of graph hypernetworks for predicting parameters of unseen quantized CNN architectures.
We focus on a reduced CNN search space and find that GHN-Q can in fact predict quantization-robust parameters for various 8-bit quantized CNNs.
arXiv Detail & Related papers (2022-08-26T08:00:02Z)
- Theory of overparametrization in quantum neural networks [0.0]
We rigorously analyze the overparametrization phenomenon in Quantum Neural Networks (QNNs) with periodic structure.
Our results show that the dimension of the Lie algebra obtained from the generators of the QNN is an upper bound for $M_c$, the critical number of parameters at which overparametrization sets in (a numerical sketch follows this entry).
We then connect the notion of overparametrization to the QNN capacity, so that when a QNN is overparametrized, its capacity achieves its maximum possible value.
arXiv Detail & Related papers (2021-09-23T22:39:48Z)
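To make the bound quoted in the entry above concrete: the dynamical Lie algebra is the span of all nested commutators of the circuit generators, and its dimension can be found by a brute-force linear-independence sweep. The sketch below does this in NumPy for an assumed pair of permutation-invariant generators; the generators, the three-qubit size, and the tolerance are illustrative choices, not the setting analysed in the paper.

```python
import numpy as np
from itertools import combinations

# Single-qubit Pauli matrices
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(paulis, n):
    """Tensor product over n qubits, placing paulis[q] on qubit q and identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, paulis.get(q, I2))
    return out

def lie_closure_dim(generators, tol=1e-9):
    """Dimension of the dynamical Lie algebra generated by the given Hermitian operators."""
    basis = []  # orthonormal, vectorised elements found so far

    def add(A):
        v = A.reshape(-1).astype(complex)
        for b in basis:
            v = v - np.vdot(b, v) * b       # Gram-Schmidt against the current basis
        norm = np.linalg.norm(v)
        if norm > tol:
            basis.append(v / norm)
            return True
        return False

    elements = [g for g in generators if add(g)]
    grew = True
    while grew:                             # keep taking commutators until nothing new appears
        grew = False
        for A, B in combinations(list(elements), 2):
            C = A @ B - B @ A
            if add(C):
                elements.append(C)
                grew = True
    return len(basis)

n = 3  # illustrative system size
G1 = sum(op_on({i: X}, n) for i in range(n))
G2 = sum(op_on({i: Z, j: Z}, n) for i, j in combinations(range(n), 2))
print("dynamical Lie algebra dimension:", lie_closure_dim([G1, G2]))
```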
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve a quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We study QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)