Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach
- URL: http://arxiv.org/abs/2201.05158v3
- Date: Fri, 19 Jan 2024 16:26:46 GMT
- Title: Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach
- Authors: Xing Ai, Zhihong Zhang, Luzhe Sun, Junchi Yan, Edwin Hancock
- Abstract summary: We propose a novel hybrid quantum-classical algorithm for graph-structured data, which we refer to as the Ego-graph based Quantum Graph Neural Network (egoQGNN)
egoQGNN implements the GNN theoretical framework using the tensor product and unitary matrix representation, which greatly reduces the number of model parameters required.
The architecture is based on a novel mapping from real-world data to Hilbert space.
- Score: 47.19265172105025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum machine learning is a fast-emerging field that aims to tackle machine
learning using quantum algorithms and quantum computing. Due to the lack of
physical qubits and an effective means to map real-world data from Euclidean
space to Hilbert space, most of these methods focus on quantum analogies or
process simulations rather than devising concrete architectures based on
qubits. In this paper, we propose a novel hybrid quantum-classical algorithm
for graph-structured data, which we refer to as the Ego-graph based Quantum
Graph Neural Network (egoQGNN). egoQGNN implements the GNN theoretical
framework using the tensor product and unitary matrix representation, which
greatly reduces the number of model parameters required. When controlled by a
classical computer, egoQGNN can accommodate arbitrarily sized graphs by
processing ego-graphs from the input graph using a modestly-sized quantum
device. The architecture is based on a novel mapping from real-world data to
Hilbert space. This mapping maintains the distance relations present in the
data and reduces information loss. Experimental results show that the proposed
method outperforms competitive state-of-the-art models while using only 1.68%
of the parameters of those models.
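The ego-graph decomposition that lets egoQGNN handle arbitrarily sized graphs on a modestly sized quantum device can be illustrated with a short classical sketch. This is an illustrative helper, not the authors' code: it extracts the k-hop neighbourhood around a node, which is the subgraph a small device would then process.

```python
from collections import deque

def ego_graph(adj, center, hops=1):
    """Return the node set and edge set of the k-hop ego-graph around `center`.

    `adj` is a plain adjacency dict {node: set(neighbours)}.
    """
    # Breadth-first search out to `hops` steps from the centre node.
    seen = {center: 0}
    queue = deque([center])
    while queue:
        u = queue.popleft()
        if seen[u] == hops:
            continue
        for v in adj[u]:
            if v not in seen:
                seen[v] = seen[u] + 1
                queue.append(v)
    nodes = set(seen)
    # Keep only edges whose endpoints both lie inside the ego-graph.
    edges = {(u, v) for u in nodes for v in adj[u] if v in nodes and u < v}
    return nodes, edges

# Toy graph: a path 0-1-2-3 plus a branch 1-4.
adj = {0: {1}, 1: {0, 2, 4}, 2: {1, 3}, 3: {2}, 4: {1}}
nodes, edges = ego_graph(adj, center=1, hops=1)
print(sorted(nodes))   # [0, 1, 2, 4]
print(sorted(edges))   # [(0, 1), (1, 2), (1, 4)]
```

Each ego-graph is bounded in size by the neighbourhood radius, which is what allows a fixed-width quantum circuit to process graphs of arbitrary total size.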
Related papers
- Graph Neural Networks on Quantum Computers [3.8784640343151184]
Graph Neural Networks (GNNs) are powerful machine learning models that excel at analyzing structured data represented as graphs.
This paper proposes frameworks for implementing GNNs on quantum computers to potentially address the challenges.
arXiv Detail & Related papers (2024-05-27T11:31:08Z)
- Learnability of a hybrid quantum-classical neural network for graph-structured quantum data [0.0]
We build a hybrid quantum-classical neural network with deep residual learning (Res-HQCNN) with graph-structured quantum data.
We show that using information about graph structure in quantum data can lead to better learning efficiency than the state-of-the-art model.
arXiv Detail & Related papers (2024-01-28T14:06:06Z)
- A Comparison Between Invariant and Equivariant Classical and Quantum Graph Neural Networks [3.350407101925898]
Deep geometric methods, such as graph neural networks (GNNs), have been leveraged for various data analysis tasks in high-energy physics.
One typical task is jet tagging, where jets are viewed as point clouds with distinct features and edge connections between their constituent particles.
In this paper, we perform a fair and comprehensive comparison between classical graph neural networks (GNNs) and their quantum counterparts.
arXiv Detail & Related papers (2023-11-30T16:19:13Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
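The single-qubit data re-uploading idea can be sketched without any SDK, using hand-rolled 2x2 matrices; `reupload` and its parameter names are illustrative assumptions, not the paper's API. Each layer re-encodes the input x into a rotation angle, and a fixed Rz phase keeps successive Ry rotations from collapsing into a single rotation.

```python
import cmath, math

def ry(theta):
    """2x2 matrix of the single-qubit rotation R_y(theta)."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def rz(phi):
    """2x2 matrix of the single-qubit rotation R_z(phi)."""
    return [[cmath.exp(-1j * phi / 2), 0], [0, cmath.exp(1j * phi / 2)]]

def apply(gate, state):
    a, b = state
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]

def reupload(x, weights, biases, phases):
    """Single-qubit data re-uploading: the input x is re-encoded at every
    layer as R_z(phi_l) R_y(w_l * x + b_l); returns P(measuring |0>)."""
    state = [1.0 + 0j, 0.0 + 0j]  # start in |0>
    for w, b, phi in zip(weights, biases, phases):
        state = apply(ry(w * x + b), state)
        state = apply(rz(phi), state)  # R_z prevents the R_y's from merging
    return abs(state[0]) ** 2

# One layer with total angle pi flips |0> to |1>, so P(|0>) vanishes.
print(round(reupload(1.0, [math.pi], [0.0], [0.0]), 12))  # → 0.0
```

In practice the weights, biases, and phases would be trained classically while the circuit is evaluated on quantum hardware.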
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes with a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even superior to, the classical algorithms on several benchmark graph datasets.
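A purely classical stand-in can make the sparsified message passing concrete; the crossing-gate quantum operations are not modelled here, and `message_pass` and `topk` are illustrative names rather than the paper's API.

```python
def message_pass(features, adj, topk=2):
    """One round of mean-style message passing where each node keeps only
    its `topk` strongest neighbours, a classical analogue of pruning noisy
    connections before aggregation."""
    out = {}
    for u, neigh in adj.items():
        # Sparsify: keep the topk neighbours with the largest feature magnitude.
        kept = sorted(neigh, key=lambda v: abs(features[v]), reverse=True)[:topk]
        msgs = [features[v] for v in kept]
        out[u] = (features[u] + sum(msgs)) / (1 + len(msgs))
    return out

feats = {0: 1.0, 1: 4.0, 2: -2.0, 3: 0.5}
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
new = message_pass(feats, adj, topk=2)
print(new[0])  # node 0 aggregates only its two strongest neighbours (1 and 2)
```

Dropping weak connections before aggregation limits how far device noise on any single edge can propagate through the network.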
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Extending Graph Transformers with Quantum Computed Aggregation [0.0]
We introduce a GNN architecture where the aggregation weights are computed using the long-range correlations of a quantum system.
These correlations are generated by translating the graph topology into the interactions of a set of qubits in a quantum computer.
arXiv Detail & Related papers (2022-10-19T14:56:15Z)
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensemble training scheme, named EnGCN, to address the existing issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- Exploring Quantum Perceptron and Quantum Neural Network structures with a teacher-student scheme [0.0]
Near-term quantum devices can be used to build quantum machine learning models, such as quantum kernel methods and quantum neural networks (QNN) to perform classification tasks.
The aim of this work is to systematically compare different QNN architectures and to evaluate their relative expressive power with a teacher-student scheme.
We focus particularly on a quantum perceptron model inspired by the recent work of Tacchino et al. and compare it to the data re-uploading scheme originally introduced by Pérez-Salinas et al.
arXiv Detail & Related papers (2021-05-04T13:13:52Z)
- ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations [86.41674945012369]
We develop a scalable and expressive Graph Neural Networks model, ForceNet, to approximate atomic forces.
Our proposed ForceNet is able to predict atomic forces more accurately than state-of-the-art physics-based GNNs.
arXiv Detail & Related papers (2021-03-02T03:09:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides (including all of the above) and is not responsible for any consequences arising from its use.