Inductive Graph Representation Learning with Quantum Graph Neural Networks
- URL: http://arxiv.org/abs/2503.24111v1
- Date: Mon, 31 Mar 2025 14:04:08 GMT
- Title: Inductive Graph Representation Learning with Quantum Graph Neural Networks
- Authors: Arthur M. Faria, Ignacio F. Graña, Savvas Varsamopoulos
- Abstract summary: Quantum Graph Neural Networks (QGNNs) present a promising approach for combining quantum computing with graph-structured data processing. We propose a versatile QGNN framework inspired by the classical GraphSAGE approach, utilizing quantum models as aggregators. We show that our quantum approach exhibits robust generalization across molecules with varying numbers of atoms without requiring circuit modifications.
- Score: 0.40964539027092917
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum Graph Neural Networks (QGNNs) present a promising approach for combining quantum computing with graph-structured data processing. While classical Graph Neural Networks (GNNs) are renowned for their scalability and robustness, existing QGNNs often lack flexibility due to graph-specific quantum circuit designs, limiting their applicability to a narrower range of graph-structured problems, falling short of real-world scenarios. To address these limitations, we propose a versatile QGNN framework inspired by the classical GraphSAGE approach, utilizing quantum models as aggregators. In this work, we integrate established techniques for inductive representation learning on graphs with parametrized quantum convolutional and pooling layers, effectively bridging classical and quantum paradigms. The convolutional layer is flexible, enabling tailored designs for specific problems. Benchmarked on a node regression task with the QM9 dataset, we demonstrate that our framework successfully models a non-trivial molecular dataset, achieving performance comparable to classical GNNs. In particular, we show that our quantum approach exhibits robust generalization across molecules with varying numbers of atoms without requiring circuit modifications, slightly outperforming classical GNNs. Furthermore, we numerically investigate the scalability of the QGNN framework. Specifically, we demonstrate the absence of barren plateaus in our architecture as the number of qubits increases, suggesting that the proposed quantum model can be extended to handle larger and more complex graph-based problems effectively.
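To make the GraphSAGE-style use of quantum aggregators concrete, here is a minimal sketch, not the authors' implementation: PennyLane is assumed as the quantum framework, and a generic StronglyEntanglingLayers ansatz stands in for the paper's quantum convolutional and pooling layers; the mean aggregation, qubit count, and circuit depth are illustrative choices only.

import numpy as np
import pennylane as qml

N_QUBITS = 4  # one qubit per (aggregated) feature dimension in this toy example
dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev)
def quantum_aggregator(features, weights):
    # Encode the classical neighbourhood aggregate as rotation angles.
    qml.AngleEmbedding(features, wires=range(N_QUBITS))
    # Parametrized entangling layers play the role of the quantum
    # convolutional block; this ansatz is the interchangeable part.
    qml.StronglyEntanglingLayers(weights, wires=range(N_QUBITS))
    # One expectation value per qubit forms the aggregated embedding.
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

def sage_layer(h, adjacency, weights):
    # One inductive message-passing step: mean-aggregate neighbour features,
    # push the aggregate through the quantum circuit, and concatenate with
    # the node's own embedding (GraphSAGE style).
    new_h = []
    for v in range(len(h)):
        neighbours = [u for u in range(len(h)) if adjacency[v][u]]
        agg = np.mean([h[u] for u in neighbours], axis=0) if neighbours else np.zeros(N_QUBITS)
        q_out = np.array(quantum_aggregator(agg, weights))
        new_h.append(np.concatenate([h[v], q_out]))
    return np.stack(new_h)

# Toy usage: a 3-node path graph with random 4-dimensional node features.
rng = np.random.default_rng(0)
h0 = rng.normal(size=(3, N_QUBITS))
A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
theta = rng.normal(size=qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=N_QUBITS))
print(sage_layer(h0, A, theta).shape)  # (3, 8)

Because the same fixed-width circuit is applied to every aggregated neighbourhood, the outer inductive loop, not the circuit, absorbs graphs of different sizes, which is consistent with the abstract's claim of generalization across molecules with varying numbers of atoms without circuit modifications.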
Related papers
- QGHNN: A quantum graph Hamiltonian neural network [30.632260870411177]
Graph Neural Networks (GNNs) strive to address the challenges posed by complex, high-dimensional graph data. Quantum Neural Networks (QNNs) present a compelling alternative due to their potential for quantum parallelism. This paper introduces a quantum graph Hamiltonian neural network (QGHNN) to enhance graph representation and learning on noisy intermediate-scale quantum computers.
arXiv Detail & Related papers (2025-01-14T10:15:17Z)
- From Graphs to Qubits: A Critical Review of Quantum Graph Neural Networks [56.51893966016221]
Quantum Graph Neural Networks (QGNNs) represent a novel fusion of quantum computing and Graph Neural Networks (GNNs).
This paper critically reviews the state-of-the-art in QGNNs, exploring various architectures.
We discuss their applications across diverse fields such as high-energy physics, molecular chemistry, finance and earth sciences, highlighting the potential for quantum advantage.
arXiv Detail & Related papers (2024-08-12T22:53:14Z)
- Graph Neural Networks on Quantum Computers [3.8784640343151184]
Graph Neural Networks (GNNs) are powerful machine learning models that excel at analyzing structured data represented as graphs.
This paper proposes frameworks for implementing GNNs on quantum computers to potentially address the challenges.
arXiv Detail & Related papers (2024-05-27T11:31:08Z)
- Jet Discrimination with Quantum Complete Graph Neural Network [1.684646794156297]
We propose the Quantum Complete Graph Neural Network (QCGNN), which is a variational quantum algorithm based on complete graphs.
We investigate the application of QCGNN with the challenging task of jet discrimination, where the jets are represented as complete graphs.
arXiv Detail & Related papers (2024-03-08T02:02:23Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes through a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparse constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even better than, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Extending Graph Transformers with Quantum Computed Aggregation [0.0]
We introduce a GNN architecture where the aggregation weights are computed using the long-range correlations of a quantum system.
These correlations are generated by translating the graph topology into the interactions of a set of qubits in a quantum computer.
arXiv Detail & Related papers (2022-10-19T14:56:15Z)
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter; a schematic form of such a filter is sketched after this list.
arXiv Detail & Related papers (2022-05-27T10:48:14Z)
- Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach [47.19265172105025]
We propose a novel hybrid quantum-classical algorithm for graph-structured data, which we refer to as the Ego-graph based Quantum Graph Neural Network (egoQGNN).
egoQGNN implements the GNN theoretical framework using the tensor product and unity matrix representation, which greatly reduces the number of model parameters required.
The architecture is based on a novel mapping from real-world data to Hilbert space.
arXiv Detail & Related papers (2022-01-13T16:35:45Z)
- Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We propose QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem; the one-step derivation behind this view is worked out after this list.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Efficient Probabilistic Logic Reasoning with Graph Neural Networks [63.099999467118245]
Markov Logic Networks (MLNs) can be used to address many knowledge graph problems.
Inference in MLNs is computationally intensive, making the industrial-scale application of MLNs very difficult.
We propose a graph neural network (GNN) variant, named ExpressGNN, which strikes a nice balance between the representation power and the simplicity of the model.
arXiv Detail & Related papers (2020-01-29T23:34:36Z)
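As referenced in the EvenNet entry above, one schematic way to write an even-polynomial graph filter, sketched here with assumed GPR-GNN-style notation rather than taken from the paper: X denotes node features, f_theta a feature transform, \hat{A} a normalized propagation matrix, K the filter order, and \gamma learnable coefficients.

\[
Z \;=\; \sum_{k=0}^{\lfloor K/2 \rfloor} \gamma_{2k}\,\hat{A}^{2k}\, f_{\theta}(X)
\]

Only even powers of the propagation matrix carry learnable weight, so the spectral response is an even polynomial; the entry's claim is that this restriction is what improves robustness to homophily changes between training and test graphs.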
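For the "A Unified View on Graph Neural Networks as Graph Signal Denoising" entry, the one-step derivation referenced above, written as a sketch in standard notation rather than copied from the paper: X denotes input node features, \tilde{A} a normalized adjacency matrix, \tilde{L} = I - \tilde{A} the corresponding Laplacian, and c > 0 a smoothness weight. The denoising problem is

\[
\min_{F}\; \|F - X\|_F^2 \;+\; c\,\operatorname{tr}\!\big(F^{\top}\tilde{L}\,F\big).
\]

Starting from F = X, a single gradient-descent step with step size 1/(2c) gives

\[
F \;\leftarrow\; X - \tfrac{1}{2c}\Big(2(F - X) + 2c\,\tilde{L}F\Big)\Big|_{F = X}
\;=\; X - \tilde{L}X \;=\; \tilde{A}X,
\]

i.e. the familiar GCN-style aggregation, which is the sense in which an aggregation step can be read as (approximately) solving a graph denoising problem.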