Extending Graph Transformers with Quantum Computed Aggregation
- URL: http://arxiv.org/abs/2210.10610v1
- Date: Wed, 19 Oct 2022 14:56:15 GMT
- Title: Extending Graph Transformers with Quantum Computed Aggregation
- Authors: Slimane Thabet, Romain Fouilland, Loic Henriet
- Abstract summary: We introduce a GNN architecture where the aggregation weights are computed using the long-range correlations of a quantum system.
These correlations are generated by translating the graph topology into the interactions of a set of qubits in a quantum computer.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, efforts have been made in the community to design new Graph Neural
Networks (GNNs), as the limitations of Message Passing Neural Networks became more
apparent. This led to the appearance of Graph Transformers using global graph
features such as Laplacian Eigenmaps. In our paper, we introduce a GNN
architecture where the aggregation weights are computed using the long-range
correlations of a quantum system. These correlations are generated by
translating the graph topology into the interactions of a set of qubits in a
quantum computer. This work was inspired by the recent development of quantum
processing units, which enable the computation of a new family of global graph
features that would otherwise be out of reach for classical hardware. We give
some theoretical insights into the potential benefits of this approach, and
benchmark our algorithm on standard datasets. Although not suited to every
dataset, our model performs comparably to standard GNN architectures, and paves
the way for a promising future for quantum-enhanced GNNs.
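To make the core idea concrete, here is a minimal classical-simulation sketch of how aggregation weights could be derived from the correlations of a qubit system whose couplings mirror the graph topology. This is an illustration only, not the authors' implementation: it assumes a transverse-field Ising encoding, uses exact diagonalization (feasible only for a handful of qubits), and all function names (`correlation_weights`, `quantum_aggregate`) are hypothetical.

```python
import numpy as np

# Single-qubit Pauli operators
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def op_on(op, site, n):
    """Embed a single-qubit operator acting on `site` into an n-qubit space."""
    mats = [op if i == site else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def correlation_weights(edges, n, field=1.0):
    """Translate the graph topology into ZZ couplings of a transverse-field
    Ising Hamiltonian, then return the ground-state ZZ correlation matrix
    C[i, j] = <Z_i Z_j>. A quantum processor would estimate these values by
    sampling; here we simulate classically with exact diagonalization."""
    dim = 2 ** n
    H = np.zeros((dim, dim))
    for (i, j) in edges:
        H -= op_on(Z, i, n) @ op_on(Z, j, n)   # couplings on graph edges
    for i in range(n):
        H -= field * op_on(X, i, n)            # transverse field on every qubit
    vals, vecs = np.linalg.eigh(H)
    psi = vecs[:, 0]                           # ground state (lowest eigenvalue)
    zs = [op_on(Z, i, n) for i in range(n)]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            C[i, j] = psi @ (zs[i] @ zs[j]) @ psi
    return C

def quantum_aggregate(features, C):
    """Aggregate node features with row-normalized (softmax) correlation
    weights, in place of adjacency-based message passing."""
    W = np.exp(C) / np.exp(C).sum(axis=1, keepdims=True)
    return W @ features

# Toy example: a 4-cycle graph with one-hot node features
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
C = correlation_weights(edges, n=4)
feats = np.eye(4)
out = quantum_aggregate(feats, C)
```

Note that the correlation matrix is dense: even nodes with no edge between them receive a nonzero weight, which is precisely the long-range, global character that distinguishes this aggregation from standard message passing.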
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs), which combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z) - Graph Neural Networks on Quantum Computers [3.8784640343151184]
Graph Neural Networks (GNNs) are powerful machine learning models that excel at analyzing structured data represented as graphs.
This paper proposes frameworks for implementing GNNs on quantum computers to potentially address the challenges.
arXiv Detail & Related papers (2024-05-27T11:31:08Z) - Quantum Positional Encodings for Graph Neural Networks [1.9791587637442671]
We propose novel families of positional encodings tailored to graph neural networks obtained with quantum computers.
Our inspiration stems from the recent advancements in quantum processing units, which offer computational capabilities beyond the reach of classical hardware.
arXiv Detail & Related papers (2024-05-21T17:56:33Z) - A Comparison Between Invariant and Equivariant Classical and Quantum Graph Neural Networks [3.350407101925898]
Deep geometric methods, such as graph neural networks (GNNs), have been leveraged for various data analysis tasks in high-energy physics.
One typical task is jet tagging, where jets are viewed as point clouds with distinct features and edge connections between their constituent particles.
In this paper, we perform a fair and comprehensive comparison between classical graph neural networks (GNNs) and their quantum counterparts.
arXiv Detail & Related papers (2023-11-30T16:19:13Z) - Enhancing Graph Neural Networks with Quantum Computed Encodings [1.884651553431727]
We propose novel families of positional encodings tailored for graph transformers.
These encodings leverage the long-range correlations inherent in quantum systems.
We show that the performance of state-of-the-art models can be improved on standard benchmarks and large-scale datasets.
arXiv Detail & Related papers (2023-10-31T14:56:52Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional
Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learns the local message passing among nodes with the sequence of crossing-gate quantum operations.
To mitigate the inherent noises from modern quantum devices, we apply sparse constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even superior to, the classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach [47.19265172105025]
We propose a novel hybrid quantum-classical algorithm for graph-structured data, which we refer to as the Ego-graph based Quantum Graph Neural Network (egoQGNN).
egoQGNN implements the GNN theoretical framework using the tensor product and unitary matrix representation, which greatly reduces the number of model parameters required.
The architecture is based on a novel mapping from real-world data to Hilbert space.
arXiv Detail & Related papers (2022-01-13T16:35:45Z) - VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using
Vector Quantization [70.8567058758375]
VQ-GNN is a universal framework to scale up any convolution-based GNNs using Vector Quantization (VQ) without compromising the performance.
Our framework avoids the "neighbor explosion" problem of GNNs using quantized representations combined with a low-rank version of the graph convolution matrix.
arXiv Detail & Related papers (2021-10-27T11:48:50Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.