Enhancing Graph Neural Networks with Quantum Computed Encodings
- URL: http://arxiv.org/abs/2310.20519v1
- Date: Tue, 31 Oct 2023 14:56:52 GMT
- Title: Enhancing Graph Neural Networks with Quantum Computed Encodings
- Authors: Slimane Thabet, Romain Fouilland, Mehdi Djellabi, Igor Sokolov, Sachin
Kasture, Louis-Paul Henry, Loïc Henriet
- Abstract summary: We propose novel families of positional encodings tailored for graph transformers.
These encodings leverage the long-range correlations inherent in quantum systems.
We show that the performance of state-of-the-art models can be improved on standard benchmarks and large-scale datasets.
- Score: 1.884651553431727
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transformers are increasingly employed for graph data, demonstrating
competitive performance in diverse tasks. To incorporate graph information into
these models, it is essential to enhance node and edge features with positional
encodings. In this work, we propose novel families of positional encodings
tailored for graph transformers. These encodings leverage the long-range
correlations inherent in quantum systems, which arise from mapping the topology
of a graph onto interactions between qubits in a quantum computer. Our
inspiration stems from the recent advancements in quantum processing units,
which offer computational capabilities beyond the reach of classical hardware.
We prove that some of these quantum features are theoretically more expressive
for certain graphs than the commonly used relative random walk probabilities.
Empirically, we show that the performance of state-of-the-art models can be
improved on standard benchmarks and large-scale datasets by computing tractable
versions of quantum features. Our findings highlight the potential of
leveraging quantum computing capabilities to enhance the performance of
transformers in handling graph data.
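A minimal sketch of the core idea, not the authors' implementation: the graph's edges are mapped onto Ising couplings between qubits, a product state is evolved in time, and the resulting qubit-qubit correlation matrix is used as per-node positional encodings. The transverse field, evolution time, and exact statevector simulation below are assumptions for illustration; on real hardware the correlations would be estimated from measurement samples, and exact simulation is tractable only for small graphs.

```python
# Illustrative sketch only (not the paper's code): graph -> Ising couplings,
# exact statevector evolution, correlation matrix as positional encodings.
import numpy as np
import networkx as nx

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(op, i, n):
    """Embed a single-qubit operator on qubit i of an n-qubit register."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, op if k == i else I2)
    return out

def quantum_positional_encodings(G, t=1.0, h=1.0):
    """Covariance <Z_i Z_j> - <Z_i><Z_j> after evolving |+...+> under an
    Ising Hamiltonian whose couplings follow the graph edges."""
    n = G.number_of_nodes()
    zs = [op_on(Z, i, n) for i in range(n)]
    H = sum(zs[i] @ zs[j] for i, j in G.edges())
    H = H + h * sum(op_on(X, i, n) for i in range(n))  # transverse field
    psi0 = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)  # |+>^n
    evals, evecs = np.linalg.eigh(H)
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
    z_exp = np.array([(psi_t.conj() @ z @ psi_t).real for z in zs])
    zz = np.array([[(psi_t.conj() @ zs[i] @ zs[j] @ psi_t).real
                    for j in range(n)] for i in range(n)])
    return zz - np.outer(z_exp, z_exp)  # row i = encoding of node i

pe = quantum_positional_encodings(nx.cycle_graph(6))
print(pe.shape)  # (6, 6)
```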
Related papers
- Tensor-Based Binary Graph Encoding for Variational Quantum Classifiers [3.5051814539447474]
We propose a novel quantum encoding framework for graph classification using Variational Quantum Classifiers (VQCs).
By constructing slightly more complex circuits tailored for graph encoding, we demonstrate that VQCs can effectively classify graphs within the constraints of current quantum hardware.
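A hypothetical illustration of binary graph encoding (the paper's tensor-based circuits are more elaborate; only the data layout is shown): one qubit per potential edge, flipped iff the edge exists, giving a basis state for a variational ansatz to process.

```python
# Hypothetical data layout only; not the paper's actual circuit.
import numpy as np

def adjacency_bits(A):
    """Flatten the upper triangle of a binary adjacency matrix."""
    n = A.shape[0]
    return [int(A[i, j]) for i in range(n) for j in range(i + 1, n)]

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
print(adjacency_bits(A))  # [1, 0, 1] -> prepare basis state |101>
```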
arXiv Detail & Related papers (2025-01-24T02:26:21Z)
- SeQUeNCe GUI: An Extensible User Interface for Discrete Event Quantum Network Simulations [55.2480439325792]
SeQUeNCe is an open source simulator of quantum network communication.
We implement a graphical user interface which maintains the core principles of SeQUeNCe.
arXiv Detail & Related papers (2025-01-15T19:36:09Z)
- GQWformer: A Quantum-based Transformer for Graph Representation Learning [15.97445757658235]
We propose a novel approach that integrates graph inductive biases into self-attention mechanisms by leveraging quantum technology for structural encoding.
We introduce the Graph Quantum Walk Transformer (GQWformer), a groundbreaking GNN framework that utilizes quantum walks on attributed graphs to generate node quantum states.
These quantum states encapsulate rich structural attributes and serve as inductive biases for the transformer, thereby enabling the generation of more meaningful attention scores.
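A minimal sketch of a continuous-time quantum walk, one plausible way (not necessarily GQWformer's exact construction) to turn graph structure into node states: evolve under the adjacency matrix and read off walker arrival probabilities as structural features.

```python
# Sketch under our own assumptions; GQWformer's walk may be parameterized.
import numpy as np
from scipy.linalg import expm

def ctqw_features(A, times=(0.5, 1.0, 2.0)):
    """Stack |U(t)|^2 matrices, U(t) = exp(-iAt), as per-node features;
    row i concatenates node i's arrival probabilities at each time."""
    feats = []
    for t in times:
        U = expm(-1j * t * np.asarray(A, dtype=complex))  # walk propagator
        feats.append(np.abs(U) ** 2)
    return np.concatenate(feats, axis=1)  # shape (n, n * len(times))
```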
arXiv Detail & Related papers (2024-12-03T09:03:04Z)
- Quantum Positional Encodings for Graph Neural Networks [1.9791587637442671]
We propose novel families of positional encodings tailored to graph neural networks obtained with quantum computers.
Our inspiration stems from the recent advancements in quantum processing units, which offer computational capabilities beyond the reach of classical hardware.
arXiv Detail & Related papers (2024-05-21T17:56:33Z)
- Graph Transformers without Positional Encodings [0.7252027234425334]
We introduce Eigenformer, a Graph Transformer employing a novel spectrum-aware attention mechanism cognizant of the Laplacian spectrum of the graph.
We empirically show that it achieves performance competitive with SOTA Graph Transformers on a number of standard GNN benchmarks.
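A rough sketch, under our own assumptions, of spectrum-aware attention: bias the attention logits with a fixed heat-kernel-style function of the lowest Laplacian eigenpairs rather than adding positional encodings to node features. Eigenformer's actual mechanism learns this spectral dependence and may differ in detail.

```python
# Assumed form of the spectral bias; Eigenformer's exact mechanism differs.
import torch

def spectral_attention_bias(L, num_freqs=8):
    """B_ij = sum_m f(lam_m) U_im U_jm from the lowest Laplacian eigenpairs."""
    evals, evecs = torch.linalg.eigh(L)           # ascending eigenvalues
    U, lam = evecs[:, :num_freqs], evals[:num_freqs]
    return (U * torch.exp(-lam)) @ U.T            # (n, n) additive bias

# usage: scores = q @ k.transpose(-1, -2) / d ** 0.5 + spectral_attention_bias(L)
```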
arXiv Detail & Related papers (2024-01-31T12:33:31Z)
- Deep Prompt Tuning for Graph Transformers [55.2480439325792]
Fine-tuning is resource-intensive and requires storing multiple copies of large models.
We propose a novel approach called deep graph prompt tuning as an alternative to fine-tuning.
By freezing the pre-trained parameters and only updating the added tokens, our approach reduces the number of free parameters and eliminates the need for multiple model copies.
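A generic prompt-tuning sketch (not the paper's exact architecture; class and parameter names here are assumptions): freeze the pre-trained graph transformer and learn only a handful of prompt tokens prepended to the node-token sequence.

```python
# Generic prompt-tuning pattern; the paper's "deep" variant injects
# prompts at every layer rather than only at the input.
import torch
import torch.nn as nn

class PromptedGraphTransformer(nn.Module):
    """Freeze a pre-trained backbone; train only the prepended prompt tokens."""
    def __init__(self, backbone, d_model, num_prompts=8):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False               # shared frozen weights
        self.prompts = nn.Parameter(torch.randn(num_prompts, d_model) * 0.02)

    def forward(self, node_tokens):               # (batch, n, d_model)
        b = node_tokens.size(0)
        prompts = self.prompts.unsqueeze(0).expand(b, -1, -1)
        return self.backbone(torch.cat([prompts, node_tokens], dim=1))
```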
arXiv Detail & Related papers (2023-09-18T20:12:17Z)
- QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes via a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable or even superior to classical algorithms on several benchmark graph datasets.
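One common way to realize such a constraint, shown below as a hedged sketch, is an L1 penalty on learnable edge weights that drives noisy connections toward zero; QuanGCN's exact formulation may differ.

```python
# Generic L1 sparsity penalty; an assumption, not QuanGCN's exact objective.
import torch

def sparsified_loss(task_loss, edge_weights, lam=1e-3):
    """Add an L1 penalty so noisy edge weights are pushed toward zero."""
    return task_loss + lam * edge_weights.abs().sum()
```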
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Extending Graph Transformers with Quantum Computed Aggregation [0.0]
We introduce a GNN architecture where the aggregation weights are computed using the long-range correlations of a quantum system.
These correlations are generated by translating the graph topology into the interactions of a set of qubits in a quantum computer.
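A toy sketch of that aggregation step (names and the softmax normalization are our assumptions): treat a precomputed qubit correlation matrix, e.g. one obtained as in the positional-encoding sketch near the top of this page, as attention-like mixing weights.

```python
# Toy aggregation rule; normalization choice is an assumption.
import torch

def quantum_aggregate(H, C):
    """H: (n, d) node features; C: (n, n) qubit correlation matrix."""
    W = torch.softmax(torch.as_tensor(C, dtype=H.dtype), dim=-1)
    return W @ H  # each node aggregates others weighted by correlations
```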
arXiv Detail & Related papers (2022-10-19T14:56:15Z)
- From Quantum Graph Computing to Quantum Graph Learning: A Survey [86.8206129053725]
We first elaborate on the correlations between quantum mechanics and graph theory to show that quantum computers are able to generate useful solutions.
Given its practicality and wide applicability, we give a brief review of typical graph learning techniques.
We close with a snapshot of quantum graph learning, in which our expectations serve as a catalyst for subsequent research.
arXiv Detail & Related papers (2022-02-19T02:56:47Z)
- Post-Training Quantization for Vision Transformer [85.57953732941101]
We present an effective post-training quantization algorithm for reducing the memory storage and computational costs of vision transformers.
We can obtain 81.29% top-1 accuracy using the DeiT-B model on the ImageNet dataset with about 8-bit quantization.
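For orientation, a minimal symmetric uniform 8-bit quantizer is sketched below; the paper's algorithm additionally searches per-layer scales and bit-widths, which this toy version omits.

```python
# Baseline uniform quantizer only; not the paper's optimized algorithm.
import torch

def quantize_int8(w):
    """Symmetric uniform 8-bit quantization of a weight tensor."""
    scale = w.abs().max().clamp(min=1e-8) / 127.0
    q = torch.clamp(torch.round(w / scale), -128, 127).to(torch.int8)
    return q, scale

def dequantize(q, scale):
    return q.float() * scale

w = torch.randn(768, 768)
q, s = quantize_int8(w)
print((dequantize(q, s) - w).abs().max())  # worst-case rounding error
```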
arXiv Detail & Related papers (2021-06-27T06:27:22Z)
- Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer [55.41644538483948]
We implement a quantum-circuit based generative model to learn and sample the prior distribution of a Generative Adversarial Network.
We train this hybrid algorithm on an ion-trap device based on $^{171}$Yb$^{+}$ ion qubits to generate high-quality images.
arXiv Detail & Related papers (2020-12-07T18:51:28Z)