QDC: Quantum Diffusion Convolution Kernels on Graphs
- URL: http://arxiv.org/abs/2307.11234v1
- Date: Thu, 20 Jul 2023 21:10:54 GMT
- Title: QDC: Quantum Diffusion Convolution Kernels on Graphs
- Authors: Thomas Markovich
- Abstract summary: Graph convolutional neural networks (GCNs) operate by aggregating messages over local neighborhoods given the prediction task of interest.
We propose a new convolution kernel that effectively rewires the graph according to the occupation correlations of the vertices by trading the generalized diffusion paradigm for the propagation of a quantum particle over the graph.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional neural networks (GCNs) operate by aggregating messages
over local neighborhoods given the prediction task of interest. Many GCNs
can be understood as a form of generalized diffusion of input features on the
graph, and significant work has been dedicated to improving predictive accuracy
by altering the ways of message passing. In this work, we propose a new
convolution kernel that effectively rewires the graph according to the
occupation correlations of the vertices by trading the generalized diffusion
paradigm for the propagation of a quantum particle over the graph. We term this
new convolution kernel the Quantum Diffusion Convolution (QDC) operator. In
addition, we introduce a multiscale variant that combines messages from the QDC
operator and the traditional combinatorial Laplacian. To understand our method,
we explore the spectral dependence of homophily and the importance of quantum
dynamics in the construction of a bandpass filter. Through these studies, as
well as experiments on a range of datasets, we observe that QDC improves
predictive performance on the widely used benchmark datasets when compared to
similar methods.
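To make the construction concrete, here is a minimal NumPy sketch of a QDC-style kernel, based only on the abstract above. The Gaussian band-pass filter, its parameters mu and sigma, the sparsification threshold, and the mixing weight alpha in the multiscale variant are all illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def quantum_diffusion_kernel(A, mu=0.5, sigma=0.1, threshold=1e-4):
    """Rewire a graph via a spectral band-pass filter on its Laplacian.

    A: (n, n) symmetric adjacency matrix (dense, for clarity).
    mu, sigma: center/width of an *assumed* Gaussian band-pass filter.
    threshold: kernel entries below this are dropped, which performs
        the "rewiring" into a new, generally sparser operator.
    """
    A = np.asarray(A, dtype=float)
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    # Symmetric normalized Laplacian; eigenvalues lie in [0, 2].
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    lam, U = np.linalg.eigh(L)
    # Band-pass filter in the spectral domain (assumption: Gaussian).
    g = np.exp(-((lam - mu) ** 2) / (2.0 * sigma ** 2))
    K = (U * g) @ U.T               # K = U diag(g(lambda)) U^T
    K[np.abs(K) < threshold] = 0.0  # sparsify: the rewired graph
    return K

def multiscale_propagate(A, X, alpha=0.5):
    """Assumed multiscale variant: mix QDC messages with ordinary
    one-hop (normalized-adjacency) messages; alpha is hypothetical."""
    A = np.asarray(A, dtype=float)
    K = quantum_diffusion_kernel(A)
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return alpha * (K @ X) + (1.0 - alpha) * (A_hat @ X)
```

In a GCN layer, K (or the multiscale mixture) would stand in for the usual normalized adjacency during message passing; the band-pass choice reflects the abstract's point that homophily is spectrally dependent.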
Related papers
- Understanding the Effect of GCN Convolutions in Regression Tasks [8.299692647308323]
Graph Convolutional Networks (GCNs) have become a pivotal method in machine learning for modeling functions over graphs.
This paper provides a formal analysis of the impact of convolution operators on regression tasks over homophilic networks.
arXiv Detail & Related papers (2024-10-26T04:19:52Z)
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- GCEPNet: Graph Convolution-Enhanced Expectation Propagation for Massive MIMO Detection [5.714553194279462]
We show that a real-valued system can be modeled as a spectral signal convolution on a graph, through which the correlation between unknown variables can be captured.
Based on such analysis, we propose graph convolution-enhanced expectation propagation (GCEPNet) with better generalization capacity.
arXiv Detail & Related papers (2024-04-23T10:13:39Z)
- Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z)
- Enhancing Graph Neural Networks with Quantum Computed Encodings [1.884651553431727]
We propose novel families of positional encodings tailored for graph transformers.
These encodings leverage the long-range correlations inherent in quantum systems.
We show that the performance of state-of-the-art models can be improved on standard benchmarks and large-scale datasets.
arXiv Detail & Related papers (2023-10-31T14:56:52Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
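The cross-covariance construction in the entry above can be illustrated with a small sketch. Treating X and Y as T observed input/output signals on n nodes is my own assumption about the data layout, as are the symmetrization and rescaling steps.

```python
import numpy as np

def cross_covariance_gso(X, Y):
    """One plausible reading of a cross-covariance graph shift operator.

    X: (n, T) input signals on n nodes over T observations (assumed layout).
    Y: (n, T) corresponding output signals.
    Returns a symmetric (n, n) operator built from their cross-covariance.
    """
    Xc = X - X.mean(axis=1, keepdims=True)
    Yc = Y - Y.mean(axis=1, keepdims=True)
    C = (Xc @ Yc.T) / X.shape[1]   # (n, n) node-by-node cross-covariance
    S = 0.5 * (C + C.T)            # symmetrize so S can act as a shift operator
    return S / np.abs(S).max()     # rescale (assumption) for stable filtering
```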
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
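As a generic illustration of advective graph diffusion (not ADiT itself), the sketch below takes one explicit-Euler step of an advection-diffusion update on a graph; the transport operator V and the coefficients kappa and beta are assumptions.

```python
import numpy as np

def advective_diffusion_step(L, V, X, dt=0.01, kappa=1.0, beta=1.0):
    """One explicit-Euler step of dX/dt = -kappa * L X + beta * (V - I) X.

    L: (n, n) graph Laplacian (diffusion term).
    V: (n, n) row-stochastic transport operator encoding an assumed
       directional flow along the edges (advection term).
    X: (n, d) node features.
    """
    n = L.shape[0]
    diffusion = -kappa * (L @ X)              # smoothing along edges
    advection = beta * ((V - np.eye(n)) @ X)  # mass moved along the flow
    return X + dt * (diffusion + advection)
```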
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Tangent Bundle Convolutional Learning: from Manifolds to Cellular Sheaves and Back [84.61160272624262]
We define tangent bundle filters and tangent bundle neural networks (TNNs) based on this convolution operation.
Tangent bundle filters admit a spectral representation that generalizes the ones of scalar manifold filters, graph filters and standard convolutional filters in continuous time.
We numerically evaluate the effectiveness of the proposed architecture on various learning tasks.
arXiv Detail & Related papers (2023-03-20T17:57:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and accepts no responsibility for any consequences of its use.