On Measuring Long-Range Interactions in Graph Neural Networks
- URL: http://arxiv.org/abs/2506.05971v1
- Date: Fri, 06 Jun 2025 10:48:30 GMT
- Title: On Measuring Long-Range Interactions in Graph Neural Networks
- Authors: Jacob Bamberger, Benjamin Gutteridge, Scott le Roux, Michael M. Bronstein, Xiaowen Dong
- Abstract summary: Long-range graph tasks are an open problem in graph neural network research. We introduce a range measure for operators on graphs and validate it with synthetic experiments.
- Score: 24.974333602585368
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Long-range graph tasks -- those dependent on interactions between distant nodes -- are an open problem in graph neural network research. Real-world benchmark tasks, especially the Long Range Graph Benchmark, have become popular for validating the long-range capability of proposed architectures. However, this is an empirical approach that lacks both robustness and theoretical underpinning; a more principled characterization of the long-range problem is required. To bridge this gap, we formalize long-range interactions in graph tasks, introduce a range measure for operators on graphs, and validate it with synthetic experiments. We then leverage our measure to examine commonly used tasks and architectures, and discuss to what extent they are, in fact, long-range. We believe our work advances efforts to define and address the long-range problem on graphs, and that our range measure will aid evaluation of new datasets and architectures.
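The abstract leaves the measure's exact form unspecified; purely as a hypothetical illustration (our assumption, not the authors' definition), one way to quantify the range of a linear graph operator is to average graph distances weighted by node-to-node sensitivities:

```python
# Hypothetical sketch: distance-weighted "range" of a linear graph operator.
# All function names are ours; this is not the paper's definition.
import numpy as np

def shortest_path_lengths(A):
    """All-pairs shortest-path distances of an unweighted graph via BFS."""
    n = A.shape[0]
    D = np.full((n, n), np.inf)
    for s in range(n):
        D[s, s] = 0.0
        frontier, d = [s], 0
        while frontier:
            d += 1
            nxt = []
            for u in frontier:
                for v in np.nonzero(A[u])[0]:
                    if D[s, v] == np.inf:
                        D[s, v] = d
                        nxt.append(v)
            frontier = nxt
    return D

def range_score(M, A):
    """Average graph distance weighted by operator sensitivity |M[u, v]|.

    For a linear operator Y = M @ X, |M[u, v]| measures how strongly the
    output at node u depends on the input at node v (its Jacobian entry).
    """
    S, D = np.abs(M), shortest_path_lengths(A)
    finite = np.isfinite(D)
    return float((S[finite] * D[finite]).sum() / S[finite].sum())

# Sanity check: k-step random-walk propagation on a path graph should
# score as increasingly long-range as k grows.
n = 8
A = np.zeros((n, n)); i = np.arange(n - 1)
A[i, i + 1] = A[i + 1, i] = 1.0
P = A / A.sum(1, keepdims=True)
for k in (1, 2, 4):
    print(k, round(range_score(np.linalg.matrix_power(P, k), A), 3))
```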
Related papers
- Towards Quantifying Long-Range Interactions in Graph Machine Learning: a Large Graph Dataset and a Measurement [10.124564216461858]
Long-range dependencies are critical for effective graph representation learning.
Most existing datasets focus on small graphs tailored to inductive tasks, offering limited insight into long-range interactions.
We introduce City-Networks, a novel large-scale transductive learning dataset derived from real-world city roads.
arXiv Detail & Related papers (2025-03-12T02:51:17Z) - Performance Heterogeneity in Graph Neural Networks: Lessons for Architecture Design and Preprocessing [1.1126342180866644]
Graph Neural Networks have emerged as the most popular architecture for graph-level learning.
We show that good performance in practice requires careful model design.
We propose a selective rewiring approach, which targets only those graphs whose individual performance benefits from rewiring.
arXiv Detail & Related papers (2025-03-01T16:18:07Z) - Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs).
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z) - Long Range Propagation on Continuous-Time Dynamic Graphs [18.5534584418248]
Continuous-Time Graph Anti-Symmetric Network (CTAN) is designed for efficient propagation of information.
We show that CTAN outperforms other methods on both synthetic long-range benchmarks and real-world benchmarks.
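As a minimal sketch of the anti-symmetric mechanism (a plain Euler discretization on a static graph, assumed here for brevity; CTAN itself operates on continuous-time dynamic graphs):

```python
# Sketch of anti-symmetric graph dynamics; not CTAN's actual architecture.
import numpy as np

def antisymmetric_step(H, A_hat, W, eps=0.1):
    """One explicit Euler step of dH/dt = tanh(A_hat @ H @ (W - W.T)).

    The anti-symmetric matrix W - W.T has purely imaginary eigenvalues,
    so the linearized dynamics neither explode nor decay, letting
    information survive long propagation horizons.
    """
    return H + eps * np.tanh(A_hat @ H @ (W - W.T))

# Toy usage: features stay informative after many steps instead of vanishing.
rng = np.random.default_rng(0)
n, d = 10, 4
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 0)
A_hat = A / np.maximum(A.sum(1, keepdims=True), 1)
H = rng.standard_normal((n, d)); W = rng.standard_normal((d, d))
for _ in range(200):
    H = antisymmetric_step(H, A_hat, W)
print(np.linalg.norm(H))   # remains O(1) rather than collapsing to zero
```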
arXiv Detail & Related papers (2024-06-04T19:42:19Z) - Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision [51.88848982611515]
Unsupervised graph neural architecture search remains unexplored in the literature.
We propose a novel Disentangled Self-supervised Graph Neural Architecture Search model.
Our model achieves state-of-the-art performance over several baseline methods in an unsupervised manner.
arXiv Detail & Related papers (2024-03-08T05:23:55Z) - Over-Squashing in Graph Neural Networks: A Comprehensive survey [0.0]
This survey delves into the challenge of over-squashing in Graph Neural Networks (GNNs).
It comprehensively explores the causes, consequences, and mitigation strategies for over-squashing.
Various methodologies are reviewed, including graph rewiring, novel normalization, spectral analysis, and curvature-based strategies.
The survey also discusses the interplay between over-squashing and other GNN limitations, such as over-smoothing.
arXiv Detail & Related papers (2023-08-29T18:46:15Z) - MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify and justify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
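A compact sketch of the underlying implicit-layer idea (a single-scale equilibrium layer under a contraction assumption; MGNNI's multiscale extension is not reproduced here):

```python
# Sketch of a generic implicit GNN layer; a simplification, not MGNNI itself.
import numpy as np

def implicit_gnn_layer(A_hat, X, W, tol=1e-6, max_iter=500):
    """Find the equilibrium Z* of Z = tanh(A_hat @ Z @ W + X) by
    fixed-point iteration. Depth is implicit: iteration runs until
    convergence, so the effective propagation range is not fixed a priori.
    Convergence assumes the update map is a contraction (e.g., small ||W||).
    """
    Z = np.zeros_like(X)
    for _ in range(max_iter):
        Z_next = np.tanh(A_hat @ Z @ W + X)
        if np.linalg.norm(Z_next - Z) < tol:
            break
        Z = Z_next
    return Z
```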
arXiv Detail & Related papers (2022-10-15T18:18:55Z) - Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
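For concreteness, the two constructions from point coordinates look roughly as follows (standard definitions; the helper names are ours):

```python
# Standard KNN and fully-connected graph construction from coordinates.
import numpy as np

def knn_graph(X, k):
    """Symmetrized k-nearest-neighbor adjacency from point coordinates."""
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]      # k closest points per node
    A = np.zeros((len(X), len(X)))
    A[np.repeat(np.arange(len(X)), k), nn.ravel()] = 1.0
    return np.maximum(A, A.T)              # make the relation mutual

def fc_graph(n):
    """Fully-connected adjacency on n nodes, without self-loops."""
    return np.ones((n, n)) - np.eye(n)
```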
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Hierarchical graph neural nets can capture long-range interactions [8.067880298298185]
We study hierarchical message passing models that leverage a multi-resolution representation of a given graph.
This facilitates learning of features that span large receptive fields without loss of local information.
We introduce Hierarchical Graph Net (HGNet), which for any two connected nodes guarantees existence of message-passing paths of at most logarithmic length.
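A toy sketch of the hierarchical intuition follows (one fine level plus one coarse level; HGNet's actual construction builds a deeper hierarchy that yields the logarithmic-path guarantee, which this two-level simplification does not):

```python
# Two-level hierarchical message passing; a simplification of the HGNet idea.
import numpy as np

def coarsen(A, assign):
    """Coarse adjacency A_c = (S.T @ A @ S > 0) for a hard cluster assignment."""
    S = np.eye(assign.max() + 1)[assign]   # (n_nodes, n_clusters) one-hot
    return (S.T @ A @ S > 0).astype(float), S

def hierarchical_propagate(A, X, assign):
    """One round of aggregation on the fine graph plus pool -> coarse
    aggregation -> unpool, so distant nodes communicate via the coarse level."""
    A_c, S = coarsen(A, assign)
    return A @ X + S @ (A_c @ (S.T @ X))

# On a 6-node path, node 5 receives information from node 0 in one round
# through the coarse level, rather than after 5 fine-level hops.
n = 6
A = np.zeros((n, n)); i = np.arange(n - 1)
A[i, i + 1] = A[i + 1, i] = 1.0
out = hierarchical_propagate(A, np.eye(n), np.array([0, 0, 0, 1, 1, 1]))
print(out[5, 0] > 0)                       # True
```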
arXiv Detail & Related papers (2021-07-15T16:24:22Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
However, deeply stacking these operations tends to degrade performance; several recent studies attribute this deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
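A condensed sketch of the decoupled propagation scheme (fixed scalar hop weights stand in for DAGNN's learned, per-node adaptive scores):

```python
# Multi-hop propagation with per-hop weights; DAGNN learns these adaptively.
import numpy as np

def multi_hop_combine(A_hat, H0, hop_weights):
    """Combine H_k = A_hat^k @ H0 for k = 0..K with scalar weights.

    Adaptive weights let each node balance small vs. large receptive
    fields, which mitigates over-smoothing in deep propagation.
    """
    H, out = H0, hop_weights[0] * H0
    for w in hop_weights[1:]:
        H = A_hat @ H                  # one more propagation hop
        out = out + w * H
    return out
```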
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Cross-GCN: Enhancing Graph Convolutional Network with $k$-Order Feature Interactions [153.6357310444093]
Graph Convolutional Network (GCN) is an emerging technique that performs learning and reasoning on graph data.
We argue that existing designs of GCN forgo modeling cross features, making GCN less effective for tasks or data where cross features are important.
We design a new operator named Cross-feature Graph Convolution, which explicitly models the arbitrary-order cross features with complexity linear to feature dimension and order size.
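As a rough illustration of how order-k feature interactions can be kept linear in the feature dimension, here is a classic DCN-style cross layer (used only as an analogy; Cross-GCN's operator additionally aggregates over graph neighborhoods):

```python
# DCN-style cross layer, shown as an analogy for linear-cost cross features.
import numpy as np

def cross_layer(x0, c, w, b):
    """One cross layer: c' = x0 * (c @ w) + b + c.

    (c @ w) is a scalar per node, so multiplying it back into x0 raises
    the interaction order by one at O(dim) cost per layer, avoiding the
    combinatorial blow-up of enumerating explicit cross terms.
    """
    return x0 * (c @ w)[..., None] + b + c

# Stacking L layers yields feature interactions up to order L + 1.
rng = np.random.default_rng(0)
x0 = c = rng.standard_normal((5, 8))
for _ in range(3):
    c = cross_layer(x0, c, rng.standard_normal(8), np.zeros(8))
```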
arXiv Detail & Related papers (2020-03-05T13:05:27Z)