Search to Capture Long-range Dependency with Stacking GNNs for Graph Classification
- URL: http://arxiv.org/abs/2302.08671v1
- Date: Fri, 17 Feb 2023 03:40:17 GMT
- Title: Search to Capture Long-range Dependency with Stacking GNNs for Graph Classification
- Authors: Lanning Wei, Zhiqiang He, Huan Zhao, Quanming Yao
- Abstract summary: Shallow GNNs are more common due to the well-known over-smoothing problem facing deeper GNNs. We propose a novel approach with the help of neural architecture search (NAS), dubbed LRGNN (Long-Range Graph Neural Networks).
- Score: 41.84399177525008
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, Graph Neural Networks (GNNs) have become popular for the graph classification task. Shallow GNNs remain the norm because deeper GNNs suffer from the well-known over-smoothing problem. However, shallow models are sub-optimal: they fail to exploit information from distant nodes, i.e., long-range dependencies. Mainstream graph classification methods extract long-range dependencies either by designing pooling operations or by incorporating higher-order neighbors, but both approaches modify the original graph structure and may therefore lose structural information. In this paper, after showing that over-smoothing has a smaller influence on the graph classification task, we revisit stacking-based GNNs and employ them to capture long-range dependencies without modifying the original graph structure. Doing so imposes two design needs on stacking-based GNNs: sufficient model depth and adaptive skip-connection schemes. By casting both needs as the design of data-specific inter-layer connections, we propose a neural architecture search (NAS) approach dubbed LRGNN (Long-Range Graph Neural Networks). Extensive experiments on five datasets show that LRGNN achieves the best performance and yields data-specific GNNs with different depths and skip-connection schemes that better capture long-range dependencies.
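The abstract gives no implementation details, but the core idea, a deep GNN stack whose inter-layer skip-connections are chosen per dataset, can be illustrated with a DARTS-style continuous relaxation over inter-layer connections. The sketch below is a minimal, assumption-laden illustration in PyTorch, not the authors' code: the `MeanConv` operator, the softmax relaxation, and the mean readout are all placeholders.

```python
# Illustrative sketch only: a deep GNN stack whose inter-layer connections
# are relaxed into learnable weights, loosely mirroring LRGNN's search
# over depth and skip-connection schemes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeanConv(nn.Module):
    """A simple mean-aggregation GNN layer (stand-in for any GNN operator)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # adj: dense row-normalized adjacency (n, n); x: (n, dim)
        return F.relu(self.lin(adj @ x))

class SearchableStack(nn.Module):
    """Layer l consumes a learnable convex combination of ALL earlier
    outputs, so the skip-connection scheme is selected by the data."""
    def __init__(self, dim, depth):
        super().__init__()
        self.layers = nn.ModuleList([MeanConv(dim) for _ in range(depth)])
        # alpha[l][j] scores the connection "output of layer j -> input of layer l"
        self.alpha = nn.ParameterList(
            [nn.Parameter(torch.zeros(l + 1)) for l in range(depth)])

    def forward(self, x, adj):
        states = [x]
        for l, layer in enumerate(self.layers):
            w = torch.softmax(self.alpha[l], dim=0)  # relaxed architecture choice
            mixed = sum(wi * s for wi, s in zip(w, states))
            states.append(layer(mixed, adj))
        return states[-1].mean(dim=0)  # mean readout for graph classification

n, dim = 6, 16
adj = torch.rand(n, n)
adj = adj / adj.sum(dim=1, keepdim=True)  # row-normalize a toy adjacency
model = SearchableStack(dim, depth=8)
print(model(torch.randn(n, dim), adj).shape)  # torch.Size([16])
```

After search, each softmax could be discretized to keep only the strongest incoming connection per layer, recovering a concrete depth and skip-connection scheme.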
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
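The summary names the ingredients (spectral embeddings, greedy selection of ego-graph centers) without the objective. A plausible illustration is greedy farthest-point selection in spectral-embedding space; the criterion below is an assumption for illustration, not SGGC's actual objective.

```python
# Illustrative sketch: pick ego-graph centers greedily so that their
# spectral embeddings cover the graph.
import numpy as np

def spectral_embedding(adj, k):
    """First k nontrivial eigenvectors of the graph Laplacian."""
    deg = adj.sum(axis=1)
    lap = np.diag(deg) - adj
    vals, vecs = np.linalg.eigh(lap)
    return vecs[:, 1:k + 1]  # skip the constant eigenvector

def greedy_coreset(emb, m):
    chosen = [0]
    dist = np.linalg.norm(emb - emb[0], axis=1)
    for _ in range(m - 1):
        nxt = int(dist.argmax())  # farthest node from the current set
        chosen.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(emb - emb[nxt], axis=1))
    return chosen  # ego-graph centers for the coreset

rng = np.random.default_rng(0)
adj = (rng.random((20, 20)) < 0.2).astype(float)
adj = np.maximum(adj, adj.T)  # symmetrize the toy graph
print(greedy_coreset(spectral_embedding(adj, 4), m=5))
```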
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- Multi-View Subgraph Neural Networks: Self-Supervised Learning with Scarce Labeled Data [24.628203785306233]
We present a novel learning framework called multi-view subgraph neural networks (Muse) for handling long-range dependencies.
By fusing two views of subgraphs, the learned representations can preserve the topological properties of the graph at large.
Experimental results show that Muse outperforms the alternative methods on node classification tasks with limited labeled data.
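The summary does not define the two views, so the sketch below only illustrates the "two views, one fusion" shape: encode a subgraph once from its internal structure and once from its connections to the rest of the graph, then fuse. The encoders, the view definitions, and the concatenation fusion are all assumptions.

```python
# Illustrative sketch of fusing two views of a subgraph.
import torch
import torch.nn as nn

class ViewEncoder(nn.Module):
    """One message-passing step plus mean readout over the subgraph."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        return torch.relu(self.lin(adj @ x)).mean(dim=0)

class TwoViewFusion(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.view_a = ViewEncoder(dim)  # view 1: subgraph-internal edges
        self.view_b = ViewEncoder(dim)  # view 2: edges toward the rest of the graph
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, x, adj_internal, adj_external):
        za = self.view_a(x, adj_internal)
        zb = self.view_b(x, adj_external)
        return self.fuse(torch.cat([za, zb]))  # fused subgraph representation

n, dim = 5, 8
x = torch.randn(n, dim)
a1, a2 = torch.eye(n), torch.ones(n, n) / n  # toy view adjacencies
print(TwoViewFusion(dim)(x, a1, a2).shape)   # torch.Size([8])
```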
arXiv Detail & Related papers (2024-04-19T01:36:50Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify and justify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
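A minimal sketch of the two ingredients the summary names: an implicit (fixed-point) layer, evaluated at several propagation scales. The naive iteration solver, the tanh map, and the averaging of scales are simplifications, not MGNNI's actual solver or combination.

```python
# Illustrative sketch: the representation is a fixed point
# Z = tanh(gamma * A_s @ Z @ W + X), iterated to approximate convergence
# at several adjacency powers A, A^2, A^4, then averaged.
import torch

def implicit_layer(adj, x, w, gamma=0.5, iters=50):
    z = torch.zeros_like(x)
    for _ in range(iters):  # naive fixed-point iteration
        z = torch.tanh(gamma * adj @ z @ w + x)
    return z

def multiscale_implicit(adj, x, w, scales=(1, 2, 4)):
    outs, a, power = [], adj, 1
    for s in scales:
        while power < s:    # raise the adjacency to the s-th power
            a = a @ adj
            power += 1
        outs.append(implicit_layer(a, x, w))
    return torch.stack(outs).mean(dim=0)  # combine resolutions

n, d = 8, 4
adj = torch.rand(n, n); adj = adj / adj.sum(1, keepdim=True)
x = torch.randn(n, d); w = 0.2 * torch.randn(d, d)  # small W aids convergence
print(multiscale_implicit(adj, x, w).shape)  # torch.Size([8, 4])
```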
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
- Geodesic Graph Neural Network for Efficient Graph Representation Learning [34.047527874184134]
We propose an efficient GNN framework called Geodesic GNN (GDGNN).
It injects conditional relationships between nodes into the model without labeling.
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
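One way to read "conditioned on the geodesic representations" for a node pair: pool the embeddings of the nodes lying on a shortest (geodesic) path between them. The mean pooling and the concatenation with endpoint embeddings below are assumptions for illustration.

```python
# Illustrative sketch: a link representation enriched with the
# embeddings of nodes along the geodesic between the endpoints.
import networkx as nx
import numpy as np

def geodesic_link_repr(graph, emb, u, v):
    path = nx.shortest_path(graph, u, v)       # geodesic between u and v
    geo = emb[path].mean(axis=0)               # pool path-node embeddings
    return np.concatenate([emb[u], emb[v], geo])

g = nx.path_graph(6)
emb = np.random.default_rng(0).normal(size=(6, 4))
print(geodesic_link_repr(g, emb, 0, 5).shape)  # (12,)
```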
arXiv Detail & Related papers (2022-10-06T02:02:35Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) to efficiently search GNNs.
Its relation-aware operations extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by the limited ability of finite-depth GNNs to capture long-range dependencies, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
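An "infinite depth" linear GNN layer is the limit of repeatedly applying Z = gamma * W Z S + X, and because the map is linear the limit can be solved in closed form via the vectorization identity vec(WZS) = (S^T ⊗ W) vec(Z) rather than by stacking layers. The sketch below demonstrates that identity; it is a simplification, not EIGNN's actual parameterization or solver.

```python
# Illustrative sketch: solve the infinite-depth fixed point exactly.
import numpy as np

def infinite_depth_layer(s, x, w, gamma=0.5):
    # x, z: (d, n) feature matrices (one column per node); s: (n, n) adjacency
    d, n = x.shape
    m = np.eye(d * n) - gamma * np.kron(s.T, w)   # vec(WZS) = (S^T kron W) vec(Z)
    z = np.linalg.solve(m, x.flatten(order="F"))  # column-major vectorization
    return z.reshape((d, n), order="F")

rng = np.random.default_rng(0)
s = rng.random((5, 5)); s /= s.sum(axis=1, keepdims=True)   # row-normalized
w = rng.normal(size=(3, 3)); w /= 4 * np.linalg.norm(w, 2)  # keep a contraction
x = rng.normal(size=(3, 5))
z = infinite_depth_layer(s, x, w)
print(np.allclose(z, 0.5 * w @ z @ s + x))  # True: Z is the fixed point
```

Keeping gamma * ||W|| * ||S|| below 1 makes the fixed point well-posed, which is why W is rescaled above.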
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- Representing Long-Range Context for Graph Neural Networks with Global Attention [37.212747564546156]
We propose the use of Transformer-based self-attention to learn long-range pairwise relationships.
Our method, which we call GraphTrans, applies a permutation-invariant Transformer module after a standard GNN module.
Our results suggest that purely-learning-based approaches without graph structure may be suitable for learning high-level, long-range relationships on graphs.
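The recipe is concrete enough to sketch: run a standard GNN for local structure, then let a permutation-invariant Transformer (no positional encoding) attend over all node embeddings, reading the graph embedding off a learnable CLS-style token. The single linear "GNN" layer, dimensions, and layer counts below are placeholders, not the paper's configuration.

```python
# Illustrative sketch of the GNN-then-Transformer composition.
import torch
import torch.nn as nn

class GraphTransLike(nn.Module):
    def __init__(self, dim, heads=4, tf_layers=2):
        super().__init__()
        self.gnn_lin = nn.Linear(dim, dim)  # stand-in for a GNN module
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.tf = nn.TransformerEncoder(layer, num_layers=tf_layers)
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))  # learnable readout token

    def forward(self, x, adj):
        h = torch.relu(self.gnn_lin(adj @ x))         # local message passing
        h = torch.cat([self.cls, h.unsqueeze(0)], 1)  # prepend the CLS token
        return self.tf(h)[0, 0]                       # CLS output = graph embedding

n, dim = 7, 32
adj = torch.rand(n, n); adj = adj / adj.sum(1, keepdim=True)
print(GraphTransLike(dim)(torch.randn(n, dim), adj).shape)  # torch.Size([32])
```

Omitting positional encodings is what keeps the Transformer stage permutation-invariant over nodes.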
arXiv Detail & Related papers (2022-01-21T18:16:21Z)
- Hierarchical graph neural nets can capture long-range interactions [8.067880298298185]
We study hierarchical message passing models that leverage a multi-resolution representation of a given graph.
This facilitates learning of features that span large receptive fields without loss of local information.
We introduce Hierarchical Graph Net (HGNet), which guarantees the existence of message-passing paths of at most logarithmic length between any two connected nodes.
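A two-level toy version of hierarchical message passing: pool nodes into coarse "super-nodes", exchange messages at the coarse level, and broadcast back, so distant nodes communicate through a short detour via the hierarchy. The fixed hard assignment below is a placeholder for HGNet's actual multi-resolution coarsening.

```python
# Illustrative sketch: one round of two-level hierarchical message passing.
import torch

def hierarchical_pass(x, adj_fine, assign, adj_coarse):
    # assign: (n, k) hard assignment of n nodes to k super-nodes
    local = adj_fine @ x         # level-0 message passing
    pooled = assign.T @ x        # lift node features to super-nodes
    coarse = adj_coarse @ pooled # level-1 message passing
    down = assign @ coarse       # broadcast back down to nodes
    return torch.relu(local + down)

n, k, d = 8, 2, 4
adj_fine = torch.eye(n)          # toy: no fine-level edges at all
assign = torch.zeros(n, k); assign[:4, 0] = 1; assign[4:, 1] = 1
adj_coarse = torch.ones(k, k) / k
out = hierarchical_pass(torch.randn(n, d), adj_fine, assign, adj_coarse)
print(out.shape)  # torch.Size([8, 4]); the two halves talk via the coarse level
```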
arXiv Detail & Related papers (2021-07-15T16:24:22Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
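The "graph as feature smoother" view extends to heterogeneous graphs by precomputing neighbor averages separately per relation type and classifying with a plain MLP, in the SIGN/NARS style. The sketch below assumes that recipe; the paper's exact procedure is not in the summary.

```python
# Illustrative sketch: per-relation feature smoothing, then an MLP.
import torch
import torch.nn as nn

def smooth_features(x, rel_adjs, hops=2):
    feats = [x]
    for a in rel_adjs:              # one row-normalized adjacency per relation
        h = x
        for _ in range(hops):
            h = a @ h               # repeated neighbor averaging, no learning
            feats.append(h)
    return torch.cat(feats, dim=1)  # (n, d * (1 + hops * num_relations))

n, d = 10, 8
rel_adjs = []
for _ in range(3):                  # three relation types
    a = torch.rand(n, n); rel_adjs.append(a / a.sum(1, keepdim=True))
x = torch.randn(n, d)
feats = smooth_features(x, rel_adjs)
mlp = nn.Sequential(nn.Linear(feats.shape[1], 16), nn.ReLU(), nn.Linear(16, 2))
print(mlp(feats).shape)  # torch.Size([10, 2]): class logits, no message passing
```

Because the smoothing is precomputed once, training touches only the MLP, which is what makes this family scale.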
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Gated Graph Recurrent Neural Networks [176.3960927323358]
We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework for graph processes.
To address the problem of vanishing gradients, we put forward GRNNs with three different gating mechanisms: time, node and edge gates.
The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
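A minimal sketch of one of the three gates the paper studies, a time gate deciding how much history each node keeps, with graph-filtered input and state updates. The single-gate cell and its exact form are assumptions; node and edge gates are omitted.

```python
# Illustrative sketch: a gated recurrent cell over a graph process.
import torch
import torch.nn as nn

class GatedGRNNCell(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.wx = nn.Linear(dim, dim)
        self.wh = nn.Linear(dim, dim)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, x_t, h, adj):
        cand = torch.tanh(adj @ self.wx(x_t) + adj @ self.wh(h))  # graph-filtered update
        g = torch.sigmoid(self.gate(torch.cat([x_t, h], dim=1)))  # time gate
        return g * h + (1 - g) * cand  # gated mix of memory and new signal

n, d, steps = 5, 8, 4
adj = torch.rand(n, n); adj = adj / adj.sum(1, keepdim=True)
cell, h = GatedGRNNCell(d), torch.zeros(n, d)
for t in range(steps):              # run the recurrence over a graph process
    h = cell(torch.randn(n, d), h, adj)
print(h.shape)  # torch.Size([5, 8])
```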
arXiv Detail & Related papers (2020-02-03T22:35:14Z)