From Node Interaction to Hop Interaction: New Effective and Scalable Graph Learning Paradigm
- URL: http://arxiv.org/abs/2211.11761v3
- Date: Thu, 13 Apr 2023 08:07:32 GMT
- Title: From Node Interaction to Hop Interaction: New Effective and Scalable Graph Learning Paradigm
- Authors: Jie Chen, Zilong Li, Yin Zhu, Junping Zhang, Jian Pu
- Abstract summary: We propose a novel hop interaction paradigm to address the scalability and over-smoothing limitations of node interaction simultaneously.
The core idea is to convert the interaction target among nodes to pre-processed multi-hop features inside each node.
We conduct extensive experiments on 12 benchmark datasets in a wide range of domains, scales, and smoothness of graphs.
- Score: 25.959580336262004
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing Graph Neural Networks (GNNs) follow the message-passing mechanism
that conducts information interaction among nodes iteratively. While
considerable progress has been made, such node interaction paradigms still have
the following limitations. First, the scalability limitation precludes the broad
application of GNNs in large-scale industrial settings since the node
interaction among rapidly expanding neighbors incurs high computation and
memory costs. Second, the over-smoothing problem restricts the discrimination
ability of nodes, i.e., node representations of different classes become
indistinguishable after repeated rounds of node interaction. In this work, we propose
a novel hop interaction paradigm to address these limitations simultaneously.
The core idea is to convert the interaction target among nodes to pre-processed
multi-hop features inside each node. We design a simple yet effective HopGNN
framework that can easily utilize existing GNNs to achieve hop interaction.
Furthermore, we propose a multi-task learning strategy with a self-supervised
learning objective to enhance HopGNN. We conduct extensive experiments on 12
benchmark datasets in a wide range of domains, scales, and smoothness of
graphs. Experimental results show that our methods achieve superior performance
while maintaining high scalability and efficiency. The code is at
https://github.com/JC-202/HopGNN.
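The hop-interaction idea in the abstract, precompute multi-hop features offline and then let a lightweight module interact over the hop dimension inside each node, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the symmetric normalization, the attention-style interaction module, and all function names here are assumptions for the sketch.

```python
import numpy as np

def normalize_adj(A):
    """GCN-style symmetric normalization D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def precompute_hop_features(A, X, K):
    """Offline step: stack [X, A_norm X, ..., A_norm^K X] per node.

    Returns shape (N, K+1, F). This is the only graph-dependent
    computation; everything afterwards runs per node and never
    touches the adjacency matrix, which is where the scalability
    of the paradigm comes from."""
    A_norm = normalize_adj(A)
    hops = [X]
    for _ in range(K):
        hops.append(A_norm @ hops[-1])
    return np.stack(hops, axis=1)

def hop_interaction(hop_feats, W_q, W_k):
    """Toy hop-interaction module: self-attention over the hop axis,
    entirely inside each node (no neighbor access at this stage)."""
    Q = hop_feats @ W_q                                  # (N, K+1, d)
    Kh = hop_feats @ W_k
    scores = Q @ Kh.transpose(0, 2, 1) / np.sqrt(Q.shape[-1])
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)             # softmax over hops
    mixed = attn @ hop_feats                             # (N, K+1, F)
    return mixed.mean(axis=1)                            # fuse hops per node
```

After precomputation, mini-batching is trivial: each node carries its own (K+1, F) hop tensor, so batches can be sampled without any neighbor expansion.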
Related papers
- Graph as a feature: improving node classification with non-neural graph-aware logistic regression [2.952177779219163]
Graph-aware Logistic Regression (GLR) is a non-neural model designed for node classification tasks.
Unlike traditional graph algorithms that use only a fraction of the information accessible to GNNs, our proposed model simultaneously leverages both node features and the relationships between entities.
arXiv Detail & Related papers (2024-11-19T08:32:14Z)
- Towards Dynamic Message Passing on Graphs [104.06474765596687]
We propose a novel dynamic message-passing mechanism for graph neural networks (GNNs)
It projects graph nodes and learnable pseudo nodes into a common space with measurable spatial relations between them.
With nodes moving in the space, their evolving relations facilitate flexible pathway construction for a dynamic message-passing process.
arXiv Detail & Related papers (2024-10-31T07:20:40Z)
- The Snowflake Hypothesis: Training Deep GNN with One Node One Receptive Field [39.679151680622375]
We introduce the Snowflake Hypothesis, a novel paradigm underpinning the concept of "one node, one receptive field".
We employ the simplest gradient and node-level cosine distance as guiding principles to regulate the aggregation depth for each node.
The observational results demonstrate that our hypothesis can serve as a universal operator for a range of tasks.
arXiv Detail & Related papers (2023-08-19T15:21:12Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Node2Seq: Towards Trainable Convolutions in Graph Neural Networks [59.378148590027735]
We propose a graph network layer, known as Node2Seq, to learn node embeddings with explicitly trainable weights for different neighboring nodes.
For a target node, our method sorts its neighboring nodes via an attention mechanism and then employs 1D convolutional neural networks (CNNs) to enable explicit weights for information aggregation.
In addition, we propose to incorporate non-local information for feature learning in an adaptive manner based on the attention scores.
arXiv Detail & Related papers (2021-01-06T03:05:37Z)
- Adaptive Graph Diffusion Networks with Hop-wise Attention [1.2183405753834562]
We propose Adaptive Graph Diffusion Networks with Hop-wise Attention (AGDNs-HA) to incorporate deeper information.
We show that our proposed methods achieve significant improvements on the standard dataset for the semi-supervised node classification task.
arXiv Detail & Related papers (2020-12-30T03:43:04Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues that limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
- SkipGNN: Predicting Molecular Interactions with Skip-Graph Networks [70.64925872964416]
We present SkipGNN, a graph neural network approach for the prediction of molecular interactions.
SkipGNN predicts molecular interactions by not only aggregating information from direct interactions but also from second-order interactions.
We show that SkipGNN achieves superior and robust performance, outperforming existing methods by up to 28.8% of area under the precision-recall curve.
arXiv Detail & Related papers (2020-04-30T16:55:58Z)
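The second-order interactions that SkipGNN aggregates over can be illustrated by constructing a skip graph that connects nodes sharing at least one neighbor. The sketch below is a minimal NumPy illustration of that construction only; the actual SkipGNN architecture, which runs a GNN over both the original and the skip graph and fuses the results, is omitted, and the function name is an assumption.

```python
import numpy as np

def skip_graph(A):
    """Build the second-order (skip) adjacency from a binary adjacency A:
    two nodes are skip-connected iff they are reachable in exactly two
    hops, i.e., they share a common neighbor in the original graph."""
    A2 = (A @ A) > 0                  # reachable in two hops (includes self-loops)
    A_skip = A2 & ~(A > 0)            # drop pairs that are already direct neighbors
    np.fill_diagonal(A_skip, False)   # drop self-loops
    return A_skip.astype(int)
```

On a path graph 0-1-2-3, for example, the skip graph contains exactly the edges (0,2) and (1,3): each pair shares a common neighbor but has no direct edge.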
This list is automatically generated from the titles and abstracts of the papers in this site.