Labeling Trick: A Theory of Using Graph Neural Networks for Multi-Node
Representation Learning
- URL: http://arxiv.org/abs/2010.16103v5
- Date: Sat, 15 Jan 2022 15:23:39 GMT
- Title: Labeling Trick: A Theory of Using Graph Neural Networks for Multi-Node
Representation Learning
- Authors: Muhan Zhang, Pan Li, Yinglong Xia, Kai Wang, Long Jin
- Abstract summary: We provide a theory of using graph neural networks (GNNs) for multi-node representation learning.
A common practice in previous works is to directly aggregate the single-node representations obtained by a GNN into a joint node set representation.
We unify these node labeling techniques into a single, most general form -- the labeling trick.
- Score: 26.94699471990803
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we provide a theory of using graph neural networks (GNNs) for
multi-node representation learning (where we are interested in learning a
representation for a set of more than one node, such as a link). GNNs are
designed to learn single-node representations. When we want to learn a node
set representation involving multiple nodes, a common practice in previous
works is to directly aggregate the single-node representations obtained by a
GNN into a joint node set representation. In this paper, we show a fundamental
constraint of such an approach, namely the inability to capture the dependence
between nodes in the node set, and argue that directly aggregating individual
node representations does not lead to an effective joint representation for
multiple nodes. Then, we notice that a few previous successful works for
multi-node representation learning, including SEAL, Distance Encoding, and
ID-GNN, all used node labeling. These methods first label nodes in the graph
according to their relationships with the target node set before applying a
GNN. Then, the node representations obtained in the labeled graph are
aggregated into a node set representation. By investigating their inner
mechanisms, we unify these node labeling techniques into a single and most
general form -- the labeling trick. We prove that, with the labeling trick, a
sufficiently expressive GNN learns the most expressive node set
representations, thus in principle solving any joint learning task over node
sets. Experiments on one important two-node representation learning task, link
prediction, verify our theory. Our work explains the superior performance of
previous node-labeling-based methods, and establishes a theoretical foundation
of using GNNs for multi-node representation learning.
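The mechanism the abstract describes can be illustrated with a minimal sketch: label the target nodes before message passing, then aggregate their representations. The one-layer sum-aggregation "GNN" below, the toy graph, and all function names are illustrative assumptions for this sketch, not the paper's actual implementation (methods such as SEAL use learned GNNs).

```python
# A minimal sketch of the zero-one labeling trick for link prediction.
# The toy "GNN" is a single sum-aggregation step; names are illustrative.
import numpy as np

def zero_one_labels(num_nodes, target_set):
    """Label a node 1 if it belongs to the target node set, else 0."""
    labels = np.zeros((num_nodes, 1))
    labels[list(target_set)] = 1.0
    return labels

def link_representation(adj, feats, u, v, use_labeling_trick):
    """One sum-aggregation GNN layer, then joint aggregation h[u] + h[v]."""
    x = feats
    if use_labeling_trick:
        x = np.hstack([feats, zero_one_labels(len(adj), {u, v})])
    h = adj @ x                # message passing: sum over neighbors
    return h[u] + h[v]         # set-level (link) aggregation

# A 4-cycle 0-1-2-3-0 where every node has the same raw feature,
# so all single-node representations are identical by symmetry.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
feats = np.ones((4, 1))

# Directly aggregating single-node representations cannot tell the
# adjacent pair (0, 1) from the non-adjacent pair (0, 2).
plain_01 = link_representation(adj, feats, 0, 1, use_labeling_trick=False)
plain_02 = link_representation(adj, feats, 0, 2, use_labeling_trick=False)
print(np.array_equal(plain_01, plain_02))   # True

# With the labeling trick the two pairs become distinguishable,
# because the labels let the GNN see each node's relation to the target set.
lab_01 = link_representation(adj, feats, 0, 1, use_labeling_trick=True)
lab_02 = link_representation(adj, feats, 0, 2, use_labeling_trick=True)
print(np.array_equal(lab_01, lab_02))       # False
```

The failure case in the first comparison is exactly the dependence-between-nodes constraint the paper identifies: the unlabeled GNN assigns every node the same embedding, so any aggregation of them is the same for both pairs.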
Related papers
- Contrastive Meta-Learning for Few-shot Node Classification [54.36506013228169]
Few-shot node classification aims to predict labels for nodes on graphs with only limited labeled nodes as references.
We create a novel contrastive meta-learning framework on graphs, named COSMIC, with two key designs.
arXiv Detail & Related papers (2023-06-27T02:22:45Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Improving Graph Neural Networks on Multi-node Tasks with Labeling Tricks [14.41064333206723]
We propose the labeling trick, which first labels nodes in the graph according to their relationships with the target node set before applying a GNN.
Our work explains the superior performance of previous node-labeling-based methods and establishes a theoretical foundation for using GNNs for multi-node representation learning.
arXiv Detail & Related papers (2023-04-20T04:03:40Z) - GraFN: Semi-Supervised Node Classification on Graph with Few Labels via
Non-Parametric Distribution Assignment [5.879936787990759]
We propose a novel semi-supervised method for graphs, GraFN, to ensure that nodes belonging to the same class are grouped together.
GraFN randomly samples support nodes from labeled nodes and anchor nodes from the entire graph.
We experimentally show that GraFN surpasses both the semi-supervised and self-supervised methods in terms of node classification on real-world graphs.
arXiv Detail & Related papers (2022-04-04T08:22:30Z) - Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model is able to capture a node's structural role information, and show excellent performance at node and graph classification tasks on unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z) - Position-based Hash Embeddings For Scaling Graph Neural Networks [8.87527266373087]
Graph Neural Networks (GNNs) compute node representations by taking into account the topology of the node's ego-network and the features of the ego-network's nodes.
When the nodes do not have high-quality features, GNNs learn an embedding layer to compute node embeddings and use them as input features.
To reduce the memory associated with this embedding layer, hashing-based approaches, commonly used in applications like NLP and recommender systems, can potentially be used.
We present approaches that take advantage of the nodes' position in the graph to dramatically reduce the memory required.
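The memory problem this summary describes can be sketched with a generic hashing-based embedding table (not the paper's position-based scheme, which additionally exploits each node's position in the graph): instead of one embedding row per node, node ids are hashed into a small shared table, so memory no longer grows linearly with the number of nodes. All sizes and names below are illustrative assumptions.

```python
# Generic sketch of a hashing-based embedding table for GNN input features.
# Not the paper's position-based scheme; sizes and names are illustrative.
import numpy as np

NUM_NODES = 1_000_000   # nodes in the (hypothetical) graph
TABLE_SIZE = 1_000      # shared rows: ~1000x less memory than one row per node
DIM = 16                # embedding dimension

rng = np.random.default_rng(0)
table = rng.normal(size=(TABLE_SIZE, DIM))  # the only learned parameters

def node_embedding(node_id, num_hashes=2):
    """Sum of `num_hashes` hashed rows; multiple hashes reduce collisions."""
    rows = [hash((node_id, k)) % TABLE_SIZE for k in range(num_hashes)]
    return table[rows].sum(axis=0)

emb = node_embedding(123456)
print(emb.shape)   # (16,)
```

Distinct nodes may collide on one hash, but summing several independently hashed rows makes full collisions rare, which is the standard trade-off hashing-based embeddings accept for the memory savings.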
arXiv Detail & Related papers (2021-08-31T22:42:25Z) - Multi-grained Semantics-aware Graph Neural Networks [13.720544777078642]
Graph Neural Networks (GNNs) are powerful techniques in representation learning for graphs.
This work proposes a unified model, AdamGNN, to interactively learn node and graph representations.
Experiments on 14 real-world graph datasets show that AdamGNN can significantly outperform 17 competing models on both node- and graph-wise tasks.
arXiv Detail & Related papers (2020-10-01T07:52:06Z) - CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, recently proposed by mimicking higher-order Weisfeiler-Lehman tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning methods.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Towards Deeper Graph Neural Networks with Differentiable Group
Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z) - Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.