Self-Supervised Node Representation Learning via Node-to-Neighbourhood Alignment
- URL: http://arxiv.org/abs/2302.04626v2
- Date: Fri, 10 Feb 2023 03:49:34 GMT
- Title: Self-Supervised Node Representation Learning via Node-to-Neighbourhood Alignment
- Authors: Wei Dong, Dawei Yan, and Peng Wang
- Abstract summary: Self-supervised node representation learning aims to learn node representations from unlabelled graphs that rival their supervised counterparts.
In this work, we present a simple yet effective self-supervised node representation learning method that aligns the hidden representations of nodes and their neighbourhoods.
We learn node representations that achieve promising node classification performance on a set of graph-structured datasets from small- to large-scale.
- Score: 10.879056662671802
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Self-supervised node representation learning aims to learn node
representations from unlabelled graphs that rival their supervised
counterparts. The key to learning informative node representations lies in
how to effectively gain contextual information from the graph structure. In
this work, we present a simple yet effective self-supervised node
representation learning method that aligns the hidden representations of
nodes and their neighbourhoods. Our first idea achieves such
node-to-neighbourhood alignment by directly maximizing the mutual information
between their representations, which, we prove theoretically, plays the role
of graph smoothing. Our framework is optimized via a surrogate contrastive
loss, and a Topology-Aware Positive Sampling (TAPS) strategy is proposed to
sample positives according to the structural dependencies between nodes,
enabling offline positive selection.
Considering the excessive memory overheads of contrastive learning, we further
propose a negative-free solution, where the main contribution is a Graph Signal
Decorrelation (GSD) constraint to avoid representation collapse and
over-smoothing. The GSD constraint unifies some of the existing constraints and
can be used to derive new implementations to combat representation collapse. By
applying our methods on top of simple MLP-based node representation encoders,
we learn node representations that achieve promising node classification
performance on a set of graph-structured datasets from small- to large-scale.
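The abstract above describes aligning each node's hidden representation with that of its neighbourhood via a surrogate contrastive loss on top of an MLP encoder. As a rough illustration only (not the paper's implementation), the sketch below shows an InfoNCE-style node-to-neighbourhood objective on a toy graph; the encoder, the mean-pooled neighbourhood summary, and the temperature are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: two triangles (6 nodes), plus random node features.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
X = rng.normal(size=(6, 8))

# Hypothetical single-layer MLP encoder (the paper uses MLP-based encoders).
W = rng.normal(scale=0.1, size=(8, 4))

def encode(X, W):
    return np.maximum(X @ W, 0.0)  # ReLU MLP

def l2_normalize(Z):
    return Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-8)

def node_to_neighbourhood_infonce(Z, A, tau=0.5):
    """Contrastive loss pulling each node towards its neighbourhood summary.

    The positive for node i is the degree-normalised mean of its neighbours'
    embeddings; all other neighbourhood summaries act as negatives.
    """
    Zn = l2_normalize(Z)
    deg = A.sum(axis=1, keepdims=True)
    N = l2_normalize((A @ Zn) / np.maximum(deg, 1.0))  # neighbourhood summaries
    logits = (Zn @ N.T) / tau                 # node-to-neighbourhood similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))        # positives sit on the diagonal

Z = encode(X, W)
loss = node_to_neighbourhood_infonce(Z, A)
```

In a full pipeline this loss would be minimized by gradient descent on the encoder weights; the paper's TAPS strategy would additionally pick positives offline from structural dependencies rather than using the plain adjacency mean shown here.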
Related papers
- Contrastive Graph Representation Learning with Adversarial Cross-view Reconstruction and Information Bottleneck [5.707725771108279]
We propose an effective Contrastive Graph Representation Learning with Adversarial Cross-view Reconstruction and Information Bottleneck (CGRL) for node classification.
Our method significantly outperforms existing state-of-the-art algorithms.
arXiv Detail & Related papers (2024-08-01T05:45:21Z)
- STERLING: Synergistic Representation Learning on Bipartite Graphs [78.86064828220613]
A fundamental challenge of bipartite graph representation learning is how to extract node embeddings.
Most recent bipartite graph SSL methods are based on contrastive learning which learns embeddings by discriminating positive and negative node pairs.
We introduce a novel synergistic representation learning model (STERLING) to learn node embeddings without negative node pairs.
arXiv Detail & Related papers (2023-01-25T03:21:42Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- Interpretable Node Representation with Attribute Decoding [20.591882093727413]
We show that attribute decoding is important for node representation learning.
We propose a new learning model, Interpretable NOde Representation with Attribute Decoding (NORAD).
arXiv Detail & Related papers (2022-12-03T20:20:24Z)
- Node Representation Learning in Graph via Node-to-Neighbourhood Mutual Information Maximization [27.701736055800314]
The key to learning informative node representations in graphs lies in how to gain contextual information from the neighbourhood.
We present a self-supervised node representation learning strategy via directly maximizing the mutual information between the hidden representations of nodes and their neighbourhood.
Our framework is optimized via a surrogate contrastive loss, where the positive selection underpins the quality and efficiency of representation learning.
arXiv Detail & Related papers (2022-03-23T08:21:10Z)
- Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model captures nodes' structural role information and shows excellent performance at node and graph classification tasks on unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
Most GNNs are designed only for homogeneous graphs, limiting their adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta-path-free message-passing scheme that combines heterogeneous node features with their associated edges from both low- and high-order neighbour nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN) that explore the implicit semantics by learning latent semantic paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Graph Contrastive Learning with Adaptive Augmentation [23.37786673825192]
We propose a novel graph contrastive representation learning method with adaptive augmentation.
Specifically, we design augmentation schemes based on node centrality measures to highlight important connective structures.
Our proposed method consistently outperforms existing state-of-the-art baselines and even surpasses some supervised counterparts.
arXiv Detail & Related papers (2020-10-27T15:12:21Z)
- Self-supervised Graph Learning for Recommendation [69.98671289138694]
We explore self-supervised graph learning (SGL) on the user-item graph for recommendation.
An auxiliary self-supervised task reinforces node representation learning via self-discrimination.
Empirical studies on three benchmark datasets demonstrate the effectiveness of SGL.
arXiv Detail & Related papers (2020-10-21T06:35:26Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.