Omni-Granular Ego-Semantic Propagation for Self-Supervised Graph
Representation Learning
- URL: http://arxiv.org/abs/2205.15746v1
- Date: Tue, 31 May 2022 12:31:33 GMT
- Title: Omni-Granular Ego-Semantic Propagation for Self-Supervised Graph
Representation Learning
- Authors: Ling Yang, Shenda Hong
- Abstract summary: Unsupervised/self-supervised graph representation learning is critical for downstream node- and graph-level classification tasks.
We introduce instance-adaptive global-aware ego-semantic descriptors.
The descriptors can be explicitly integrated into local graph convolution as new neighbor nodes.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised/self-supervised graph representation learning is critical for
downstream node- and graph-level classification tasks. The global structure of
graphs helps discriminate representations, and existing methods mainly exploit
this structure by imposing additional supervision. However, their global
semantics are usually invariant across all nodes/graphs, and they fail to
explicitly embed the global semantics to enrich the representations. In this
paper, we propose Omni-Granular Ego-Semantic Propagation for Self-Supervised
Graph Representation Learning (OEPG). Specifically, we introduce
instance-adaptive global-aware ego-semantic descriptors, leveraging the first-
and second-order feature differences between each node/graph and hierarchical
global clusters of the entire graph dataset. The descriptors can be explicitly
integrated into local graph convolution as new neighbor nodes. Besides, we
design an omni-granular normalization over all scales and hierarchies of
the ego-semantics to assign an attentional weight to each descriptor from an
omni-granular perspective. Specialized pretext tasks and cross-iteration
momentum update are further developed for local-global mutual adaptation. In
downstream tasks, OEPG consistently achieves the best performance with a 2%~6%
accuracy gain on multiple datasets across scales and domains. Notably, OEPG also
generalizes to quantity- and topology-imbalanced scenarios.
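The core idea of the abstract, ego-semantic descriptors built from first- and second-order feature differences against global cluster centroids and then injected into local graph convolution as extra neighbors, can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' exact formulation: the choice of element-wise squared differences for the second-order term and plain mean aggregation are assumptions made for clarity.

```python
import numpy as np

def ego_semantic_descriptors(x, centroids):
    # First-order differences between the node/graph embedding x (d,)
    # and K hierarchical global cluster centroids (K, d).
    first = centroids - x                 # (K, d)
    # Element-wise second-order differences (an illustrative choice).
    second = first ** 2                   # (K, d)
    return np.vstack([first, second])     # (2K, d) descriptors

def local_conv_with_descriptors(x, neighbor_feats, centroids):
    # Treat each descriptor as a "virtual neighbor" and mean-aggregate
    # it alongside the node's real neighbors in one convolution step.
    desc = ego_semantic_descriptors(x, centroids)
    messages = np.vstack([neighbor_feats, desc])
    return messages.mean(axis=0)
```

In the paper these descriptors are additionally reweighted by the omni-granular normalization before aggregation; the uniform mean above is the simplest stand-in.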
Related papers
- Federated Graph Semantic and Structural Learning [54.97668931176513]
This paper reveals that local client distortion arises from both node-level semantics and graph-level structure.
We postulate that a well-structural graph neural network possesses similarity for neighbors due to the inherent adjacency relationships.
We transform the adjacency relationships into the similarity distribution and leverage the global model to distill the relation knowledge into the local model.
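The distillation step described above, turning neighbor similarities into a distribution and pulling the local model toward the global one, can be sketched as a KL-divergence loss. This is a minimal sketch under assumptions (softmax over similarity logits, a temperature `tau`), not the paper's exact objective:

```python
import numpy as np

def softmax(z, tau=1.0):
    z = z / tau
    e = np.exp(z - z.max())
    return e / e.sum()

def relation_distill_loss(sim_local, sim_global, tau=0.5):
    # Teacher: the global model's similarity distribution over a node's
    # neighbors; student: the local model's. KL(teacher || student)
    # distills the global relation knowledge into the local model.
    p = softmax(sim_global, tau)   # teacher (global model)
    q = softmax(sim_local, tau)    # student (local model)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```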
arXiv Detail & Related papers (2024-06-27T07:08:28Z) - FedGT: Federated Node Classification with Scalable Graph Transformer [27.50698154862779]
We propose a scalable Federated Graph Transformer (FedGT) in the paper.
FedGT computes clients' similarity based on the aligned global nodes with optimal transport.
arXiv Detail & Related papers (2024-01-26T21:02:36Z) - Generative and Contrastive Paradigms Are Complementary for Graph
Self-Supervised Learning [56.45977379288308]
Masked autoencoder (MAE) learns to reconstruct masked graph edges or node features.
Contrastive Learning (CL) maximizes the similarity between augmented views of the same graph.
We propose graph contrastive masked autoencoder (GCMAE) framework to unify MAE and CL.
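One way such a unification can look: a masked-reconstruction term (the MAE branch) plus an InfoNCE term between two views (the CL branch), summed with a weighting factor. This is an illustrative sketch of the general recipe, not GCMAE's exact losses; `alpha` and the batch-level InfoNCE form are assumptions:

```python
import numpy as np

def mse_recon(x_true, x_pred, mask):
    # MAE branch: reconstruction error on masked node features only.
    diff = (x_true - x_pred)[mask]
    return float((diff ** 2).mean())

def info_nce(z1, z2, tau=0.2):
    # CL branch: each node's view-1 embedding should match its own
    # view-2 embedding against all other nodes in the batch.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diag(log_prob).mean())

def combined_loss(x_true, x_pred, mask, z1, z2, alpha=0.5):
    # Generative and contrastive terms trained jointly.
    return mse_recon(x_true, x_pred, mask) + alpha * info_nce(z1, z2)
```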
arXiv Detail & Related papers (2023-10-24T05:06:06Z) - GraphGLOW: Universal and Generalizable Structure Learning for Graph
Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z) - Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model by contrasting cluster assignments, called GRCCA.
It is motivated to exploit local and global information synergistically by combining clustering algorithms and contrastive learning.
GRCCA has strong competitiveness in most tasks.
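"Contrasting cluster assignments" is commonly realized as a swapped-prediction objective: predict one view's soft cluster assignment from the other view's embedding, and vice versa. The sketch below shows that generic pattern (in the style of SwAV); it is an assumption-laden illustration, not GRCCA's exact loss:

```python
import numpy as np

def soft_assign(z, protos, tau=0.1):
    # Soft cluster assignment: softmax over similarities to prototypes.
    sims = z @ protos.T / tau
    sims -= sims.max(axis=1, keepdims=True)
    e = np.exp(sims)
    return e / e.sum(axis=1, keepdims=True)

def swapped_assignment_loss(z1, z2, protos):
    # Contrast cluster assignments across two augmented views: view 1
    # predicts view 2's assignment, and vice versa.
    p1, p2 = soft_assign(z1, protos), soft_assign(z2, protos)
    ce = lambda target, pred: -(target * np.log(pred + 1e-12)).sum(axis=1).mean()
    return float(0.5 * (ce(p2, p1) + ce(p1, p2)))
```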
arXiv Detail & Related papers (2021-12-15T07:28:58Z) - Learnable Structural Semantic Readout for Graph Classification [23.78861906423389]
We propose structural semantic readout (SSRead) to summarize the node representations at the position-level.
SSRead aims to identify structurally meaningful positions by using the semantic alignment between a graph's nodes and structural prototypes.
Our experimental results demonstrate that SSRead significantly improves the classification performance and interpretability of GNN classifiers.
arXiv Detail & Related papers (2021-11-22T20:44:27Z) - Self-supervised Graph-level Representation Learning with Local and
Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
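An online EM update over prototypes typically alternates assigning embeddings to their nearest prototype (E-step) with a momentum-style move of each prototype toward its members' mean (M-step). The sketch below shows one such step for a single (flat) prototype level; GraphLoG's hierarchical, probabilistic version is more involved:

```python
import numpy as np

def online_em_step(emb, protos, lr=0.1):
    # E-step: hard-assign each embedding (N, d) to its nearest
    # prototype (K, d) by squared Euclidean distance.
    d2 = ((emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    assign = d2.argmin(axis=1)
    # M-step: move each prototype toward the mean of its assigned
    # members with step size lr (momentum-style online update).
    new = protos.copy()
    for k in range(protos.shape[0]):
        members = emb[assign == k]
        if len(members):
            new[k] = (1 - lr) * protos[k] + lr * members.mean(axis=0)
    return new, assign
```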
arXiv Detail & Related papers (2021-06-08T05:25:38Z) - Multi-Level Graph Convolutional Network with Automatic Graph Learning
for Hyperspectral Image Classification [63.56018768401328]
We propose a Multi-level Graph Convolutional Network (GCN) with Automatic Graph Learning method (MGCN-AGL) for HSI classification.
By employing an attention mechanism to characterize the importance of spatially neighboring regions, the most relevant information can be adaptively incorporated into decision making.
Our MGCN-AGL encodes the long range dependencies among image regions based on the expressive representations that have been produced at local level.
arXiv Detail & Related papers (2020-09-19T09:26:20Z) - Self-Supervised Graph Representation Learning via Global Context
Prediction [31.07584920486755]
This paper introduces a novel self-supervised strategy for graph representation learning by exploiting natural supervision provided by the data itself.
We randomly select pairs of nodes in a graph and train a well-designed neural net to predict the contextual position of one node relative to the other.
Our underlying hypothesis is that the representations learned from such within-graph context would capture the global topology of the graph and finely characterize the similarity and differentiation between nodes.
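A minimal version of this pretext task: sample a node pair, compute its relative position as a shortest-path hop count, and bucket that into a class label for the predictor. The bucket thresholds below are an illustrative assumption, not taken from the paper:

```python
from collections import deque

def hop_distance(adj, u, v):
    # BFS shortest-path hop count between u and v; adj is an
    # adjacency list {node: [neighbors]}. Returns -1 if disconnected.
    seen, q = {u}, deque([(u, 0)])
    while q:
        node, d = q.popleft()
        if node == v:
            return d
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                q.append((nb, d + 1))
    return -1

def context_label(adj, u, v, buckets=(1, 2, 4)):
    # Pretext label: bucket the hop distance so a classifier can
    # predict the contextual position of one node relative to another.
    d = hop_distance(adj, u, v)
    for i, b in enumerate(buckets):
        if d <= b:
            return i
    return len(buckets)
```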
arXiv Detail & Related papers (2020-03-03T15:46:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.