A Self-supervised Mixed-curvature Graph Neural Network
- URL: http://arxiv.org/abs/2112.05393v1
- Date: Fri, 10 Dec 2021 08:56:55 GMT
- Title: A Self-supervised Mixed-curvature Graph Neural Network
- Authors: Li Sun, Zhongbao Zhang, Junda Ye, Hao Peng, Jiawei Zhang, Sen Su,
Philip S. Yu
- Abstract summary: We present a novel Self-supervised Mixed-curvature Graph
Neural Network (SelfMGNN).
We show that SelfMGNN captures the complicated structures of real-world
graphs and outperforms state-of-the-art baselines.
- Score: 76.3790248465522
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graph representation learning has received increasing attention
in recent years. Most existing methods ignore the complexity of graph
structures and restrict graphs to a single constant-curvature representation
space, which is suitable only for particular kinds of graph structure.
Moreover, these methods follow the supervised or semi-supervised learning
paradigm, which notably limits their deployment on unlabeled graphs in real
applications. To address these limitations, we make the first attempt to
study self-supervised graph representation learning in mixed-curvature
spaces. In this paper, we present a novel Self-supervised Mixed-curvature
Graph Neural Network (SelfMGNN). Instead of working in a single
constant-curvature space, we construct a mixed-curvature space via the
Cartesian product of multiple Riemannian component spaces and design
hierarchical attention mechanisms for learning and fusing the representations
across these component spaces. To enable self-supervised learning, we
propose a novel dual contrastive approach. The mixed-curvature Riemannian
space naturally provides multiple Riemannian views for contrastive learning. We
introduce a Riemannian projector to reveal these views, and utilize a
well-designed Riemannian discriminator for the single-view and cross-view
contrastive learning within and across the Riemannian views. Finally,
extensive experiments show that SelfMGNN captures the complicated structures
of real-world graphs and outperforms state-of-the-art baselines.
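For intuition, here is a minimal sketch of the core geometric idea: node
embeddings live in a Cartesian product of constant-curvature components, the
product distance combines the component geodesics, and an attention module
fuses the component representations. The component distances and the exp-map
formulas are standard Riemannian geometry; the choice of exactly three
components (curvatures -1, 0, +1) and the `AttentionFusion` module are
illustrative assumptions, not the authors' implementation.
```python
import torch

def euclidean_dist(x, y):
    return torch.norm(x - y, dim=-1)

def poincare_dist(x, y, eps=1e-6):
    # Geodesic distance in the Poincare ball (constant curvature -1).
    sq = ((x - y) ** 2).sum(-1)
    nx = torch.clamp(1 - (x ** 2).sum(-1), min=eps)
    ny = torch.clamp(1 - (y ** 2).sum(-1), min=eps)
    return torch.acosh(1 + 2 * sq / (nx * ny))

def spherical_dist(x, y, eps=1e-6):
    # Geodesic distance on the unit sphere (constant curvature +1).
    x = torch.nn.functional.normalize(x, dim=-1)
    y = torch.nn.functional.normalize(y, dim=-1)
    return torch.acos(torch.clamp((x * y).sum(-1), -1 + eps, 1 - eps))

def product_dist(xs, ys, metrics):
    # Cartesian-product rule: squared component distances add up.
    parts = [m(x, y) ** 2 for m, x, y in zip(metrics, xs, ys)]
    return torch.stack(parts, dim=-1).sum(-1).sqrt()

class AttentionFusion(torch.nn.Module):
    """Hypothetical single-level attention over component-space embeddings;
    the paper's hierarchical attention is more involved."""
    def __init__(self, dim):
        super().__init__()
        self.score = torch.nn.Linear(dim, 1)
    def forward(self, comps):          # comps: [n_components, batch, dim]
        w = torch.softmax(self.score(comps).squeeze(-1), dim=0)
        return (w.unsqueeze(-1) * comps).sum(0)

# Toy usage: one hyperbolic, one Euclidean, one spherical component.
h = 0.1 * torch.randn(32, 16)        # small norms keep h inside the unit ball
e = torch.randn(32, 16)
s = torch.randn(32, 16)
d = product_dist([h, e, s], [h.roll(1, 0), e.roll(1, 0), s.roll(1, 0)],
                 [poincare_dist, euclidean_dist, spherical_dist])
z = AttentionFusion(16)(torch.stack([h, e, s]))
```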
Related papers
- Motif-aware Riemannian Graph Neural Network with Generative-Contrastive
Learning [23.041843981988503]
We present a novel method for capturing motif regularity in a
diverse-curvature manifold without labels.
We also introduce motif-aware generative-contrastive learning to capture
motif regularity in the constructed manifold.
Empirical results show the superiority of MotifRGC over D-GCN.
arXiv Detail & Related papers (2024-01-02T14:58:26Z) - Self-Supervised Continual Graph Learning in Adaptive Riemannian Spaces [74.03252813800334]
- Self-Supervised Continual Graph Learning in Adaptive Riemannian Spaces
[74.03252813800334]
Continual graph learning routinely arises in real-world applications where
graph data with different tasks arrive sequentially.
Existing methods work in the zero-curvature Euclidean space and largely
ignore the fact that curvature varies over the incoming graph sequence.
To address these challenges, we explore the challenging yet practical problem
of self-supervised continual graph learning.
arXiv Detail & Related papers (2022-11-30T15:25:27Z) - Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry
Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives
simultaneously, aiming to combine the ability to model rich semantics with
the ability to model complex structures (see the sketch after this entry).
Extensive experiments on four benchmark datasets show that the proposed
approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z) - Group Contrastive Self-Supervised Learning on Graphs [101.45974132613293]
- Group Contrastive Self-Supervised Learning on Graphs [101.45974132613293]
We study self-supervised learning on graphs using contrastive methods.
We argue that contrasting graphs in multiple subspaces enables graph encoders
to capture richer characteristics (see the sketch after this entry).
arXiv Detail & Related papers (2021-07-20T22:09:21Z) - Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph
Representation Learning [48.09362183184101]
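A hedged sketch of the multi-subspace contrast this summary alludes to:
several linear heads project two augmented views into different subspaces,
and an NT-Xent loss is summed over subspaces. The head count, projection
form, and loss are illustrative assumptions rather than the paper's spec.
```python
import torch
import torch.nn.functional as F

class SubspaceHeads(torch.nn.Module):
    # Hypothetical bank of projection heads, one per subspace.
    def __init__(self, dim, sub_dim, n_heads=4):
        super().__init__()
        self.heads = torch.nn.ModuleList(
            torch.nn.Linear(dim, sub_dim) for _ in range(n_heads))
    def forward(self, z):
        return [head(z) for head in self.heads]

def nt_xent(a, b, tau=0.5):
    a, b = F.normalize(a, dim=-1), F.normalize(b, dim=-1)
    return F.cross_entropy(a @ b.t() / tau, torch.arange(a.size(0)))

def group_contrastive_loss(heads, z1, z2):
    # Contrast the two views inside every subspace and sum the losses.
    return sum(nt_xent(p1, p2) for p1, p2 in zip(heads(z1), heads(z2)))

heads = SubspaceHeads(dim=32, sub_dim=8)
loss = group_contrastive_loss(heads, torch.randn(16, 32), torch.randn(16, 32))
```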
- Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph
Representation Learning [48.09362183184101]
We propose a novel self-supervised approach that learns node representations
by enhancing Siamese self-distillation with multi-scale contrastive learning
(see the sketch after this entry).
Our method achieves new state-of-the-art results and surpasses some
semi-supervised counterparts by large margins.
arXiv Detail & Related papers (2021-05-12T14:20:13Z) - Spatial-spectral Hyperspectral Image Classification via Multiple Random
Anchor Graphs Ensemble Learning [88.60285937702304]
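The summary points at Siamese self-distillation; below is a generic
BYOL/BGRL-style sketch with an online encoder, an EMA target encoder, and a
predictor, given only as an illustration of that recipe, not the paper's
architecture or multi-scale scheme.
```python
import copy
import torch
import torch.nn.functional as F

class SiameseDistill(torch.nn.Module):
    def __init__(self, encoder, dim):
        super().__init__()
        self.online = encoder
        self.target = copy.deepcopy(encoder)      # EMA copy, never backprop'd
        for p in self.target.parameters():
            p.requires_grad_(False)
        self.predictor = torch.nn.Linear(dim, dim)

    @torch.no_grad()
    def update_target(self, momentum=0.99):
        for po, pt in zip(self.online.parameters(), self.target.parameters()):
            pt.mul_(momentum).add_(po.detach(), alpha=1 - momentum)

    def loss(self, view1, view2):
        # Online prediction of view1 should match the target code of view2.
        p = F.normalize(self.predictor(self.online(view1)), dim=-1)
        with torch.no_grad():
            t = F.normalize(self.target(view2), dim=-1)
        return 2 - 2 * (p * t).sum(-1).mean()     # cosine-based distance

model = SiameseDistill(torch.nn.Linear(32, 16), dim=16)
loss = model.loss(torch.randn(8, 32), torch.randn(8, 32))
loss.backward(); model.update_target()
```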
- Spatial-spectral Hyperspectral Image Classification via Multiple Random
Anchor Graphs Ensemble Learning [88.60285937702304]
This paper proposes a novel spatial-spectral HSI classification method via
multiple random anchor graphs ensemble learning (RAGE).
First, the local binary pattern is adopted to extract more descriptive
features on each selected band, preserving local structures and subtle
changes of a region.
Second, adaptive neighbor assignment is introduced in the construction of the
anchor graphs to reduce computational complexity (see the sketch after this
entry).
arXiv Detail & Related papers (2021-03-25T09:31:41Z) - Conformal retrofitting via Riemannian manifolds: distilling
task-specific graphs into pretrained embeddings [1.2970250708769708]
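To make the anchor-graph step concrete, here is a rough NumPy sketch in which
each sample connects only to its k nearest anchors, so the affinity matrix is
n x m rather than n x n. Random anchor selection and the Gaussian weights are
stand-ins for the paper's adaptive neighbor assignment.
```python
import numpy as np

def anchor_graph(X, n_anchors=50, k=5, seed=0):
    # X: [n, d] feature matrix; returns anchors and row-normalized affinities.
    rng = np.random.default_rng(seed)
    anchors = X[rng.choice(len(X), size=n_anchors, replace=False)]
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)  # [n, m]
    idx = np.argsort(d2, axis=1)[:, :k]                        # k nearest anchors
    rows = np.arange(len(X))[:, None]
    Z = np.zeros_like(d2)
    sigma = d2[rows, idx].mean() + 1e-12                       # global bandwidth
    Z[rows, idx] = np.exp(-d2[rows, idx] / sigma)
    return anchors, Z / Z.sum(axis=1, keepdims=True)

anchors, Z = anchor_graph(np.random.rand(200, 10))
```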
- Conformal retrofitting via Riemannian manifolds: distilling task-specific
graphs into pretrained embeddings [1.2970250708769708]
Pretrained embeddings are versatile, task-agnostic feature representations of
entities, such as words, that are central to many machine learning
applications.
Existing retrofitting algorithms face two limitations: they overfit the
observed graph by failing to represent relationships with missing entities,
and they underfit it by learning embeddings only in Euclidean manifolds.
We propose a novel conformality regularizer that preserves local geometry
from the pretrained embeddings, and a new feedforward layer that learns to
map pretrained embeddings onto a non-Euclidean manifold (see the sketch after
this entry).
arXiv Detail & Related papers (2020-10-09T23:06:57Z) - Graph Representation Learning via Graphical Mutual Information
Maximization [86.32278001019854]
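As a loose illustration of a local-geometry-preserving regularizer in the
spirit of the conformality regularizer (conformal maps preserve local shape
up to scale), the sketch below penalizes changes in each entity's
scale-normalized distances to its pretrained nearest neighbours. It is an
assumption-laden proxy, not the paper's formulation.
```python
import torch

def local_geometry_penalty(z_new, z_pre, k=5, eps=1e-8):
    # Neighbourhoods come from the *pretrained* embeddings and stay fixed.
    with torch.no_grad():
        d_pre = torch.cdist(z_pre, z_pre)
        nbr = d_pre.topk(k + 1, largest=False).indices[:, 1:]   # drop self
    rows = torch.arange(z_new.size(0)).unsqueeze(1)
    local_pre = d_pre[rows, nbr]
    local_new = torch.cdist(z_new, z_new)[rows, nbr]
    # Per-entity scale normalization: only the local *shape* is penalized,
    # which is what "conformal" (angle-preserving) is meant to evoke here.
    local_pre = local_pre / (local_pre.mean(1, keepdim=True) + eps)
    local_new = local_new / (local_new.mean(1, keepdim=True) + eps)
    return ((local_new - local_pre) ** 2).mean()

penalty = local_geometry_penalty(torch.randn(64, 16), torch.randn(64, 16))
```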
- Graph Representation Learning via Graphical Mutual Information Maximization
[86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure
the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing the GMI
between the input and output of a graph neural encoder (see the sketch after
this entry).
arXiv Detail & Related papers (2020-02-04T08:33:49Z)