Motif-aware Riemannian Graph Neural Network with Generative-Contrastive
Learning
- URL: http://arxiv.org/abs/2401.01232v1
- Date: Tue, 2 Jan 2024 14:58:26 GMT
- Title: Motif-aware Riemannian Graph Neural Network with Generative-Contrastive
Learning
- Authors: Li Sun, Zhenhao Huang, Zixi Wang, Feiyang Wang, Hao Peng, Philip Yu
- Abstract summary: We present a novel method for capturing motif regularity in a diverse-curvature manifold without labels.
We also introduce a motif-aware generative-contrastive learning to capture motif regularity in the constructed manifold.
Empirical results show the superiority of MotifRGC over D-GCN.
- Score: 23.041843981988503
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graphs are typical non-Euclidean data of complex structures. In recent years,
Riemannian graph representation learning has emerged as an exciting alternative
to Euclidean ones. However, Riemannian methods are still in an early stage:
most of them present a single curvature (radius) regardless of structural
complexity, suffer from numerical instability due to the
exponential/logarithmic map, and lack the ability to capture motif regularity.
In light of the issues above, we propose the problem of \emph{Motif-aware
Riemannian Graph Representation Learning}, seeking a numerically stable encoder
to capture motif regularity in a diverse-curvature manifold without labels. To
this end, we present a novel Motif-aware Riemannian model with
Generative-Contrastive learning (MotifRGC), which conducts a minmax game in
Riemannian manifold in a self-supervised manner. First, we propose a new type
of Riemannian GCN (D-GCN), in which we construct a diverse-curvature manifold
by a product layer with the diversified factor, and replace the
exponential/logarithmic map by a stable kernel layer. Second, we introduce a
motif-aware Riemannian generative-contrastive learning to capture motif
regularity in the constructed manifold and learn motif-aware node
representation without external labels. Empirical results show the superiority
of MotifRGC.
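The numerical instability attributed to the exponential/logarithmic map is easy to reproduce. The sketch below is illustrative only (not from the paper): it fixes curvature at -1 and takes both maps at the origin of the Poincaré ball, showing how a round trip through exp/log overflows in float64 once a tangent vector is only moderately large — the failure mode a stable kernel layer is meant to sidestep.

```python
import numpy as np

def exp_map_0(v):
    # Exponential map at the origin of the Poincare ball (curvature -1):
    # exp_0(v) = tanh(|v|) * v / |v|
    n = np.linalg.norm(v)
    return np.tanh(n) * v / n

def log_map_0(x):
    # Logarithmic map at the origin: log_0(x) = artanh(|x|) * x / |x|
    n = np.linalg.norm(x)
    return np.arctanh(n) * x / n

v = np.array([20.0, 0.0])   # a moderately large tangent vector
x = exp_map_0(v)            # tanh(20) rounds to exactly 1.0 in float64,
                            # so x lands numerically on the ball's boundary
back = log_map_0(x)         # arctanh(1.0) overflows to inf
print(x, back)
```

For small vectors the round trip is exact, so the instability only bites near the boundary — precisely where hierarchical structure pushes embeddings.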
Related papers
- A singular Riemannian Geometry Approach to Deep Neural Networks III. Piecewise Differentiable Layers and Random Walks on $n$-dimensional Classes [49.32130498861987]
We study the case of non-differentiable activation functions, such as ReLU.
Two recent works introduced a geometric framework to study neural networks.
We illustrate our findings with some numerical experiments on classification of images and thermodynamic problems.
arXiv Detail & Related papers (2024-04-09T08:11:46Z)
- DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (i.e., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
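To convey the flavor of a curvature-agnostic step size (a toy sketch, not the paper's algorithm or game setting), fixed-step Riemannian gradient descent on the unit sphere minimizing the function f(x) = -⟨x, v⟩ converges to the minimizer v/‖v‖ even though the step size is chosen without any reference to the sphere's curvature:

```python
import numpy as np

# Riemannian gradient descent on the unit sphere, minimizing f(x) = -<x, v>.
# The minimizer is x* = v / |v|; the step size is fixed and curvature-agnostic.
rng = np.random.default_rng(0)
v = np.array([1.0, 2.0, 2.0])
target = v / np.linalg.norm(v)

x = rng.normal(size=3)
x /= np.linalg.norm(x)              # random starting point on the sphere
step = 0.1                          # fixed step size, no curvature constant used
for _ in range(200):
    egrad = -v                               # Euclidean gradient of f
    rgrad = egrad - np.dot(egrad, x) * x     # project onto tangent space at x
    x = x - step * rgrad
    x /= np.linalg.norm(x)                   # retract back onto the sphere

print(x, target)                    # the last iterate ends up at the minimizer
```

Each iteration contracts the angle to the optimum by a constant factor here, so the *last* iterate converges linearly rather than only an average of iterates.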
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Self-Supervised Continual Graph Learning in Adaptive Riemannian Spaces [74.03252813800334]
Continual graph learning routinely finds its role in a variety of real-world applications where the graph data with different tasks come sequentially.
Existing methods work with the zero-curvature Euclidean space, and largely ignore the fact that curvature varies over the coming graph sequence.
To address the aforementioned challenges, we propose to explore a challenging yet practical problem, the self-supervised continual graph learning.
arXiv Detail & Related papers (2022-11-30T15:25:27Z)
- Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) have demonstrated remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z)
- A Self-supervised Mixed-curvature Graph Neural Network [76.3790248465522]
We present a novel Self-supervised Mixed-curvature Graph Neural Network (SelfMGNN).
We show that SelfMGNN captures the complicated graph structures in reality and outperforms state-of-the-art baselines.
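A mixed-curvature space of the kind SelfMGNN builds on is a product of constant-curvature factors. Under one common convention (assumed here for illustration, not taken from the paper), the product distance is the L2 norm of the per-factor geodesic distances:

```python
import numpy as np

# Distance in a mixed-curvature product space H^2 x S^2 x E^2: combine the
# geodesic distance of each constant-curvature factor via an L2 norm.

def dist_hyperboloid(x, y):
    # Hyperboloid model, curvature -1: d = arccosh(-<x,y>_L),
    # with the Minkowski inner product <x,y>_L = -x0*y0 + x1*y1 + ...
    mink = -x[0] * y[0] + np.dot(x[1:], y[1:])
    return np.arccosh(np.clip(-mink, 1.0, None))

def dist_sphere(x, y):
    # Unit sphere, curvature +1: d = arccos(<x,y>)
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

def dist_product(h1, s1, e1, h2, s2, e2):
    ds = np.array([dist_hyperboloid(h1, h2),
                   dist_sphere(s1, s2),
                   np.linalg.norm(e1 - e2)])   # Euclidean factor
    return np.linalg.norm(ds)

h1 = np.array([1.0, 0.0, 0.0]); h2 = np.array([np.sqrt(2.0), 1.0, 0.0])
s1 = np.array([1.0, 0.0, 0.0]); s2 = np.array([0.0, 1.0, 0.0])
e1 = np.array([0.0, 0.0]);      e2 = np.array([3.0, 4.0])
print(dist_product(h1, s1, e1, h2, s2, e2))  # ~ 5.31
```

Different factor curvatures suit different substructures: hyperbolic factors fit tree-like parts, spherical factors fit cyclical parts, and the Euclidean factor fits grid-like parts.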
arXiv Detail & Related papers (2021-12-10T08:56:55Z)
- Semi-Riemannian Graph Convolutional Networks [36.09315878397234]
We develop a principled Semi-Riemannian GCN that first models data in semi-Riemannian manifold of constant nonzero curvature.
Our method provides a geometric inductive bias that is sufficiently flexible to model mixed heterogeneous topologies like hierarchical graphs with cycles.
arXiv Detail & Related papers (2021-06-06T14:23:34Z)
- Conformal retrofitting via Riemannian manifolds: distilling task-specific graphs into pretrained embeddings [1.2970250708769708]
Pretrained embeddings are versatile, task-agnostic feature representations of entities, like words, that are central to many machine learning applications.
Existing retrofitting algorithms face two limitations: they overfit the observed graph, and they fail to represent relationships with entities missing from the graph.
We propose a novel regularizer, a conformality regularizer, that preserves local geometry from the pretrained embeddings, and a new feedforward layer that learns to map pre-trained embeddings onto a non-Euclidean manifold.
arXiv Detail & Related papers (2020-10-09T23:06:57Z)
- Computationally Tractable Riemannian Manifolds for Graph Embeddings [10.420394952839242]
We show how to learn and optimize graph embeddings in certain curved Riemannian spaces.
Our results serve as new evidence for the benefits of non-Euclidean embeddings in machine learning pipelines.
arXiv Detail & Related papers (2020-02-20T10:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.