A Self-supervised Riemannian GNN with Time Varying Curvature for
Temporal Graph Learning
- URL: http://arxiv.org/abs/2208.14073v1
- Date: Tue, 30 Aug 2022 08:43:06 GMT
- Title: A Self-supervised Riemannian GNN with Time Varying Curvature for
Temporal Graph Learning
- Authors: Li Sun, Junda Ye, Hao Peng, Philip S. Yu
- Abstract summary: We present a novel self-supervised Riemannian graph neural network (SelfRGNN).
Specifically, we design a curvature-varying GNN with a theoretically grounded time encoding, and formulate a functional curvature over time to model the evolution of the underlying geometry among the positive, zero, and negative curvature spaces.
Extensive experiments show the superiority of SelfRGNN, and a case study demonstrates that real temporal graphs indeed exhibit time-varying curvature.
- Score: 79.20249985327007
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Representation learning on temporal graphs has drawn considerable research
attention owing to its fundamental importance in a wide spectrum of real-world
applications. Though a number of studies have succeeded in obtaining
time-dependent representations, significant challenges remain. On the one hand,
most of the existing methods restrict the embedding space to a single fixed curvature.
However, the underlying geometry in fact shifts among the positive-curvature
hyperspherical, zero-curvature Euclidean, and negative-curvature hyperbolic
spaces as the graph evolves over time. On the other hand, these methods usually
require abundant labels to learn temporal representations, which notably limits
their applicability to the unlabeled graphs common in real applications. To
bridge this gap, we make the first attempt to study the problem of
self-supervised temporal graph representation learning in the general
Riemannian space, supporting the time-varying curvature to shift among
hyperspherical, Euclidean and hyperbolic spaces. In this paper, we present a
novel self-supervised Riemannian graph neural network (SelfRGNN). Specifically,
we design a curvature-varying Riemannian GNN with a theoretically grounded time
encoding, and formulate a functional curvature over time to model the evolution
of the geometry among the positive, zero, and negative curvature spaces. To
enable the self-supervised learning, we propose a novel reweighting
self-contrastive approach, exploring the Riemannian space itself without
augmentation, and propose an edge-based self-supervised curvature learning with
the Ricci curvature. Extensive experiments show the superiority of SelfRGNN,
and a case study demonstrates that real temporal graphs indeed exhibit
time-varying curvature.
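As a rough illustration of the abstract's central idea, the sketch below models a scalar curvature as a smooth function of time whose sign selects the constant-curvature space. The tanh parameterization and all names here are hypothetical stand-ins, not the paper's actual functional curvature, which is learned end-to-end.

```python
import numpy as np

def curvature_over_time(t, a=1.0, b=0.5, c=-0.2):
    """Hypothetical functional curvature kappa(t): a bounded scalar whose
    sign can flip as t advances, moving the representation space among
    the three constant-curvature geometries. The coefficients a, b, c
    are illustrative, not from the paper."""
    return a * np.tanh(b * t + c)

def geometry_of(kappa, eps=1e-6):
    """Map the sign of the curvature to the corresponding space."""
    if kappa > eps:
        return "hyperspherical"   # positive curvature
    if kappa < -eps:
        return "hyperbolic"       # negative curvature
    return "Euclidean"            # (near-)zero curvature

# Early times land in the hyperbolic regime, late times in the
# hyperspherical one, with a Euclidean transition in between.
print(geometry_of(curvature_over_time(-10.0)))  # hyperbolic
print(geometry_of(curvature_over_time(10.0)))   # hyperspherical
```

In SelfRGNN the curvature is fit jointly with the network; this sketch only shows how a single bounded scalar can traverse all three regimes over time.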
Related papers
- Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z) - DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for
Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (i.e., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z) - Contrastive Graph Clustering in Curvature Spaces [74.03252813800334]
We present a novel end-to-end contrastive graph clustering model named CONGREGATE.
To support geometric clustering, we construct a theoretically grounded Heterogeneous Curvature Space.
We then train the graph clusters by an augmentation-free reweighted contrastive approach.
arXiv Detail & Related papers (2023-05-05T14:04:52Z) - Self-Supervised Continual Graph Learning in Adaptive Riemannian Spaces [74.03252813800334]
Continual graph learning routinely finds its role in a variety of real-world applications where the graph data with different tasks come sequentially.
Existing methods work with the zero-curvature Euclidean space, and largely ignore the fact that curvature varies over the coming graph sequence.
To address the aforementioned challenges, we propose to explore a challenging yet practical problem, the self-supervised continual graph learning.
arXiv Detail & Related papers (2022-11-30T15:25:27Z) - STONet: A Neural-Operator-Driven Spatio-temporal Network [38.5696882090282]
Graph-based spatio-temporal neural networks are effective in modeling spatial dependency among discrete points sampled irregularly.
We propose a spatio-temporal framework based on neural operators for PDEs, which learns the mechanisms governing the dynamics of spatially-continuous physical quantities.
Experiments show our model's performance in forecasting spatially-continuous physical quantities, its generalization to unseen spatial points, and its ability to handle temporally-irregular data.
arXiv Detail & Related papers (2022-04-18T17:20:12Z) - A Self-supervised Mixed-curvature Graph Neural Network [76.3790248465522]
We present a novel Self-supervised Mixed-curvature Graph Neural Network (SelfMGNN).
We show that SelfMGNN captures the complicated graph structures in reality and outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2021-12-10T08:56:55Z)
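Both SelfRGNN and DeepRicci above lean on a discrete Ricci curvature defined per edge. As a minimal, hedged example (the papers may use Ollivier-Ricci or a weighted variant instead), the combinatorial Forman-Ricci curvature of an unweighted edge is F(u, v) = 4 - deg(u) - deg(v):

```python
from collections import defaultdict

def forman_ricci(edges):
    """Combinatorial Forman-Ricci curvature F(u,v) = 4 - deg(u) - deg(v)
    for each edge of an unweighted, undirected graph. This is one common
    discrete Ricci notion; it is not necessarily the exact variant used
    in the papers above."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return {(u, v): 4 - len(adj[u]) - len(adj[v]) for u, v in edges}

# On a triangle every node has degree 2, so every edge has curvature 0;
# strongly negative values flag bottleneck-like edges, the usual
# over-squashing culprits.
print(forman_ricci([(0, 1), (1, 2), (0, 2)]))
```

Edge-level curvatures like these give a self-supervised signal: no labels are needed, since the quantity is computed from graph structure alone.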
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.