On the Equivalence Between Temporal and Static Graph Representations for
Observational Predictions
- URL: http://arxiv.org/abs/2103.07016v2
- Date: Mon, 27 Mar 2023 21:15:57 GMT
- Title: On the Equivalence Between Temporal and Static Graph Representations for
Observational Predictions
- Authors: Jianfei Gao, Bruno Ribeiro
- Abstract summary: We show that node representations in temporal graphs can be cast into two distinct frameworks: time-and-graph and time-then-graph.
We show that time-then-graph methods are capable of achieving better performance and efficiency than state-of-the-art time-and-graph methods in some real-world tasks.
- Score: 10.759470206355145
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This work formalizes the associational task of predicting node attribute
evolution in temporal graphs from the perspective of learning equivariant
representations. We show that node representations in temporal graphs can be
cast into two distinct frameworks: (a) The most popular approach, which we
denote as time-and-graph, where equivariant graph (e.g., GNN) and sequence
(e.g., RNN) representations are intertwined to represent the temporal evolution
of node attributes in the graph; and (b) an approach that we denote as
time-then-graph, where the sequences describing the node and edge dynamics are
represented first, then fed as node and edge attributes into a static
equivariant graph representation that comes after. Interestingly, we show that
time-then-graph representations have an expressivity advantage over
time-and-graph representations when both use component GNNs that are not
most-expressive (e.g., 1-Weisfeiler-Lehman GNNs). Moreover, while our goal is
not necessarily to obtain state-of-the-art results, our experiments show that
time-then-graph methods are capable of achieving better performance and
efficiency than state-of-the-art time-and-graph methods in some real-world
tasks, thereby showcasing that the time-then-graph framework is a worthy
addition to the graph ML toolbox.
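The time-then-graph recipe described in the abstract can be sketched minimally: first encode each node's attribute history with a sequence model, then run a static GNN over the resulting embeddings. The NumPy sketch below is a toy illustration, not the paper's implementation; the tanh RNN cell and mean-aggregation layer are simple stand-ins for the GRU and GNN components the authors actually use, the weight shapes are arbitrary, and the edge-attribute sequences that the framework also supports are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_sequence(x_seq, W, U):
    # Toy RNN cell: h_t = tanh(W @ x_t + U @ h_{t-1}); return the final state.
    h = np.zeros(W.shape[0])
    for x_t in x_seq:
        h = np.tanh(W @ x_t + U @ h)
    return h

def static_gnn_layer(H, adj, W_self, W_nbr):
    # One mean-aggregation message-passing layer over a static graph.
    deg = np.clip(adj.sum(axis=1, keepdims=True), 1, None)
    nbr_mean = (adj @ H) / deg
    return np.tanh(H @ W_self.T + nbr_mean @ W_nbr.T)

def time_then_graph(node_seqs, adj, rnn_W, rnn_U, gnn_Ws, gnn_Wn):
    # Step 1 (time): encode each node's attribute sequence independently.
    H = np.stack([encode_sequence(seq, rnn_W, rnn_U) for seq in node_seqs])
    # Step 2 (graph): a single static GNN pass over the sequence embeddings.
    return static_gnn_layer(H, adj, gnn_Ws, gnn_Wn)

# 4 nodes, 5 time steps, 3-dim attributes, 8-dim hidden state.
n, T, d, hdim = 4, 5, 3, 8
node_seqs = rng.normal(size=(n, T, d))
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rnn_W = rng.normal(scale=0.5, size=(hdim, d))
rnn_U = rng.normal(scale=0.5, size=(hdim, hdim))
gnn_Ws = rng.normal(scale=0.5, size=(hdim, hdim))
gnn_Wn = rng.normal(scale=0.5, size=(hdim, hdim))

Z = time_then_graph(node_seqs, adj, rnn_W, rnn_U, gnn_Ws, gnn_Wn)
print(Z.shape)  # (4, 8): one representation per node
```

By contrast, a time-and-graph model would interleave the two components, running a GNN at every time step and feeding its output into the recurrence, which is where the expressivity gap analyzed in the paper arises.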
Related papers
- Weisfeiler and Leman Follow the Arrow of Time: Expressive Power of Message Passing in Temporal Event Graphs [2.9561405287476177]
We introduce the notion of consistent event graph isomorphism, which utilizes a time-unfolded representation of time-respecting paths in temporal graphs.
We derive a novel message passing scheme for temporal graph neural networks that operates on the event graph representation of temporal graphs.
arXiv Detail & Related papers (2025-05-30T10:20:30Z)
- Temporal Graph ODEs for Irregularly-Sampled Time Series [32.68671699403658]
We introduce the Temporal Graph Ordinary Differential Equation (TG-ODE) framework, which learns both the temporal and spatial dynamics from graph streams where the intervals between observations are not regularly spaced.
We empirically validate the proposed approach on several graph benchmarks, showing that TG-ODE can achieve state-of-the-art performance in irregular graph stream tasks.
arXiv Detail & Related papers (2024-04-30T12:43:11Z)
- Graph-Level Embedding for Time-Evolving Graphs [24.194795771873046]
Graph representation learning (also known as network embedding) has been extensively researched with varying levels of granularity.
We present a novel method for temporal graph-level embedding that addresses this gap.
arXiv Detail & Related papers (2023-06-01T01:50:37Z)
- Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z)
- Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast [0.8379286663107846]
This paper proposes a self-supervised dynamic graph representation learning framework (DySubC)
DySubC defines a temporal subgraph contrastive learning task to simultaneously learn the structural and evolutional features of a dynamic graph.
Experiments on five real-world datasets demonstrate that DySubC performs better than the related baselines.
arXiv Detail & Related papers (2021-12-16T09:35:34Z)
- Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z)
- From Static to Dynamic Node Embeddings [61.58641072424504]
We introduce a general framework for leveraging graph stream data for temporal prediction-based applications.
Our proposed framework includes novel methods for learning an appropriate graph time-series representation.
We find that the top-3 temporal models are always those that leverage the new $\epsilon$-graph time-series representation.
arXiv Detail & Related papers (2020-09-21T16:48:29Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z) - Structural Temporal Graph Neural Networks for Anomaly Detection in
Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.