Time-varying Graph Representation Learning via Higher-Order Skip-Gram
with Negative Sampling
- URL: http://arxiv.org/abs/2006.14330v1
- Date: Thu, 25 Jun 2020 12:04:48 GMT
- Title: Time-varying Graph Representation Learning via Higher-Order Skip-Gram
with Negative Sampling
- Authors: Simone Piaggesi, André Panisson
- Abstract summary: We build upon the fact that the skip-gram embedding approach implicitly performs a matrix factorization.
We show that higher-order skip-gram with negative sampling is able to disentangle the role of nodes and time.
We empirically evaluate our approach using time-resolved face-to-face proximity data, showing that the learned time-varying graph representations outperform state-of-the-art methods.
- Score: 0.456877715768796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Representation learning models for graphs are a successful family of
techniques that project nodes into feature spaces that can be exploited by
other machine learning algorithms. Since many real-world networks are
inherently dynamic, with interactions among nodes changing over time, these
techniques can be defined both for static and for time-varying graphs. Here, we
build upon the fact that the skip-gram embedding approach implicitly performs a
matrix factorization, and we extend it to perform implicit tensor factorization
on different tensor representations of time-varying graphs. We show that
higher-order skip-gram with negative sampling (HOSGNS) is able to disentangle
the role of nodes and time, with a small fraction of the number of parameters
needed by other approaches. We empirically evaluate our approach using
time-resolved face-to-face proximity data, showing that the learned
time-varying graph representations outperform state-of-the-art methods when
used to solve downstream tasks such as network reconstruction, and to predict
the outcome of dynamical processes such as disease spreading. The source code
and data are publicly available at https://github.com/simonepiaggesi/hosgns.
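The matrix-factorization view the abstract builds on (due to Levy & Goldberg) can be illustrated with a minimal NumPy sketch: skip-gram with negative sampling and k negative samples implicitly factorizes the shifted PMI matrix of co-occurrence statistics. The snippet below is not the authors' HOSGNS implementation (that is available at the repository above); it shows only the second-order (matrix) case on toy co-occurrence counts, which HOSGNS generalizes to implicit factorization of higher-order tensors over node-node-time entries.

```python
import numpy as np

# Toy symmetric co-occurrence counts C[i, j] between nodes of a small graph.
# Levy & Goldberg (2014) showed that SGNS implicitly factorizes the matrix
# M[i, j] = PMI(i, j) - log(k), where k is the number of negative samples.
rng = np.random.default_rng(0)
C = rng.integers(1, 10, size=(6, 6)).astype(float)
C = C + C.T  # symmetric co-occurrences, as in an undirected graph

k = 5  # number of negative samples
total = C.sum()
p_ij = C / total                # joint co-occurrence probabilities
p_i = C.sum(axis=1) / total     # marginal probabilities

# Shifted positive PMI matrix (negative entries clipped to zero, as is
# common in practice to keep the matrix sparse and well-behaved).
pmi = np.log(p_ij / np.outer(p_i, p_i)) - np.log(k)
sppmi = np.maximum(pmi, 0.0)

# A rank-d factorization (here via truncated SVD) yields one d-dimensional
# embedding per node; HOSGNS plays the same game with a tensor whose extra
# mode indexes time, yielding separate node and time representations.
d = 3
U, S, Vt = np.linalg.svd(sppmi)
emb = U[:, :d] * np.sqrt(S[:d])          # node embeddings
ctx = Vt[:d].T * np.sqrt(S[:d])          # context embeddings
approx = emb @ ctx.T                     # low-rank reconstruction of SPPMI
```

Here the inner product of a node vector and a context vector approximates the corresponding shifted-PMI entry, which is exactly the quantity the SGNS objective drives the dot products toward.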
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graph-Level Embedding for Time-Evolving Graphs [24.194795771873046]
Graph representation learning (also known as network embedding) has been extensively researched with varying levels of granularity.
We present a novel method for temporal graph-level embedding that addresses this gap.
arXiv Detail & Related papers (2023-06-01T01:50:37Z)
- Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep Temporal Graph Clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
arXiv Detail & Related papers (2023-05-18T06:17:50Z)
- TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time
Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden temporal dependencies without a predefined graph structure.
The experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep learning-based methods in the MTSC tasks.
arXiv Detail & Related papers (2023-04-11T09:21:28Z)
- Towards Real-Time Temporal Graph Learning [10.647431919265346]
We propose an end-to-end graph learning pipeline that performs temporal graph construction, creates low-dimensional node embeddings, and trains neural network models in an online setting.
arXiv Detail & Related papers (2022-10-08T22:14:31Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Learning Sparse and Continuous Graph Structures for Multivariate Time
Series Forecasting [5.359968374560132]
Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that joins graph learning and forecasting.
In this paper, we propose a new method named Smooth Sparse Unit (SSU) to learn a sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performance with few trainable parameters.
arXiv Detail & Related papers (2022-01-24T13:35:37Z)
- Dynamic Graph Learning-Neural Network for Multivariate Time Series
Modeling [2.3022070933226217]
We propose a novel framework, the static- and dynamic-graph learning-neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
arXiv Detail & Related papers (2021-12-06T08:19:15Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks
Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.