Continuous-time Graph Representation with Sequential Survival Process
- URL: http://arxiv.org/abs/2312.13068v1
- Date: Wed, 20 Dec 2023 14:46:54 GMT
- Title: Continuous-time Graph Representation with Sequential Survival Process
- Authors: Abdulkadir Celikkanat and Nikolaos Nakis and Morten Mørup
- Abstract summary: We propose a process relying on survival functions to model the durations of links and their absences over time.
The resulting framework, GraSSP (Graph Representation with Sequential Survival Process), forms a generic new likelihood that explicitly accounts for networks with intermittently persistent edges.
We quantitatively assess the developed framework in various downstream tasks, such as link prediction and network completion.
- Score: 0.17265013728931003
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Over the past two decades, there has been tremendous growth in
representation learning methods for graphs, with numerous applications
across various fields, including bioinformatics, chemistry, and the social
sciences. However, current dynamic network approaches focus on discrete-time
networks or treat links in continuous-time networks as instantaneous events.
Therefore, these approaches have limitations in capturing the persistence or
absence of links that continuously emerge and disappear over time for
particular durations. To address this, we propose a novel stochastic process
relying on survival functions to model the durations of links and their
absences over time. This forms a generic new likelihood specification
explicitly accounting for intermittent edge-persistent networks, namely GraSSP:
Graph Representation with Sequential Survival Process. We apply the developed
framework to a recent continuous-time dynamic latent distance model
characterizing network dynamics in terms of a sequence of piecewise linear
movements of nodes in latent space. We quantitatively assess the developed
framework in various downstream tasks, such as link prediction and network
completion, demonstrating that the developed modeling framework, by accounting
for link persistence and absence, closely tracks the intrinsic trajectories of
nodes in the latent space and captures the underlying characteristics of the
evolving network structure.
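The abstract describes two ingredients: node pairs alternate between linked and unlinked states whose durations are governed by survival functions, and node embeddings move piecewise-linearly in a latent space, with interaction rates driven by latent distance. Below is a minimal, hypothetical sketch of how such a likelihood could be assembled for a single node pair. The exponential-of-negative-distance hazard, the function names, and the grid-based numerical integration are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def latent_position(x0, velocities, knots, t):
    """Piecewise-linear latent trajectory: position at time t, given an initial
    position, one velocity per segment, and the segment end times (knots)."""
    pos = np.asarray(x0, dtype=float).copy()
    prev = 0.0
    for v, knot in zip(velocities, knots):
        seg_end = min(t, knot)
        if seg_end <= prev:
            break
        pos = pos + np.asarray(v, dtype=float) * (seg_end - prev)
        prev = knot
    return pos

def hazard(xi, xj, beta=1.0):
    """Instantaneous rate at which the pair switches state (a link appears or
    dissolves); nodes that are close in latent space interact at a higher rate."""
    return np.exp(beta - np.linalg.norm(xi - xj))

def pair_log_likelihood(switch_times, T, xi0, xj0, vi, vj, knots, beta=1.0, n_grid=200):
    """Log-likelihood of one node pair's alternating link/no-link durations on [0, T].

    Each duration contributes a survival term (minus the integrated hazard over the
    interval, approximated on a time grid) plus the log hazard at the observed switch;
    the final, unfinished duration is treated as right-censored."""
    loglik = 0.0
    boundaries = [0.0] + list(switch_times) + [T]
    for start, end in zip(boundaries[:-1], boundaries[1:]):
        grid = np.linspace(start, end, n_grid)
        rates = np.array([
            hazard(latent_position(xi0, vi, knots, t),
                   latent_position(xj0, vj, knots, t), beta)
            for t in grid
        ])
        # survival of the current state over [start, end]: exp(-integral of hazard)
        loglik -= np.sum(0.5 * (rates[:-1] + rates[1:]) * np.diff(grid))
        if end in switch_times:           # interval ended with an observed switch
            loglik += np.log(rates[-1])
    return loglik

# Toy usage: two nodes drifting apart; their link appears at t=1 and dissolves at t=3.
print(pair_log_likelihood(
    switch_times=[1.0, 3.0], T=5.0,
    xi0=[0.0, 0.0], xj0=[1.0, 0.0],
    vi=[[0.1, 0.0], [0.0, 0.0]], vj=[[-0.1, 0.0], [0.0, 0.0]],
    knots=[2.5, 5.0],
))
```

In the setting described above, such a pairwise term would be summed over all node pairs and maximized jointly with respect to the latent trajectories; here the positions and velocities are fixed purely for readability.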
Related papers
- Learning Time-aware Graph Structures for Spatially Correlated Time
Series Forecasting [30.93275270960829]
We propose Time-aware Graph Structure Learning (TagSL), which extracts time-aware correlations among time series.
We also present a Graph Convolution-based Gated Recurrent Unit (GCGRU), that jointly captures spatial and temporal dependencies.
Finally, we introduce a unified framework named Time-aware Graph Convolutional Recurrent Network (TGCRN), combining TagSL and GCGRU in an encoder-decoder architecture for multi-step spatio-temporal forecasting.
arXiv Detail & Related papers (2023-12-27T04:23:43Z) - TempGNN: Temporal Graph Neural Networks for Dynamic Session-Based
Recommendations [5.602191038593571]
Temporal Graph Neural Networks (TempGNN) is a generic framework for capturing the structural and temporal dynamics in complex item transitions.
TempGNN achieves state-of-the-art performance on two real-world e-commerce datasets.
arXiv Detail & Related papers (2023-10-20T03:13:10Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - Piecewise-Velocity Model for Learning Continuous-time Dynamic Node
Representations [0.0]
We propose the Piecewise-Velocity Model (PiVeM) for the representation of continuous-time dynamic networks.
We show that PiVeM can successfully represent network structure and dynamics in ultra-low two-dimensional spaces.
It outperforms relevant state-of-the-art methods in downstream tasks such as link prediction.
arXiv Detail & Related papers (2022-12-23T13:57:56Z) - Graph-Survival: A Survival Analysis Framework for Machine Learning on
Temporal Networks [14.430635608400982]
We propose a framework for designing generative models for continuous time temporal networks.
We propose a fitting method for models within this framework, and an algorithm for simulating new temporal networks having desired properties.
arXiv Detail & Related papers (2022-03-14T16:40:57Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Spatio-Temporal Joint Graph Convolutional Networks for Traffic
Forecasting [75.10017445699532]
Recent works have shifted their focus towards formulating traffic forecasting as a spatio-temporal graph modeling problem.
We propose a novel approach for accurate traffic forecasting on road networks over multiple future time steps.
arXiv Detail & Related papers (2021-11-25T08:45:14Z) - WalkingTime: Dynamic Graph Embedding Using Temporal-Topological Flows [3.8073142980733]
We propose a novel embedding algorithm, WalkingTime, based on a fundamentally different handling of time.
We hold flows comprised of temporally and topologically local interactions as our primitives, without any discretization or alignment of time-related attributes being necessary.
arXiv Detail & Related papers (2021-11-22T00:04:02Z) - Continuity-Discrimination Convolutional Neural Network for Visual Object
Tracking [150.51667609413312]
This paper proposes a novel model, named Continuity-Discrimination Convolutional Neural Network (CD-CNN) for visual object tracking.
To address this problem, CD-CNN models temporal appearance continuity based on the idea of temporal slowness.
In order to alleviate inaccurate target localization and drifting, we propose a novel notion, object-centroid.
arXiv Detail & Related papers (2021-04-18T06:35:03Z) - Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of
Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z) - Continuous-in-Depth Neural Networks [107.47887213490134]
We first show that ResNets fail to be meaningful dynamical integrators in this richer sense.
We then demonstrate that neural network models can learn to represent continuous dynamical systems.
We introduce ContinuousNet as a continuous-in-depth generalization of ResNet architectures.
arXiv Detail & Related papers (2020-08-05T22:54:09Z)
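To make the ResNet-as-dynamical-system view in the last entry concrete, here is a toy, hypothetical NumPy sketch: a residual block x <- x + h * f(x) is one explicit-Euler step of the ODE dx/dt = f(x), and refining the step count (depth) approaches a continuous-in-depth model. The two-layer tanh block with shared random weights is an assumption for illustration, not the ContinuousNet architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))

def f(x):
    """Residual branch: a small two-layer MLP (shared across all depth steps here)."""
    return np.tanh(x @ W1) @ W2 * 0.1

def euler_depth(x0, n_steps, total_time=1.0):
    """Integrate dx/dt = f(x) from 0 to total_time with n_steps Euler (ResNet) steps."""
    x, h = np.array(x0, dtype=float), total_time / n_steps
    for _ in range(n_steps):
        x = x + h * f(x)          # one residual block = one explicit-Euler step
    return x

x0 = rng.normal(size=8)
# If the network behaves like an integrator, doubling the number of blocks while
# halving the step size should converge to the same underlying trajectory.
for depth in (4, 8, 16, 32):
    print(depth, np.round(euler_depth(x0, depth)[:3], 4))
```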