Generating fine-grained surrogate temporal networks
- URL: http://arxiv.org/abs/2205.08820v2
- Date: Tue, 22 Aug 2023 17:35:58 GMT
- Title: Generating fine-grained surrogate temporal networks
- Authors: Antonio Longa, Giulia Cencetti, Sune Lehmann, Andrea Passerini and
Bruno Lepri
- Abstract summary: We propose a novel and simple method for generating surrogate temporal networks.
Our method decomposes the input network into star-like structures evolving in time.
Then those structures are used as building blocks to generate a surrogate temporal network.
- Score: 12.7211231166069
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal networks are essential for modeling and understanding systems whose
behavior varies in time, from social interactions to biological systems. Often,
however, real-world data are prohibitively expensive to collect at a large
scale or unshareable due to privacy concerns. A promising way to bypass the
problem consists in generating arbitrarily large and anonymized synthetic
graphs with the properties of real-world networks, namely `surrogate networks'.
Until now, the generation of realistic surrogate temporal networks has remained
an open problem, due to the difficulty of capturing both the temporal and
topological properties of the input network, as well as their correlations, in
a scalable model. Here, we propose a novel and simple method for generating
surrogate temporal networks. Our method decomposes the input network into
star-like structures evolving in time. Then those structures are used as
building blocks to generate a surrogate temporal network. Our model vastly
outperforms current methods across multiple examples of temporal networks in
terms of both topological and dynamical similarity. We further show that beyond
generating realistic interaction patterns, our method is able to capture
intrinsic temporal periodicity of temporal networks, all with an execution time
lower than competing methods by multiple orders of magnitude. The simplicity of
our algorithm makes it easily interpretable, extendable and algorithmically
scalable.
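As a rough illustration of the decompose-and-reassemble idea described in the abstract (this is not the authors' released implementation; the fixed-width windowing, the relabelling scheme, and all function names are assumptions for illustration only):

```python
from collections import defaultdict
import random

def decompose_into_stars(edges, window=1):
    """Group timestamped edges (u, v, t) into star-like building blocks:
    for each (center, time-window) pair, the set of neighbours the
    center interacts with inside that window."""
    stars = defaultdict(set)
    for u, v, t in edges:
        w = t // window
        stars[(u, w)].add(v)
        stars[(v, w)].add(u)
    # keep only the neighbour sets; centers are relabelled on reassembly
    return [frozenset(nbrs) for nbrs in stars.values()]

def generate_surrogate(stars, nodes, n_windows, stars_per_window=3, seed=0):
    """Reassemble sampled stars into a surrogate temporal edge list,
    assigning fresh (anonymised) centers and neighbours per window."""
    rng = random.Random(seed)
    surrogate = []
    for w in range(n_windows):
        for _ in range(stars_per_window):
            star = rng.choice(stars)
            center = rng.choice(nodes)
            others = [n for n in nodes if n != center]
            for v in rng.sample(others, min(len(star), len(others))):
                surrogate.append((center, v, w))
    return surrogate
```

The surrogate can be made arbitrarily large by increasing `n_windows` or the node set, which is what makes this style of building-block generation scalable; how the real method matches star sizes and preserves temporal correlations is specified in the paper, not here.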
Related papers
- State-Space Modeling in Long Sequence Processing: A Survey on Recurrence in the Transformer Era [59.279784235147254]
This survey provides an in-depth summary of the latest approaches that are based on recurrent models for sequential data processing.
The emerging picture suggests that there is room for novel routes, constituted by learning algorithms that depart from standard Backpropagation Through Time.
arXiv Detail & Related papers (2024-06-13T12:51:22Z) - Semantic Loss Functions for Neuro-Symbolic Structured Prediction [74.18322585177832]
We discuss the semantic loss, which injects knowledge about such structure, defined symbolically, into training.
It is agnostic to the arrangement of the symbols, and depends only on the semantics expressed thereby.
It can be combined with both discriminative and generative neural models.
arXiv Detail & Related papers (2024-05-12T22:18:25Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - DyCSC: Modeling the Evolutionary Process of Dynamic Networks Based on
Cluster Structure [1.005130974691351]
We propose a novel temporal network embedding method named Dynamic Cluster Structure Constraint model (DyCSC)
DyCSC captures the evolution of temporal networks by imposing a temporal constraint on the tendency of the nodes in the network to a given number of clusters.
It consistently outperforms competing methods by significant margins in multiple temporal link prediction tasks.
arXiv Detail & Related papers (2022-10-23T10:23:08Z) - Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weigh the edges of this graph based on the difference in time between interactions.
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
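The construction described in this summary can be sketched in a few lines (a simplified illustration assuming an exponential decay; the paper's exact weighting function may differ):

```python
import math
from itertools import combinations

def time_decayed_line_graph(edges, decay=1.0):
    """Build the line graph of a temporal network: one node per
    timestamped interaction (u, v, t); two interactions that share an
    endpoint are linked, with a weight decaying in their time gap."""
    line_nodes = list(edges)          # each interaction becomes a node
    line_edges = {}
    for a, b in combinations(range(len(line_nodes)), 2):
        u1, v1, t1 = line_nodes[a]
        u2, v2, t2 = line_nodes[b]
        if {u1, v1} & {u2, v2}:       # interactions share an endpoint
            line_edges[(a, b)] = math.exp(-decay * abs(t1 - t2))
    return line_nodes, line_edges
```

Because every interaction is a node of the derived graph, standard static-graph machinery can then be applied directly to edge classification and link prediction.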
arXiv Detail & Related papers (2022-09-30T18:24:13Z) - Temporal Network Embedding via Tensor Factorization [13.490625417640658]
The embeddings of temporal networks should encode both graph-structured information and the temporally evolving pattern.
Existing approaches in learning temporally evolving network representations fail to capture the temporal interdependence.
We propose Toffee, a novel approach for temporal network representation learning based on tensor decomposition.
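As a generic illustration of tensor-based temporal network embedding (a plain CP/ALS sketch, not Toffee's actual algorithm): stack the adjacency matrix of each snapshot into a (node x node x time) tensor and factorize it, so that the rows of the node factor serve as embeddings.

```python
import numpy as np

def temporal_tensor(edges, n, T):
    """Stack the adjacency matrix of each snapshot into a 3-way tensor."""
    X = np.zeros((n, n, T))
    for u, v, t in edges:
        X[u, v, t] = X[v, u, t] = 1.0
    return X

def unfold(X, mode):
    """Matricize a 3-way tensor along the given mode."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(P, Q):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', P, Q).reshape(-1, P.shape[1])

def cp_als(X, rank, n_iter=100, seed=0):
    """Alternating least squares for a rank-r CP decomposition
    X[i,j,k] ~= sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((d, rank)) for d in X.shape)
    for _ in range(n_iter):
        # solve for each factor with the other two held fixed
        A = np.linalg.lstsq(khatri_rao(B, C), unfold(X, 0).T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), unfold(X, 1).T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), unfold(X, 2).T, rcond=None)[0].T
    return A, B, C
```

The temporal factor `C` captures how interaction patterns evolve across snapshots, which is the interdependence that plain per-snapshot embeddings miss.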
arXiv Detail & Related papers (2021-08-22T20:50:38Z) - PRESTO: Simple and Scalable Sampling Techniques for the Rigorous
Approximation of Temporal Motif Counts [7.025709586759655]
We present an efficient and scalable algorithm to obtain rigorous approximations of the count of temporal motifs.
Our algorithm is based on a simple but effective sampling approach, which renders our algorithm practical for very large datasets.
arXiv Detail & Related papers (2021-01-18T16:35:12Z) - Inductive Representation Learning in Temporal Networks via Causal
Anonymous Walks [51.79552974355547]
Temporal networks serve as abstractions of many real-world dynamic systems.
We propose Causal Anonymous Walks (CAWs) to inductively represent a temporal network.
CAWs are extracted by temporal random walks and work as automatic retrieval of temporal network motifs.
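A simplified, hypothetical sketch of the two ingredients named here, backward-in-time random walks and walk anonymisation (the actual CAW method uses set-based relative identities across multiple walks; this first-appearance relabelling is only a stand-in):

```python
import random
from collections import defaultdict

def temporal_walk(adj, start, t_start, length, rng):
    """Sample a walk that moves strictly backward in time: each step
    follows an interaction earlier than the previous one.
    adj maps node -> list of (neighbour, timestamp) interactions."""
    walk, node, t = [(start, t_start)], start, t_start
    for _ in range(length - 1):
        earlier = [(v, s) for v, s in adj[node] if s < t]
        if not earlier:
            break
        node, t = rng.choice(earlier)
        walk.append((node, t))
    return walk

def anonymize(walk):
    """Replace node identities with their order of first appearance,
    so the walk describes a motif rather than specific nodes."""
    ids, out = {}, []
    for node, t in walk:
        ids.setdefault(node, len(ids))
        out.append((ids[node], t))
    return out
```

Stripping identities this way is what makes the representation inductive: a model trained on anonymised walks can score nodes it never saw during training.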
arXiv Detail & Related papers (2021-01-15T05:47:26Z) - EPNE: Evolutionary Pattern Preserving Network Embedding [26.06068388979255]
We propose EPNE, a temporal network embedding model preserving evolutionary patterns of the local structure of nodes.
With the adequate modeling of temporal information, our model is able to outperform other competitive methods in various prediction tasks.
arXiv Detail & Related papers (2020-09-24T06:31:14Z) - TempNodeEmb: Temporal Node Embedding considering temporal edge influence
matrix [0.8941624592392746]
Predicting future links among the nodes in temporal networks reveals an important aspect of the evolution of temporal networks.
Some approaches consider a simplified representation of temporal networks, but rely on high-dimensional and generally sparse matrices.
We propose a new node embedding technique which exploits the evolving nature of the networks considering a simple three-layer graph neural network at each time step.
arXiv Detail & Related papers (2020-08-16T15:39:07Z)