EPNE: Evolutionary Pattern Preserving Network Embedding
- URL: http://arxiv.org/abs/2009.11510v1
- Date: Thu, 24 Sep 2020 06:31:14 GMT
- Title: EPNE: Evolutionary Pattern Preserving Network Embedding
- Authors: Junshan Wang, Yilun Jin, Guojie Song, Xiaojun Ma
- Abstract summary: We propose EPNE, a temporal network embedding model preserving evolutionary patterns of the local structure of nodes.
With adequate modeling of temporal information, our model is able to outperform other competitive methods in various prediction tasks.
- Score: 26.06068388979255
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Information networks are ubiquitous and are ideal for modeling relational
data. Networks being sparse and irregular, network embedding algorithms have
caught the attention of many researchers, who have come up with numerous embedding
algorithms for static networks. Yet in real life, networks constantly evolve
over time. Hence, evolutionary patterns, namely how nodes develop over
time, would serve as a powerful complement to static structures in embedding
networks, on which relatively few works focus. In this paper, we propose EPNE,
a temporal network embedding model preserving evolutionary patterns of the
local structure of nodes. In particular, we analyze evolutionary patterns with
and without periodicity and design strategies correspondingly to model such
patterns in time-frequency domains based on causal convolutions. In addition,
we propose a temporal objective function which is optimized simultaneously with
proximity ones such that both temporal and structural information are
preserved. With adequate modeling of temporal information, our model is
able to outperform other competitive methods in various prediction tasks.
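To make the mechanism concrete, here is a minimal sketch, assuming PyTorch and invented shapes, of the two ingredients the abstract names: a causal convolution over a node's temporal feature sequence, and a hypothetical joint objective combining temporal and proximity terms. It illustrates the idea only and is not the authors' implementation.
```python
# Minimal sketch (not the authors' code); shapes and alpha are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    """1D convolution that sees only past time steps (left-side padding)."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):              # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))  # pad left: no future leak

def joint_loss(proximity_loss, temporal_loss, alpha=0.5):
    # "optimized simultaneously with proximity ones"; alpha is hypothetical.
    return proximity_loss + alpha * temporal_loss

h = CausalConv1d(16, 32, kernel_size=3)(torch.randn(8, 16, 20))  # (8, 32, 20)
```
Periodic patterns could additionally be examined in the frequency domain, e.g. via torch.fft.rfft over the time axis, echoing the abstract's time-frequency modeling.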
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
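As a hedged illustration of how such a periodicity comparison could be made measurable (not the paper's analysis), the dominant period of any signal, input or activation, can be estimated with an FFT:
```python
# Illustrative only: estimate a signal's dominant period from its spectrum.
import numpy as np

def dominant_period(signal, dt=1.0):
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    return 1.0 / peak

t = np.arange(0, 100, 0.1)
x = np.sin(2 * np.pi * t / 5.0)                 # period-5 input
print(dominant_period(x, dt=0.1))               # ~5.0
```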
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
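A toy sketch of the component-wise idea, with an assumed trend + seasonal + residual split rather than SCNN's actual generative process:
```python
# Toy decomposition in the spirit of "model each component individually".
import numpy as np

def decompose(series, period):
    t = np.arange(len(series))
    trend = np.poly1d(np.polyfit(t, series, deg=1))(t)       # linear trend
    detrended = series - trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(seasonal, len(series) // period + 1)[:len(series)]
    residual = series - trend - seasonal                     # what remains
    return trend, seasonal, residual
```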
arXiv Detail & Related papers (2023-05-22T13:39:44Z)
- PDSketch: Integrated Planning Domain Programming and Learning [86.07442931141637]
We present a new domain definition language, named PDSketch.
It allows users to flexibly define high-level structures in the transition models.
Details of the transition model will be filled in by trainable neural networks.
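As a loose analogy (plain PyTorch, not the PDSketch language itself), the user might fix the high-level transition structure while a trainable network fills in the details:
```python
# Loose analogy only: user-defined structure, learned details.
import torch
import torch.nn as nn

class Transition(nn.Module):
    def __init__(self, state_dim, action_dim):
        super().__init__()
        # The "detail" left to learning: how an action perturbs the state.
        self.effect = nn.Sequential(
            nn.Linear(state_dim + action_dim, 64), nn.ReLU(),
            nn.Linear(64, state_dim))

    def forward(self, state, action):
        # User-declared structure: next state = state + learned effect.
        return state + self.effect(torch.cat([state, action], dim=-1))
```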
arXiv Detail & Related papers (2023-03-09T18:54:12Z)
- DyCSC: Modeling the Evolutionary Process of Dynamic Networks Based on Cluster Structure [1.005130974691351]
We propose a novel temporal network embedding method named Dynamic Cluster Structure Constraint model (DyCSC)
DyCSC captures the evolution of temporal networks by imposing a temporal constraint on the tendency of the nodes in the network to a given number of clusters.
It consistently outperforms competing methods by significant margins in multiple temporal link prediction tasks.
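One possible reading of that constraint, sketched with invented names: penalize drift in soft cluster assignments between consecutive snapshots.
```python
# Hedged sketch of a temporal cluster-tendency constraint (not DyCSC's code).
import torch

def cluster_drift_loss(emb_t, emb_t1, centers):
    # Soft assignments: softmax over negative distances to K cluster centers.
    def assign(z):
        d = torch.cdist(z, centers)              # (nodes, K) distances
        return torch.softmax(-d, dim=-1)
    # L1 drift of assignments between snapshots t and t+1, per node.
    return torch.norm(assign(emb_t) - assign(emb_t1), p=1) / emb_t.shape[0]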
arXiv Detail & Related papers (2022-10-23T10:23:08Z)
- Graph-Survival: A Survival Analysis Framework for Machine Learning on Temporal Networks [14.430635608400982]
We propose a framework for designing generative models for continuous time temporal networks.
We propose a fitting method for models within this framework, and an algorithm for simulating new temporal networks having desired properties.
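A minimal sketch of survival-style generation under an assumed constant hazard per node pair (the framework itself is more general):
```python
# Constant hazard per pair => exponential waiting times between edge events.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, horizon = 5, 10.0
events = []
for u in range(n_nodes):
    for v in range(u + 1, n_nodes):
        rate, t = rng.uniform(0.1, 1.0), 0.0   # arbitrary stand-in hazard
        while True:
            t += rng.exponential(1.0 / rate)   # next event for this pair
            if t > horizon:
                break
            events.append((t, u, v))
events.sort()                                  # a continuous-time edge stream
```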
arXiv Detail & Related papers (2022-03-14T16:40:57Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
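A hedged sketch of the gating shape such closed-form cells are reported to use in place of an ODE solver; heads and sizes are assumptions:
```python
# Sketch: state at elapsed time t is a learned, time-dependent interpolation
# between two network heads, with no numerical solver in the loop.
import torch
import torch.nn as nn

class CfCCellSketch(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.f = nn.Linear(dim, dim)   # controls the time constant
        self.g = nn.Linear(dim, dim)   # "early" head
        self.h = nn.Linear(dim, dim)   # "late" head

    def forward(self, x, t):
        gate = torch.sigmoid(-self.f(x) * t)   # closes as t grows
        return gate * self.g(x) + (1 - gate) * self.h(x)
```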
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
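The attention-based aggregation ingredient can be sketched with PyTorch's stock multi-head attention (illustrative only, not Radflow's code):
```python
# Aggregate the flow of influence from neighbors with multi-head attention.
import torch
import torch.nn as nn

attn = nn.MultiheadAttention(embed_dim=32, num_heads=4, batch_first=True)
node = torch.randn(1, 1, 32)        # query: one node's time-t embedding
neighbors = torch.randn(1, 6, 32)   # keys/values: its 6 neighbors at time t
influence, weights = attn(node, neighbors, neighbors)  # (1, 1, 32), (1, 1, 6)
```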
arXiv Detail & Related papers (2021-02-15T00:57:28Z)
- TempNodeEmb: Temporal Node Embedding considering temporal edge influence matrix [0.8941624592392746]
Predicting future links among the nodes in temporal networks reveals an important aspect of the evolution of temporal networks.
Some approaches consider simplified representations of temporal networks, but these representations are high-dimensional and generally sparse matrices.
We propose a new node embedding technique which exploits the evolving nature of the networks considering a simple three-layer graph neural network at each time step.
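A rough sketch of per-snapshot propagation, with the learnable weight matrices omitted for brevity:
```python
# Three rounds of normalized neighborhood averaging per snapshot; a stand-in
# for "a simple three-layer graph neural network at each time step".
import numpy as np

def snapshot_embedding(adj, feats, layers=3):
    a_hat = adj + np.eye(len(adj))             # add self-loops
    d_inv = np.diag(1.0 / a_hat.sum(axis=1))
    prop = d_inv @ a_hat                       # row-normalized propagation
    h = feats
    for _ in range(layers):
        h = np.tanh(prop @ h)                  # one propagation layer
    return h
```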
arXiv Detail & Related papers (2020-08-16T15:39:07Z)
- Link Prediction for Temporally Consistent Networks [6.981204218036187]
Link prediction estimates the next relationship in dynamic networks.
The use of adjacency matrices to represent dynamically evolving networks limits the ability to analytically learn from heterogeneous, sparse, or forming networks.
We propose a new method of canonically representing heterogeneous time-evolving activities as a temporally parameterized network model.
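One concrete (assumed) reading of a temporally parameterized model: store per-edge parameters of a function of time instead of adjacency snapshots. The logistic form below is an illustrative choice, not the paper's.
```python
# Edge strength as a parametric function of time rather than a matrix stack.
import math

edge_params = {(0, 1): (4.0, 2.0)}   # (onset time t0, steepness k) per edge

def edge_strength(u, v, t):
    t0, k = edge_params[(u, v)]
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))  # ramps up after t0

print(edge_strength(0, 1, 5.0))      # edge mostly "formed" by t = 5
```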
arXiv Detail & Related papers (2020-06-06T07:28:03Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
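A compressed sketch of that recipe under assumed layer shapes: attend over per-edge-type embeddings, then run an RNN across snapshots. Not DyHATR's actual architecture.
```python
# Attention over edge types per snapshot, then a GRU over time.
import torch
import torch.nn as nn

class HetTemporalSketch(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.att = nn.Linear(dim, 1)             # scores each edge-type embedding
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, x):                        # x: (nodes, time, edge_types, dim)
        w = torch.softmax(self.att(x), dim=2)    # attention over edge types
        per_step = (w * x).sum(dim=2)            # (nodes, time, dim)
        out, _ = self.rnn(per_step)              # evolve across snapshots
        return out[:, -1]                        # final temporal embedding
```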
arXiv Detail & Related papers (2020-04-01T17:16:47Z)