Link Prediction for Temporally Consistent Networks
- URL: http://arxiv.org/abs/2006.03804v1
- Date: Sat, 6 Jun 2020 07:28:03 GMT
- Title: Link Prediction for Temporally Consistent Networks
- Authors: Mohamoud Ali, Yugyung Lee and Praveen Rao
- Abstract summary: Link prediction estimates the next relationship in dynamic networks.
The use of an adjacency matrix to represent dynamically evolving networks limits the ability to learn analytically from heterogeneous, sparse, or forming networks.
We propose a new method of canonically representing heterogeneous time-evolving activities as a temporally parameterized network model.
- Score: 6.981204218036187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic networks have intrinsic structural, computational, and
multidisciplinary advantages. Link prediction estimates the next relationship
in dynamic networks. However, in the current link prediction approaches, only
bipartite or non-bipartite but homogeneous networks are considered. The use of
an adjacency matrix to represent dynamically evolving networks limits the
ability to learn analytically from heterogeneous, sparse, or forming networks. In the
case of a heterogeneous network, modeling all network states using a
binary-valued matrix can be difficult. On the other hand, sparse or currently
forming networks have many missing edges, which are represented as zeros, thus
introducing class imbalance or noise. We propose a time-parameterized matrix
(TP-matrix) and empirically demonstrate its effectiveness in non-bipartite,
heterogeneous networks. In addition, we propose a predictive influence index as
a measure of a node's boosting or diminishing predictive influence, computed via
backward- and forward-looking maximization over the temporal space of the
n-degree neighborhood. We further propose a new method of canonically
representing heterogeneous time-evolving activities as a temporally
parameterized network model (TPNM). The new method robustly enables activities
to be represented as a form of a network, thus potentially inspiring new link
prediction applications, including intelligent business process management
systems and context-aware workflow engines. We evaluated our model on four
datasets of different network systems. We present results that show the
proposed model is more effective in capturing and retaining temporal
relationships in dynamically evolving networks. We also show that our model
performed better than state-of-the-art link prediction benchmark results for
networks that are sensitive to temporal evolution.
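The abstract's central contrast (a binary adjacency matrix discards timing and conflates missing edges with absent ones, while a time-parameterized matrix retains when each interaction occurred) can be illustrated with a small sketch. The function names, the NaN encoding for unobserved pairs, and the recency-decayed common-neighbor score below are illustrative assumptions for exposition, not the paper's actual TP-matrix construction or predictive influence index:

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): entries hold the
# timestamp of the most recent interaction between a node pair, and NaN
# marks pairs with no observed interaction, so "never seen" is
# distinguishable from a binary 0.

def build_tp_matrix(n_nodes, events):
    """events: iterable of (u, v, t) undirected interaction triples."""
    tp = np.full((n_nodes, n_nodes), np.nan)
    for u, v, t in events:
        # Keep the most recent interaction time for each pair.
        tp[u, v] = t if np.isnan(tp[u, v]) else max(tp[u, v], t)
        tp[v, u] = tp[u, v]
    return tp

def recency_score(tp, u, v, now, tau=1.0):
    """Score a candidate link (u, v) by exponentially decayed recency of
    shared neighbors -- a simple temporal analogue of common-neighbor
    counting, made possible because timestamps survive in the matrix."""
    score = 0.0
    for w in range(tp.shape[0]):
        if not np.isnan(tp[u, w]) and not np.isnan(tp[v, w]):
            age = 2 * now - tp[u, w] - tp[v, w]  # combined staleness
            score += np.exp(-age / tau)
    return score

events = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 2.5), (2, 3, 3.0)]
tp = build_tp_matrix(4, events)
# Nodes 0 and 3 share only neighbor 2; age = 6 - 2.5 - 3.0 = 0.5.
print(recency_score(tp, 0, 3, now=3.0))  # exp(-0.5) ≈ 0.607
```

A binary adjacency matrix built from the same events would give every common neighbor equal weight, losing exactly the temporal signal that the proposed TP-matrix is designed to retain.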
Related papers
- Contrastive Representation Learning for Dynamic Link Prediction in Temporal Networks [1.9389881806157312]
We introduce a self-supervised method for learning representations of temporal networks.
We propose a recurrent message-passing neural network architecture for modeling the information flow over time-respecting paths of temporal networks.
The proposed method is tested on Enron, COLAB, and Facebook datasets.
arXiv Detail & Related papers (2024-08-22T22:50:46Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way toward practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Piecewise-Velocity Model for Learning Continuous-time Dynamic Node Representations [0.0]
We propose the Piecewise-Velocity Model (PiVeM) for representing continuous-time dynamic networks.
We show that PiVeM can successfully represent network structure and dynamics in ultra-low two-dimensional spaces.
It outperforms relevant state-of-the-art methods in downstream tasks such as link prediction.
arXiv Detail & Related papers (2022-12-23T13:57:56Z)
- Link Prediction with Contextualized Self-Supervision [63.25455976593081]
Link prediction aims to infer the existence of a link between two nodes in a network.
Traditional link prediction algorithms are hindered by three major challenges -- link sparsity, node attribute noise and network dynamics.
We propose a Contextualized Self-Supervised Learning framework that fully exploits structural context prediction for link prediction.
arXiv Detail & Related papers (2022-01-25T03:12:32Z)
- Mitigating Performance Saturation in Neural Marked Point Processes: Architectures and Loss Functions [50.674773358075015]
We propose a simple graph-based network structure called GCHP, which utilizes only graph convolutional layers.
We show that GCHP can significantly reduce training time, and that a likelihood-ratio loss with interarrival-time probability assumptions can greatly improve model performance.
arXiv Detail & Related papers (2021-07-07T16:59:14Z)
- DynACPD Embedding Algorithm for Prediction Tasks in Dynamic Networks [6.5361928329696335]
We present novel embedding methods for a dynamic network based on higher order tensor decompositions for tensorial representations of the dynamic network.
We demonstrate the power and efficiency of our approach by comparing our algorithms' performance on the link prediction task against an array of current baseline methods.
arXiv Detail & Related papers (2021-03-12T04:36:42Z)
- TempNodeEmb: Temporal Node Embedding considering temporal edge influence matrix [0.8941624592392746]
Predicting future links among the nodes in temporal networks reveals an important aspect of the evolution of temporal networks.
Some approaches consider simplified representations of temporal networks, but these result in high-dimensional and generally sparse matrices.
We propose a new node embedding technique which exploits the evolving nature of the networks considering a simple three-layer graph neural network at each time step.
arXiv Detail & Related papers (2020-08-16T15:39:07Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
- Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives [73.15276998621582]
We propose a generic feature learning mechanism to advance CNN training with enhanced generalization ability.
Partially inspired by DSN, we fork delicately designed side branches from the intermediate layers of a given neural network.
Experiments on both category and instance recognition tasks demonstrate the substantial improvements of our proposed method.
arXiv Detail & Related papers (2020-03-24T09:56:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.