On a Bernoulli Autoregression Framework for Link Discovery and
Prediction
- URL: http://arxiv.org/abs/2007.11811v1
- Date: Thu, 23 Jul 2020 05:58:22 GMT
- Title: On a Bernoulli Autoregression Framework for Link Discovery and
Prediction
- Authors: Xiaohan Yan, Avleen S. Bijral
- Abstract summary: We present a dynamic prediction framework for binary sequences that is based on a Bernoulli generalization of the auto-regressive process.
We propose a novel problem that exploits additional information via a much larger sequence of auxiliary networks.
In contrast to existing work, our gradient-based estimation approach is highly efficient and can scale to networks with millions of nodes.
- Score: 1.9290392443571387
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a dynamic prediction framework for binary sequences that is based
on a Bernoulli generalization of the auto-regressive process. Our approach
lends itself easily to variants of the standard link prediction problem for a
sequence of time-dependent networks. Focusing on this dynamic network link
prediction/recommendation task, we propose a novel problem that exploits
additional information via a much larger sequence of auxiliary networks and has
important real-world relevance. To allow discovery of links that do not exist
in the available data, our model estimation framework introduces a
regularization term that trades off conventional link prediction against this
discovery task. In contrast to existing work, our stochastic-gradient-based
estimation approach is highly efficient and can scale to networks with millions
of nodes. We show extensive empirical results on actual product-usage-based
time-dependent networks and also present results on a Reddit-based dataset of
time-dependent sentiment sequences.
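The abstract does not give the model's exact functional form, so the sketch below is only a minimal reading of it: a logistic-link Bernoulli AR(1) model per candidate edge, an auxiliary-network term, and stochastic gradient descent over minibatches of edges. The function names, the AR order, the toy encoding of the auxiliary networks, and the penalty standing in for the paper's prediction-vs-discovery trade-off are all assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bernoulli_ar_loss(params, Y, A, lam):
    """Penalized negative log-likelihood of a logistic-link Bernoulli AR(1) edge model.

    Y   : (T, E) binary edge indicators over T snapshots and E candidate edges
    A   : (T, E) auxiliary-network signal aligned with Y (hypothetical encoding)
    lam : weight of a generic penalty standing in for the paper's
          prediction-vs-discovery trade-off term (assumed form)
    """
    alpha, beta, gamma = params
    logits = alpha + beta * Y[:-1] + gamma * A[:-1]        # AR(1) term + auxiliary term
    p = sigmoid(logits)
    nll = -np.mean(Y[1:] * np.log(p + 1e-12) + (1.0 - Y[1:]) * np.log(1.0 - p + 1e-12))
    return nll + lam * abs(beta)

def fit_sgd(Y, A, lam=0.1, lr=0.05, epochs=200, batch=512, seed=0):
    """Stochastic gradient descent over random minibatches of edges (sketch only)."""
    rng = np.random.default_rng(seed)
    params = np.zeros(3)
    n_edges = Y.shape[1]
    eps = 1e-5
    for _ in range(epochs):
        idx = rng.choice(n_edges, size=min(batch, n_edges), replace=False)
        grad = np.zeros_like(params)
        for k in range(len(params)):                       # finite differences for brevity;
            d = np.zeros_like(params); d[k] = eps          # the paper uses analytic gradients
            grad[k] = (bernoulli_ar_loss(params + d, Y[:, idx], A[:, idx], lam)
                       - bernoulli_ar_loss(params - d, Y[:, idx], A[:, idx], lam)) / (2 * eps)
        params -= lr * grad
    return params

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    Y = (rng.random((20, 1000)) < 0.05).astype(float)      # toy binary edge sequences
    A = (rng.random((20, 1000)) < 0.10).astype(float)      # toy auxiliary-network signal
    alpha, beta, gamma = fit_sgd(Y, A)
    scores = sigmoid(alpha + beta * Y[-1] + gamma * A[-1]) # rank edges for the next snapshot
    print(scores[:5])
```

Ranking the per-edge probabilities yields link recommendations; because the auxiliary term can place probability on edges never observed in the training window, the lam-weighted penalty controls how far the model moves from pure autoregressive prediction toward discovery, loosely mirroring the trade-off the abstract describes.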
Related papers
- Contrastive Representation Learning for Dynamic Link Prediction in Temporal Networks [1.9389881806157312]
We introduce a self-supervised method for learning representations of temporal networks.
We propose a recurrent message-passing neural network architecture for modeling the information flow over time-respecting paths of temporal networks.
The proposed method is tested on Enron, COLAB, and Facebook datasets.
arXiv Detail & Related papers (2024-08-22T22:50:46Z)
- Temporal Link Prediction Using Graph Embedding Dynamics [0.0]
Temporal link prediction in dynamic networks is of particular interest due to its potential for solving complex scientific and real-world problems.
Traditional approaches to temporal link prediction have focused on aggregating the dynamics of the network into a unified output.
We propose a novel perspective on temporal link prediction by defining nodes as Newtonian objects and incorporating the concept of velocity to predict network dynamics.
arXiv Detail & Related papers (2024-01-15T07:35:29Z)
- An Adaptive Framework for Generalizing Network Traffic Prediction towards Uncertain Environments [51.99765487172328]
We have developed a new framework using time-series analysis for dynamically assigning mobile network traffic prediction models.
Our framework employs learned behaviors, outperforming any single model with over a 50% improvement relative to current studies.
arXiv Detail & Related papers (2023-11-30T18:58:38Z)
- Adaptive Pseudo-Siamese Policy Network for Temporal Knowledge Prediction [37.36680021388575]
We propose a novel adaptive pseudo-siamese policy network for temporal knowledge prediction based on reinforcement learning.
In sub-policy network I, the agent searches for the answer to the query along entity-relation paths to capture the static evolutionary patterns.
In sub-policy network II, the agent searches for the answer to the query along relation-time paths to deal with unseen entities.
arXiv Detail & Related papers (2022-04-26T02:17:39Z)
- Interpretable Social Anchors for Human Trajectory Forecasting in Crowds [84.20437268671733]
We propose a neural network-based system to predict human trajectory in crowds.
We learn interpretable rule-based intents, and then utilise the expressibility of neural networks to model the scene-specific residual.
Our architecture is tested on the interaction-centric benchmark TrajNet++.
arXiv Detail & Related papers (2021-05-07T09:22:34Z)
- Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
- Link Prediction for Temporally Consistent Networks [6.981204218036187]
Link prediction estimates the next relationship in dynamic networks.
The use of an adjacency matrix to represent dynamically evolving networks limits the ability to analytically learn from heterogeneous, sparse, or forming networks.
We propose a new method of canonically representing heterogeneous time-evolving activities as a temporally parameterized network model.
arXiv Detail & Related papers (2020-06-06T07:28:03Z)
- A machine learning approach for forecasting hierarchical time series [4.157415305926584]
We propose a machine learning approach for forecasting hierarchical time series.
Forecast reconciliation is the process of adjusting forecasts to make them coherent across the hierarchy.
We exploit the ability of a deep neural network to extract information capturing the structure of the hierarchy.
arXiv Detail & Related papers (2020-05-31T22:26:16Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators (see the sketch after this list for the consistency idea).
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
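The Consistent Koopman Autoencoder entry above hinges on coupling forward and backward linear dynamics in a learned latent space. The sketch below illustrates that consistency idea only; the layer widths, the operator names C and D, and the loss weights are illustrative assumptions rather than the published architecture.

```python
import torch
import torch.nn as nn

class ConsistentKoopmanAE(nn.Module):
    """Autoencoder with linear forward (C) and backward (D) latent dynamics (sketch)."""
    def __init__(self, dim_x=64, dim_z=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim_x, 128), nn.Tanh(), nn.Linear(128, dim_z))
        self.dec = nn.Sequential(nn.Linear(dim_z, 128), nn.Tanh(), nn.Linear(128, dim_x))
        self.C = nn.Linear(dim_z, dim_z, bias=False)   # forward Koopman operator
        self.D = nn.Linear(dim_z, dim_z, bias=False)   # backward Koopman operator

    def loss(self, x_t, x_next):
        z_t, z_next = self.enc(x_t), self.enc(x_next)
        fwd = nn.functional.mse_loss(self.dec(self.C(z_t)), x_next)   # predict forward in time
        bwd = nn.functional.mse_loss(self.dec(self.D(z_next)), x_t)   # predict backward in time
        rec = nn.functional.mse_loss(self.dec(z_t), x_t)              # plain reconstruction
        eye = torch.eye(self.C.weight.shape[0])
        consistency = ((self.C.weight @ self.D.weight - eye).pow(2).mean()
                       + (self.D.weight @ self.C.weight - eye).pow(2).mean())
        return rec + fwd + bwd + 0.1 * consistency                    # assumed weighting

# toy usage: one gradient step on random snapshot pairs
model = ConsistentKoopmanAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_t, x_next = torch.randn(32, 64), torch.randn(32, 64)
opt.zero_grad(); model.loss(x_t, x_next).backward(); opt.step()
```

The consistency term asks the forward and backward operators to be approximate inverses of each other, which is the property the entry's "interplay between consistent dynamics and their associated Koopman operators" refers to.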
This list is automatically generated from the titles and abstracts of the papers in this site.