DynACPD Embedding Algorithm for Prediction Tasks in Dynamic Networks
- URL: http://arxiv.org/abs/2103.07080v1
- Date: Fri, 12 Mar 2021 04:36:42 GMT
- Title: DynACPD Embedding Algorithm for Prediction Tasks in Dynamic Networks
- Authors: Chris Connell and Yang Wang
- Abstract summary: We present novel embedding methods for a dynamic network based on higher order tensor decompositions for tensorial representations of the dynamic network.
We demonstrate the power and efficiency of our approach by comparing our algorithms' performance on the link prediction task against an array of current baseline methods.
- Score: 6.5361928329696335
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Classical network embeddings create a low dimensional representation of the
learned relationships between features across nodes. Such embeddings are
important for tasks such as link prediction and node classification. In the
current paper, we consider low-dimensional embeddings of dynamic networks, that
is, families of time-varying networks in which both temporal and spatial link
relationships exist between nodes. We present novel embedding methods
for a dynamic network based on higher order tensor decompositions for tensorial
representations of the dynamic network. In one sense, our embeddings are
analogous to spectral embedding methods for static networks. We provide a
rationale for our algorithms via a mathematical analysis of some potential
reasons for their effectiveness. Finally, we demonstrate the power and
efficiency of our approach by comparing our algorithms' performance on the link
prediction task against an array of current baseline methods across three
distinct real-world dynamic networks.
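The abstract's core idea — stacking the network's adjacency snapshots into a node × node × time tensor and factorizing it so that the factor matrices serve as node embeddings — can be illustrated with a plain-NumPy CP (CANDECOMP/PARAFAC) decomposition fit by alternating least squares. This is a minimal sketch of the general technique, not the authors' DynACPD algorithm; the function name, dimensions, and toy data below are all hypothetical.

```python
import numpy as np

def cp_als(X, rank, n_iter=200, seed=0):
    """Rank-R CP decomposition X[i, j, t] ~ sum_r A[i,r] * B[j,r] * C[t,r],
    fit by alternating least squares (ALS). Illustrative only."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    ridge = 1e-9 * np.eye(rank)  # tiny regularizer for numerical stability
    for _ in range(n_iter):
        # Each update solves the normal equations of a least-squares problem;
        # the einsum computes the matricized-tensor-times-Khatri-Rao product.
        A = np.linalg.solve((B.T @ B) * (C.T @ C) + ridge,
                            np.einsum('ijk,jr,kr->ri', X, B, C)).T
        B = np.linalg.solve((A.T @ A) * (C.T @ C) + ridge,
                            np.einsum('ijk,ir,kr->rj', X, A, C)).T
        C = np.linalg.solve((A.T @ A) * (B.T @ B) + ridge,
                            np.einsum('ijk,ir,jr->rk', X, A, B)).T
    return A, B, C

# Toy dynamic network: T random adjacency snapshots stacked into a tensor.
rng = np.random.default_rng(1)
n_nodes, T, rank = 8, 5, 3
snapshots = (rng.random((n_nodes, n_nodes, T)) < 0.3).astype(float)
A, B, C = cp_als(snapshots, rank)

# A (and B) give rank-dimensional node embeddings; C captures temporal modes.
# Link scores for snapshot t are the rank-R reconstruction A @ diag(C[t]) @ B.T.
scores_t0 = A @ np.diag(C[0]) @ B.T
```

In this reading the embeddings play a role analogous to spectral embeddings of a static graph: inner products of factor rows score candidate links, and the temporal factor C separates time variation from node structure.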
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Accelerating Dynamic Network Embedding with Billions of Parameter Updates to Milliseconds [27.98359191399847]
We propose a novel dynamic network embedding paradigm that rotates and scales the axes of embedding space instead of a node-by-node update.
Specifically, we propose the Dynamic Adjacency Matrix Factorization (DAMF) algorithm, which achieves an efficient and accurate dynamic network embedding.
Experiments of node classification, link prediction, and graph reconstruction on different-sized dynamic graphs suggest that DAMF advances dynamic network embedding.
arXiv Detail & Related papers (2023-06-15T09:02:17Z)
- Backpropagation on Dynamical Networks [0.0]
We propose a network inference method based on the backpropagation through time (BPTT) algorithm commonly used to train recurrent neural networks.
An approximation of local node dynamics is first constructed using a neural network.
Free-run prediction performance of the resulting local models and inferred weights was found to be comparable to that of the true system.
arXiv Detail & Related papers (2022-07-07T05:22:44Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We also provide a benchmark pipeline for evaluating temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Learning Autonomy in Management of Wireless Random Networks [102.02142856863563]
This paper presents a machine learning strategy that tackles a distributed optimization task in a wireless network with an arbitrary number of randomly interconnected nodes.
We develop a flexible deep neural network formalism termed distributed message-passing neural network (DMPNN) with forward and backward computations independent of the network topology.
arXiv Detail & Related papers (2021-06-15T09:03:28Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Rather than routing every input through the same fixed path, DG-Net aggregates features dynamically at each node, giving the network greater representational power.
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
- MODEL: Motif-based Deep Feature Learning for Link Prediction [23.12527010960999]
We propose a novel embedding algorithm that incorporates network motifs to capture higher-order structures in the network.
Experiments were conducted on three types of networks: social networks, biological networks, and academic networks.
Our algorithm outperforms both the traditional similarity-based algorithms by 20% and the state-of-the-art embedding-based algorithms by 19%.
arXiv Detail & Related papers (2020-08-09T03:39:28Z)
- Link Prediction for Temporally Consistent Networks [6.981204218036187]
Link prediction estimates the next relationship in a dynamic network.
Representing dynamically evolving networks by an adjacency matrix limits the ability to analytically learn from heterogeneous, sparse, or forming networks.
We propose a new method of canonically representing heterogeneous time-evolving activities as a temporally parameterized network model.
arXiv Detail & Related papers (2020-06-06T07:28:03Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed as DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.