Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey
- URL: http://arxiv.org/abs/2005.07496v2
- Date: Sun, 13 Jun 2021 07:05:05 GMT
- Title: Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey
- Authors: Joakim Skarding, Bogdan Gabrys and Katarzyna Musial
- Abstract summary: We establish a foundation of dynamic networks with consistent, detailed terminology and notation.
We present a comprehensive survey of dynamic graph neural network models using the proposed terminology.
- Score: 11.18312489268624
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic networks are used in a wide range of fields, including social network
analysis, recommender systems, and epidemiology. Representing complex networks
as structures that change over time allows network models to leverage not only
structural but also temporal patterns. However, as the dynamic network literature
stems from diverse fields and makes use of inconsistent terminology, it is
challenging to navigate. Meanwhile, graph neural networks (GNNs) have gained a
lot of attention in recent years for their ability to perform well on a range
of network science tasks, such as link prediction and node classification.
Despite the popularity of graph neural networks and the proven benefits of
dynamic network models, there has been little focus on graph neural networks
for dynamic networks. To address the challenges that arise because this
research crosses diverse fields, and to survey dynamic graph neural networks,
this work is split into two main parts. First, to address the ambiguity of
dynamic network terminology, we establish a foundation of dynamic networks
with consistent, detailed terminology and notation. Second, we present a
comprehensive survey of dynamic graph neural network models using the
proposed terminology.
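Since the survey's first contribution is a consistent encoding of dynamic networks, a small illustrative sketch may help fix ideas. The snippet below is not from the paper, and the names `EdgeEvent` and `snapshot_at` are invented here; it contrasts the discrete-time representation (an ordered sequence of snapshots) with the continuous-time representation (a stream of timestamped edge events) and shows how the latter can be coarsened into the former.

```python
# Illustrative sketch only (not from the survey): two common encodings
# of a dynamic network.
from dataclasses import dataclass

# Discrete-time representation: an ordered sequence of graph snapshots,
# each an edge set observed during one time window.
snapshots = [
    {("a", "b"), ("b", "c")},              # window t = 0
    {("a", "b"), ("c", "d")},              # window t = 1
    {("b", "c"), ("c", "d"), ("a", "d")},  # window t = 2
]

# Continuous-time representation: a stream of timestamped edge events.
@dataclass(frozen=True)
class EdgeEvent:          # hypothetical helper, not the survey's notation
    u: str
    v: str
    t: float              # event timestamp

events = [
    EdgeEvent("a", "b", 0.1),
    EdgeEvent("b", "c", 0.4),
    EdgeEvent("c", "d", 1.2),
    EdgeEvent("a", "d", 2.7),
]

def snapshot_at(events, t_start, t_end):
    """Coarsen the event stream into one snapshot for [t_start, t_end)."""
    return {(e.u, e.v) for e in events if t_start <= e.t < t_end}

print(snapshot_at(events, 0.0, 1.0))  # edges observed in [0, 1)
```

Note the asymmetry: snapshots can always be derived from an event stream, but exact event times cannot be recovered from snapshots, which is why continuous-time representations are the more general of the two.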
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Image segmentation with traveling waves in an exactly solvable recurrent neural network [71.74150501418039]
We show that a recurrent neural network can effectively divide an image into groups according to a scene's structural characteristics.
We present a precise description of the mechanism underlying object segmentation in this network.
We then demonstrate a simple algorithm for object segmentation that generalizes across inputs ranging from simple geometric objects in grayscale images to natural images.
arXiv Detail & Related papers (2023-11-28T16:46:44Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- On the effectiveness of neural priors in modeling dynamical systems [28.69155113611877]
We discuss the architectural regularization that neural networks offer when learning such systems.
We show that simple coordinate networks with few layers can be used to solve multiple problems in modelling dynamical systems.
arXiv Detail & Related papers (2023-03-10T06:21:24Z)
- Deep Neural Networks as Complex Networks [1.704936863091649]
We use Complex Network Theory to represent Deep Neural Networks (DNNs) as directed weighted graphs.
We introduce metrics to study DNNs as dynamical systems, with a granularity that spans from weights to layers, including neurons.
We show that our metrics discriminate low vs. high performing networks.
arXiv Detail & Related papers (2022-09-12T16:26:04Z)
- Characterizing Learning Dynamics of Deep Neural Networks via Complex Networks [1.0869257688521987]
Complex Network Theory (CNT) represents Deep Neural Networks (DNNs) as directed weighted graphs to study them as dynamical systems.
We introduce metrics for nodes/neurons and layers, namely Nodes Strength and Layers Fluctuation.
Our framework distills trends in the learning dynamics and separates low- from high-accuracy networks (a toy sketch of the Nodes Strength metric follows this entry).
arXiv Detail & Related papers (2021-10-06T10:03:32Z)
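As a concrete illustration of the Nodes Strength metric mentioned above, here is a minimal, hypothetical sketch (not the authors' code). Node strength is commonly defined in Complex Network Theory as the sum of the edge weights incident to a node; using absolute weight values is an assumption made here.

```python
# Hypothetical sketch: Nodes Strength for one DNN layer viewed as a
# directed weighted bipartite graph (4 input neurons -> 3 output neurons).
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))  # weight matrix of a 4-in, 3-out linear layer

# Node strength = sum of incident edge weights (absolute values assumed).
in_strength = np.abs(W).sum(axis=0)   # per output neuron (incoming edges)
out_strength = np.abs(W).sum(axis=1)  # per input neuron (outgoing edges)

print(in_strength)   # shape (3,)
print(out_strength)  # shape (4,)
```

- Dynamic Network Embedding Survey [11.742863376032112]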
We give a survey of dynamic network embedding in this paper.
We present two basic data models for dynamic networks, namely the discrete model and the continuous model.
We build a taxonomy that refines the category hierarchy by typical learning models.
arXiv Detail & Related papers (2021-03-29T09:27:53Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns; a toy sketch of this temporal-attention step follows this entry.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
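To make the two-stage pattern above concrete, here is a toy numpy sketch of temporal attention over per-snapshot node embeddings. It is a generic illustration of the idea behind methods like DyHATR, not the authors' implementation, and all names and shapes are invented here.

```python
# Toy sketch of temporal attention over snapshot embeddings (not DyHATR's code).
import numpy as np

rng = np.random.default_rng(1)
T, d = 3, 8                        # 3 snapshots, embedding dimension 8
# Per-snapshot embeddings of one node; in DyHATR these would come from
# hierarchical attention over the heterogeneous snapshot graphs.
z = rng.normal(size=(T, d))

# Score each snapshot against the latest one, softmax over time, then
# pool the history into a single evolution-aware embedding.
scores = z @ z[-1] / np.sqrt(d)            # scaled dot products, shape (T,)
weights = np.exp(scores - scores.max())
alpha = weights / weights.sum()            # attention weights over time steps
h = alpha @ z                              # pooled embedding, shape (d,)
```

- Analyzing Neural Networks Based on Random Graphs [77.34726150561087]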
We perform a massive evaluation of neural networks with architectures corresponding to random graphs of various types.
We find that no classical numerical graph invariant by itself suffices to single out the best-performing networks (a short networkx sketch of such invariants follows this entry).
We also find that networks with primarily short-range connections perform better than networks which allow for many long-range connections.
arXiv Detail & Related papers (2020-02-19T11:04:49Z)
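For readers who want to reproduce the flavour of such an analysis, the following sketch computes two classical graph invariants for a few random-graph families with networkx. It is illustrative only; the graph sizes and parameters are arbitrary choices, not those of the study above.

```python
# Illustrative sketch: classical graph invariants for a few random-graph
# families, of the kind the study above relates to network performance.
import networkx as nx

graphs = {
    "erdos_renyi": nx.erdos_renyi_graph(64, 0.1, seed=0),
    "watts_strogatz": nx.watts_strogatz_graph(64, 6, 0.3, seed=0),
    "barabasi_albert": nx.barabasi_albert_graph(64, 3, seed=0),
}
for name, g in graphs.items():
    clustering = nx.average_clustering(g)
    # Average shortest path length is only defined on connected graphs.
    path_len = (nx.average_shortest_path_length(g)
                if nx.is_connected(g) else float("nan"))
    print(f"{name}: clustering={clustering:.3f}, avg_path={path_len:.3f}")
```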
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.