Inference for Network Structure and Dynamics from Time Series Data via
Graph Neural Network
- URL: http://arxiv.org/abs/2001.06576v1
- Date: Sat, 18 Jan 2020 02:05:54 GMT
- Title: Inference for Network Structure and Dynamics from Time Series Data via
Graph Neural Network
- Authors: Mengyuan Chen, Jiang Zhang, Zhang Zhang, Lun Du, Qiao Hu, Shuo Wang,
Jiaqi Zhu
- Abstract summary: We propose a novel data-driven deep learning model called Gumbel Graph Network (GGN) to solve two kinds of network inference problems: Network Reconstruction and Network Completion.
Our method can reconstruct up to 100% of the network structure on the network reconstruction task, and it can infer the unknown parts of the structure with up to 90% accuracy when some nodes are missing.
- Score: 21.047133113979083
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network structures play important roles in social, technological,
and biological systems. However, the observable network structures in real
cases are often incomplete or unavailable due to measurement errors or privacy
protection issues. Therefore, inferring the complete network
structure is useful for understanding complex systems. The existing studies
have not fully solved the problem of inferring network structure with partial
or no information about connections or nodes. In this paper, we tackle the
problem by utilizing time series data generated by network dynamics. We regard
the network inference problem based on dynamical time series data as a problem
of minimizing the error in predicting future states, and we propose a novel
data-driven deep learning model called Gumbel Graph Network (GGN) to solve the
two kinds of network inference problems: Network Reconstruction and Network
Completion. For the network reconstruction problem, the GGN framework includes
two modules: the dynamics learner and the network generator. For the network
completion problem, GGN adds a new module called the States Learner to infer
missing parts of the network. We carried out experiments on discrete and
continuous time series data. The experiments show that our method can
reconstruct up to 100% of the network structure on the network reconstruction
task, and that it can infer the unknown parts of the structure with up to 90%
accuracy when some nodes are missing; this accuracy decays as the fraction of
missing nodes increases. Our framework may find wide application in areas where
the network structure is hard to obtain but time series data are rich.
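A minimal sketch of the two-module idea, for readers who want something concrete. This is an illustration in PyTorch under assumptions, not the authors' released code: the module names (NetworkGenerator, DynamicsLearner), layer sizes, and the toy training loop are hypothetical, and only the overall scheme follows the abstract, i.e., a generator samples a candidate adjacency matrix with the Gumbel-softmax trick while a graph-network dynamics learner is trained to minimize the error of predicting the next state.

import torch
import torch.nn as nn
import torch.nn.functional as F

class NetworkGenerator(nn.Module):
    # Hypothetical module: learns one 2-way logit (edge / no-edge) per ordered
    # node pair and samples a differentiable adjacency matrix via Gumbel-softmax.
    def __init__(self, num_nodes, tau=1.0):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_nodes, num_nodes, 2))
        self.tau = tau

    def forward(self):
        sample = F.gumbel_softmax(self.logits, tau=self.tau, hard=False)
        return sample[..., 0]  # soft adjacency, shape (num_nodes, num_nodes)

class DynamicsLearner(nn.Module):
    # Hypothetical graph-network step: builds messages on every candidate edge,
    # weights them by the sampled adjacency, and predicts each node's next state.
    def __init__(self, state_dim, hidden_dim=64):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * state_dim, hidden_dim), nn.ReLU())
        self.node_mlp = nn.Sequential(nn.Linear(state_dim + hidden_dim, hidden_dim),
                                      nn.ReLU(), nn.Linear(hidden_dim, state_dim))

    def forward(self, x, adj):  # x: (num_nodes, state_dim)
        n = x.size(0)
        pairs = torch.cat([x.unsqueeze(1).expand(n, n, -1),
                           x.unsqueeze(0).expand(n, n, -1)], dim=-1)
        messages = self.edge_mlp(pairs) * adj.unsqueeze(-1)
        return self.node_mlp(torch.cat([x, messages.sum(dim=1)], dim=-1))

# Joint training on one observed transition (x_t -> x_next): both modules are
# updated by minimizing the one-step state-prediction error. Data is random here.
num_nodes, state_dim = 10, 4
generator = NetworkGenerator(num_nodes)
dynamics = DynamicsLearner(state_dim)
optimizer = torch.optim.Adam(list(generator.parameters()) +
                             list(dynamics.parameters()), lr=1e-3)
x_t = torch.randn(num_nodes, state_dim)
x_next = torch.randn(num_nodes, state_dim)
for _ in range(200):
    optimizer.zero_grad()
    adj = generator()
    loss = F.mse_loss(dynamics(x_t, adj), x_next)
    loss.backward()
    optimizer.step()

For the completion task, the States Learner mentioned in the abstract could analogously be a set of trainable state vectors for the unobserved nodes, optimized with the same prediction loss; that module is omitted from this sketch.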
Related papers
- Generalization emerges from local optimization in a self-organized learning network [0.0]
We design and analyze a new paradigm for building supervised learning networks, driven only by local optimization rules without relying on a global error function.
Our network stores new knowledge in the nodes accurately and instantaneously, in the form of a lookup table.
We show on numerous examples of classification tasks that the networks generated by our algorithm systematically reach a state of perfect generalization when the number of learned examples becomes sufficiently large.
We report on the dynamics of the change of state and show that it is abrupt and has the distinctive characteristics of a first order phase transition, a phenomenon already observed for traditional learning networks and known as grokking.
arXiv Detail & Related papers (2024-10-03T15:32:08Z)
- Formal Verification of Graph Convolutional Networks with Uncertain Node Features and Uncertain Graph Structure [7.133681867718039]
Graph neural networks are becoming increasingly popular in the field of machine learning.
They have been applied in safety-critical environments where perturbations inherently occur.
This research addresses this verification gap by preserving the dependencies of all elements in the underlying computations.
arXiv Detail & Related papers (2024-04-23T14:12:48Z)
- Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend the residual neural network (ResNet) to general Riemannian manifolds.
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Recovering the Graph Underlying Networked Dynamical Systems under Partial Observability: A Deep Learning Approach [7.209528581296429]
We study the problem of graph structure identification, i.e., of recovering the graph of dependencies among time series.
We devise a new feature vector computed from the observed time series and prove that these features are linearly separable.
We use these features to train Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2022-08-08T20:32:28Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
- Graph Prototypical Networks for Few-shot Learning on Attributed Networks [72.31180045017835]
We propose a graph meta-learning framework, Graph Prototypical Networks (GPN).
GPN is able to perform meta-learning on an attributed network and derive a highly generalizable model for handling the target classification task.
arXiv Detail & Related papers (2020-06-23T04:13:23Z)
- Neural networks adapting to datasets: learning network size and topology [77.34726150561087]
We introduce a flexible setup allowing for a neural network to learn both its size and topology during the course of a gradient-based training.
The resulting network has the structure of a graph tailored to the particular learning task and dataset.
arXiv Detail & Related papers (2020-06-22T12:46:44Z)
- Detecting structural perturbations from time series with deep learning [0.0]
We present a graph neural network approach to infer structural perturbations from functional time series.
We show our data-driven approach outperforms typical reconstruction methods.
This work uncovers a practical avenue to study the resilience of real-world complex systems.
arXiv Detail & Related papers (2020-06-09T13:08:40Z)
- Temporal Network Representation Learning via Historical Neighborhoods Aggregation [28.397309507168128]
We propose the Embedding via Historical Neighborhoods Aggregation (EHNA) algorithm.
We first propose a temporal random walk that can identify relevant nodes in historical neighborhoods.
Then we apply a deep learning model which uses a custom attention mechanism to induce node embeddings.
arXiv Detail & Related papers (2020-03-30T04:18:48Z)