Detecting structural perturbations from time series with deep learning
- URL: http://arxiv.org/abs/2006.05232v1
- Date: Tue, 9 Jun 2020 13:08:40 GMT
- Title: Detecting structural perturbations from time series with deep learning
- Authors: Edward Laurence, Charles Murphy, Guillaume St-Onge, Xavier
Roy-Pomerleau, and Vincent Thibeault
- Abstract summary: We present a graph neural network approach to infer structural perturbations from functional time series.
We show our data-driven approach outperforms typical reconstruction methods.
This work uncovers a practical avenue to study the resilience of real-world complex systems.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Small disturbances can trigger functional breakdowns in complex systems. A
challenging task is to infer the structural cause of a disturbance in a
networked system, soon enough to prevent a catastrophe. We present a graph
neural network approach, borrowed from the deep learning paradigm, to infer
structural perturbations from functional time series. We show our data-driven
approach outperforms typical reconstruction methods while meeting the accuracy
of Bayesian inference. We validate the versatility and performance of our
approach with epidemic spreading, population dynamics, and neural dynamics, on
various network structures: random networks, scale-free networks, 25 real
food-web systems, and the C. elegans connectome. Moreover, we report that our
approach is robust to data corruption. This work uncovers a practical avenue to
study the resilience of real-world complex systems.
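The abstract describes the approach only at a high level. As a rough illustration of the kind of pipeline it refers to, here is a minimal sketch (in PyTorch) of a graph neural network that reads node-level time series on a known network and scores each node for a structural perturbation. The architecture, dimensions, and toy training step below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (illustrative only, not the paper's architecture):
# a one-layer message-passing network mapping node time series on a
# known graph to a per-node "perturbed / not perturbed" score.
import torch
import torch.nn as nn

class PerturbationDetector(nn.Module):
    def __init__(self, n_timesteps, hidden=32):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(n_timesteps, hidden), nn.ReLU())
        self.message = nn.Linear(hidden, hidden)   # neighbour feature transform
        self.readout = nn.Linear(2 * hidden, 1)    # self + neighbourhood -> score

    def forward(self, x, adj):
        # x:   (n_nodes, n_timesteps) functional time series, one row per node
        # adj: (n_nodes, n_nodes) adjacency of the unperturbed network
        h = self.encode(x)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        neigh = adj @ self.message(h) / deg        # mean over neighbours
        return self.readout(torch.cat([h, neigh], dim=1)).squeeze(-1)

# Toy usage: random graph, stand-in time series, random binary labels.
n, t = 20, 50
adj = (torch.rand(n, n) < 0.2).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(0)
x = torch.randn(n, t)                   # placeholder for simulated dynamics
labels = (torch.rand(n) < 0.3).float()  # 1 = node whose links were perturbed

model = PerturbationDetector(n_timesteps=t)
loss = nn.BCEWithLogitsLoss()(model(x, adj), labels)
loss.backward()                         # trains like any supervised classifier
```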
Related papers
- Joint trajectory and network inference via reference fitting [0.0]
We propose an approach for leveraging both dynamical and perturbational single cell data to jointly learn cellular trajectories and power network inference.
Our approach is motivated by min-entropy estimation for dynamics and can infer directed and signed networks from time-stamped single cell snapshots.
arXiv Detail & Related papers (2024-09-10T21:49:57Z)
- TDNetGen: Empowering Complex Network Resilience Prediction with Generative Augmentation of Topology and Dynamics [14.25304439234864]
We introduce a novel resilience prediction framework for complex networks, designed to tackle this issue through generative data augmentation of network topology and dynamics.
Experiment results on three network datasets demonstrate that our proposed framework TDNetGen can achieve high prediction accuracy up to 85%-95%.
arXiv Detail & Related papers (2024-08-19T09:20:31Z)
- Disentangling the Causes of Plasticity Loss in Neural Networks [55.23250269007988]
We show that loss of plasticity can be decomposed into multiple independent mechanisms.
We show that a combination of layer normalization and weight decay is highly effective at maintaining plasticity in a variety of synthetic nonstationary learning tasks.
arXiv Detail & Related papers (2024-02-29T00:02:33Z)
- Critical Learning Periods for Multisensory Integration in Deep Networks [112.40005682521638]
We show that the ability of a neural network to integrate information from diverse sources hinges critically on being exposed to properly correlated signals during the early phases of training.
We show that critical periods arise from the complex and unstable early transient dynamics, which are decisive for the final performance of the trained system and its learned representations.
arXiv Detail & Related papers (2022-10-06T23:50:38Z)
- The Neural Race Reduction: Dynamics of Abstraction in Gated Networks [12.130628846129973]
We introduce the Gated Deep Linear Network framework that schematizes how pathways of information flow impact learning dynamics.
We derive an exact reduction and, for certain cases, exact solutions to the dynamics of learning.
Our work gives rise to general hypotheses relating neural architecture to learning and provides a mathematical approach towards understanding the design of more complex architectures.
arXiv Detail & Related papers (2022-07-21T12:01:03Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
- Deep learning of contagion dynamics on complex networks [0.0]
We propose a complementary approach based on deep learning to build effective models of contagion dynamics on networks.
By allowing simulations on arbitrary network structures, our approach makes it possible to explore the properties of the learned dynamics beyond the training data.
Our results demonstrate how deep learning offers a new and complementary perspective to build effective models of contagion dynamics on networks.
arXiv Detail & Related papers (2020-06-09T17:18:34Z)
- Inference for Network Structure and Dynamics from Time Series Data via Graph Neural Network [21.047133113979083]
We propose a novel data-driven deep learning model called Gumbel Graph Network (GGN) to solve the two kinds of network inference problems: Network Reconstruction and Network Completion.
Our method can reconstruct up to 100% of the network structure on the network reconstruction task, and it can infer the unknown parts of the structure with up to 90% accuracy when some nodes are missing.
arXiv Detail & Related papers (2020-01-18T02:05:54Z)
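The Gumbel Graph Network entry above relies on making the discrete adjacency matrix differentiable so that the network structure can be fitted by gradient descent. Below is a minimal, hypothetical sketch of that single ingredient, a Gumbel-softmax relaxation of the adjacency matrix, in PyTorch; it is not the GGN authors' code, and the names and shapes are assumptions.

```python
# Illustrative sketch of Gumbel-softmax adjacency sampling, the core trick
# behind Gumbel-Graph-Network-style reconstruction (not the authors' code).
import torch
import torch.nn.functional as F

n_nodes = 10
# One learnable logit pair ("edge absent", "edge present") per node pair.
edge_logits = torch.nn.Parameter(torch.zeros(n_nodes, n_nodes, 2))

def sample_adjacency(tau=0.5):
    # Differentiable, near-one-hot relaxation of a binary adjacency matrix.
    soft = F.gumbel_softmax(edge_logits, tau=tau, hard=True)  # (n, n, 2)
    adj = soft[..., 1]                                        # "edge present" slot
    return adj * (1 - torch.eye(n_nodes))                     # drop self-loops

# In a reconstruction loop, the sampled adjacency would parameterise a learned
# dynamics model whose one-step predictions are compared with the observed
# time series; the reconstruction loss then backpropagates into edge_logits.
adj = sample_adjacency()
print(adj.shape, adj.sum().item())
```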
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.