Learning Continuous Network Emerging Dynamics from Scarce Observations
via Data-Adaptive Stochastic Processes
- URL: http://arxiv.org/abs/2310.16466v1
- Date: Wed, 25 Oct 2023 08:44:05 GMT
- Title: Learning Continuous Network Emerging Dynamics from Scarce Observations
via Data-Adaptive Stochastic Processes
- Authors: Jiaxu Cui, Bingyi Sun, Jiming Liu, Bo Yang
- Abstract summary: We introduce Neural ODE Processes for Network Dynamics (NDP4ND), a new class of stochastic processes governed by data-adaptive network dynamics.
We show that the proposed method has excellent data adaptability and computational efficiency, and can adapt to unseen network emerging dynamics.
- Score: 11.494631894700253
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning network dynamics from the empirical structure and spatio-temporal
observation data is crucial to revealing the interaction mechanisms of complex
networks in a wide range of domains. However, most existing methods learn only
the dynamic behaviors generated by a specific ordinary differential equation
instance, so they are ineffective for new instances and generally require dense
observations. Observations, especially of network emerging dynamics, are often
difficult to obtain, which hinders model learning. Therefore, how to learn
accurate network dynamics
with sparse, irregularly-sampled, partial, and noisy observations remains a
fundamental challenge. We introduce Neural ODE Processes for Network Dynamics
(NDP4ND), a new class of stochastic processes governed by stochastic
data-adaptive network dynamics, to overcome the challenge and learn continuous
network dynamics from scarce observations. Extensive experiments conducted on
various network dynamics in ecological population evolution, phototaxis
movement, brain activity, epidemic spreading, and real-world empirical systems,
demonstrate that the proposed method has excellent data adaptability and
computational efficiency, and can adapt to unseen network emerging dynamics,
producing accurate interpolation and extrapolation while reducing the ratio of
required observation data to only about 6% and improving the learning speed
for new dynamics by three orders of magnitude.
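To make the modelling idea concrete, here is a minimal, illustrative sketch of a neural-ODE-process-style model on a network: a set-style encoder aggregates scarce (time, state) context observations into a latent variable, and a latent-conditioned, graph-coupled vector field with additive self and interaction terms is integrated forward to decode trajectories. The module names, layer sizes, and the explicit-Euler integrator below are assumptions for illustration only, not the authors' NDP4ND implementation.

```python
# Illustrative sketch only (not the authors' NDP4ND code): encode scarce
# (time, state) context points into a latent variable, then decode by
# integrating a graph-coupled ODE conditioned on that latent.
import torch
import torch.nn as nn


class ContextEncoder(nn.Module):
    """Aggregate irregular (t, x) observations into a latent Gaussian."""

    def __init__(self, state_dim, latent_dim, hidden=64):
        super().__init__()
        self.point_net = nn.Sequential(
            nn.Linear(state_dim + 1, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        self.to_mu = nn.Linear(hidden, latent_dim)
        self.to_logvar = nn.Linear(hidden, latent_dim)

    def forward(self, times, states):  # times: (T,), states: (T, N, state_dim)
        t_feat = times[:, None, None].expand(-1, states.size(1), 1)
        feats = self.point_net(torch.cat([states, t_feat], dim=-1))
        pooled = feats.mean(dim=(0, 1))  # permutation-invariant pooling over points
        return self.to_mu(pooled), self.to_logvar(pooled)


class GraphODEFunc(nn.Module):
    """dx_i/dt = f(x_i, z) + sum_j A_ij g(x_i, x_j, z): self plus interaction terms."""

    def __init__(self, state_dim, latent_dim, hidden=64):
        super().__init__()
        self.self_net = nn.Sequential(
            nn.Linear(state_dim + latent_dim, hidden), nn.Tanh(), nn.Linear(hidden, state_dim)
        )
        self.inter_net = nn.Sequential(
            nn.Linear(2 * state_dim + latent_dim, hidden), nn.Tanh(), nn.Linear(hidden, state_dim)
        )

    def forward(self, x, adj, z):  # x: (N, state_dim), adj: (N, N), z: (latent_dim,)
        n = x.size(0)
        self_term = self.self_net(torch.cat([x, z.expand(n, -1)], dim=-1))
        xi = x[:, None, :].expand(n, n, -1)
        xj = x[None, :, :].expand(n, n, -1)
        pair = torch.cat([xi, xj, z.expand(n, n, -1)], dim=-1)
        interaction = (adj[..., None] * self.inter_net(pair)).sum(dim=1)
        return self_term + interaction


def rollout(func, x0, adj, z, t_grid):
    """Decode a trajectory with a simple explicit-Euler integration of the ODE."""
    xs, x = [x0], x0
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        x = x + (t1 - t0) * func(x, adj, z)
        xs.append(x)
    return torch.stack(xs)  # (len(t_grid), N, state_dim)
```

At training time one would sample z from the encoder's (mu, logvar) with the reparameterisation trick, integrate rollout over the query times, and fit the decoded trajectory to target observations; the actual NDP4ND model adds further ingredients, such as stochastic data-adaptive dynamics, that this sketch does not capture.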
Related papers
- Dynamical stability and chaos in artificial neural network trajectories along training [3.379574469735166]
We study the dynamical properties of this process by analyzing through this lens the network trajectories of a shallow neural network.
We find hints of regular and chaotic behavior depending on the learning rate regime.
This work also contributes to the cross-fertilization of ideas between dynamical systems theory, network theory and machine learning.
arXiv Detail & Related papers (2024-04-08T17:33:11Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Do We Need an Encoder-Decoder to Model Dynamical Systems on Networks? [18.92828441607381]
We show that embeddings induce a model that fits observations well but simultaneously has incorrect dynamical behaviours.
We propose a simple embedding-free alternative based on parametrising two additive vector-field components.
arXiv Detail & Related papers (2023-05-20T12:41:47Z)
- Decomposed Linear Dynamical Systems (dLDS) for learning the latent components of neural dynamics [6.829711787905569]
We propose a new decomposed dynamical system model that represents complex non-stationary and nonlinear dynamics of time series data.
Our model is trained through a dictionary learning procedure, where we leverage recent results in tracking sparse vectors over time.
In both continuous-time and discrete-time instructional examples, we demonstrate that our model approximates the original system well.
arXiv Detail & Related papers (2022-06-07T02:25:38Z)
- Learning Individual Interactions from Population Dynamics with Discrete-Event Simulation Model [9.827590402695341]
We explore the possibility of learning a discrete-event simulation representation of complex system dynamics.
Our results show that the algorithm can data-efficiently capture complex network dynamics in several fields with meaningful events.
arXiv Detail & Related papers (2022-05-04T21:33:56Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models (a generic sketch of this combination is given after the list below).
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
- Deep learning of contagion dynamics on complex networks [0.0]
We propose a complementary approach based on deep learning to build effective models of contagion dynamics on networks.
By allowing simulations on arbitrary network structures, our approach makes it possible to explore the properties of the learned dynamics beyond the training data.
Our results demonstrate how deep learning offers a new and complementary perspective to build effective models of contagion dynamics on networks.
arXiv Detail & Related papers (2020-06-09T17:18:34Z)
- The large learning rate phase of deep learning: the catapult mechanism [50.23041928811575]
We present a class of neural networks with solvable training dynamics.
We find good agreement between our model's predictions and training dynamics in realistic deep learning settings.
We believe our results shed light on characteristics of models trained at different learning rates.
arXiv Detail & Related papers (2020-03-04T17:52:48Z)
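For the physics-informed entry above (EINNs), here is a generic, hedged sketch of how a mechanistic epidemic model can be combined with a neural network: the network maps time to SIR compartments and is trained with a data-fitting term plus a penalty on the residual of assumed SIR equations. The network shape, the beta/gamma rates, and the loss weighting below are illustrative assumptions, not the EINN paper's architecture.

```python
# Illustrative physics-informed sketch (not the EINN architecture): a neural
# net predicts (S, I, R) over time; autograd derivatives are penalised for
# violating assumed SIR dynamics alongside the usual data-fitting loss.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64),
                    nn.Tanh(), nn.Linear(64, 3), nn.Softplus())  # t -> (S, I, R)
beta, gamma = 0.3, 0.1  # assumed transmission / recovery rates for illustration


def physics_residual(t):
    """SIR residuals: dS/dt + beta*S*I, dI/dt - beta*S*I + gamma*I, dR/dt - gamma*I."""
    t = t.requires_grad_(True)                  # t: (B, 1) collocation times
    s, i, r = net(t).unbind(dim=-1)             # each: (B,)
    ds, di, dr = (torch.autograd.grad(c.sum(), t, create_graph=True)[0].squeeze(-1)
                  for c in (s, i, r))
    return torch.stack([ds + beta * s * i,
                        di - beta * s * i + gamma * i,
                        dr - gamma * i])


def loss(t_obs, y_obs, t_col, w_phys=1.0):
    data_term = ((net(t_obs) - y_obs) ** 2).mean()      # fit sparse observations
    phys_term = (physics_residual(t_col) ** 2).mean()   # respect the mechanistic model
    return data_term + w_phys * phys_term


# Example usage with illustrative shapes:
# t_obs: observed times (M, 1); y_obs: observed (S, I, R) fractions (M, 3)
# t_col = torch.linspace(0.0, 50.0, 200).reshape(-1, 1)
# loss(t_obs, y_obs, t_col).backward()
```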