Inferring the Graph of Networked Dynamical Systems under Partial
Observability and Spatially Colored Noise
- URL: http://arxiv.org/abs/2312.11324v1
- Date: Mon, 18 Dec 2023 16:19:07 GMT
- Title: Inferring the Graph of Networked Dynamical Systems under Partial
Observability and Spatially Colored Noise
- Authors: Augusto Santos, Diogo Rente, Rui Seabra, José M. F. Moura
- Abstract summary: In a Networked Dynamical System (NDS), each node is a system whose dynamics are coupled with the dynamics of neighboring nodes.
The underlying network is unknown in many applications and should be inferred from observed data.
- Score: 2.362288417229025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In a Networked Dynamical System (NDS), each node is a system whose dynamics
are coupled with the dynamics of neighboring nodes. The global dynamics
naturally builds on this network of couplings and it is often excited by a
noise input with nontrivial structure. The underlying network is unknown in
many applications and should be inferred from observed data. We assume: i)
Partial observability -- time series data is only available over a subset of
the nodes; ii) Input noise -- it is correlated across distinct nodes while
temporally independent, i.e., it is spatially colored. We present a feasibility
condition on the noise correlation structure wherein there exists a consistent
network inference estimator to recover the underlying fundamental dependencies
among the observed nodes. Further, we describe a structure identification
algorithm that exhibits competitive performance across distinct regimes of
network connectivity, observability, and noise correlation.
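The abstract does not spell out the dynamical model; as a rough illustration of the setting only, the sketch below assumes the linear NDS form x(t+1) = A x(t) + w(t) used in the authors' related work, with noise drawn from a non-diagonal covariance (spatially colored, temporally independent) and only a subset of the nodes recorded. All sizes and parameters are arbitrary placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (all sizes and parameters are arbitrary placeholders).
n = 20                       # total number of nodes in the NDS
observed = np.arange(10)     # partial observability: only these nodes are recorded

# Random sparse coupling matrix A, standing in for the unknown network, scaled
# so that the linear dynamics are stable.
A = (rng.random((n, n)) < 0.15).astype(float)
np.fill_diagonal(A, 0.0)
A = 0.9 * A / max(1.0, np.abs(np.linalg.eigvals(A)).max())

# Spatially colored, temporally independent input noise: a non-diagonal
# covariance Sigma correlates the noise across distinct nodes at each step.
B = rng.standard_normal((n, n))
Sigma = 0.5 * np.eye(n) + 0.05 * (B @ B.T)
L = np.linalg.cholesky(Sigma)

# Simulate the (assumed) linear NDS x(t+1) = A x(t) + w(t), w(t) ~ N(0, Sigma),
# recording only the observed subset of nodes.
T = 5000
x = np.zeros(n)
trajectory = np.empty((T, observed.size))
for t in range(T):
    x = A @ x + L @ rng.standard_normal(n)
    trajectory[t] = x[observed]

print(trajectory.shape)   # (5000, 10): the data available to the inference task
```

The inference problem is then to decide, from `trajectory` alone, which of the observed nodes are directly coupled in A, despite the cross-node noise correlations induced by Sigma.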
Related papers
- Learning the Causal Structure of Networked Dynamical Systems under
Latent Nodes and Structured Noise [2.362288417229025]
This paper considers learning the hidden causal network of a linear networked dynamical system (NDS) from the time series data at some of its nodes.
The dynamics of the NDS are driven by colored noise that generates spurious associations across pairs of nodes, rendering the problem much harder.
To address the challenge of noise correlation and partial observability, we assign to each pair of nodes a feature vector computed from the time series data of observed nodes.
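The summary above does not specify the features themselves. Purely as a hedged illustration, one common construction in this line of work is to stack empirical lagged cross-moments of each pair of observed series; the function name and lag choice below are placeholders, not the paper's definition.

```python
import numpy as np

def pairwise_lag_features(y, max_lag=5):
    """Illustrative per-pair features: for each ordered pair (i, j) of observed
    nodes, stack the empirical lag-k cross-moments of y_i(t) and y_j(t + k),
    k = 0..max_lag.  y has shape (T, m); the result has shape (m, m, max_lag + 1).
    """
    T, m = y.shape
    yc = y - y.mean(axis=0)                      # center each observed series
    feats = np.empty((m, m, max_lag + 1))
    for k in range(max_lag + 1):
        a, b = yc[: T - k], yc[k:]               # y(t) paired with y(t + k)
        feats[:, :, k] = (a.T @ b) / (T - k)     # empirical cross-moment at lag k
    return feats

# e.g. F = pairwise_lag_features(trajectory)     # shape (10, 10, 6) for the toy data above
```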
arXiv Detail & Related papers (2023-12-10T19:21:33Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Uncovering the Origins of Instability in Dynamical Systems: How Attention Mechanism Can Help? [0.0]
We show that attention should be directed toward the collective behaviour of imbalanced structures and polarity-driven structural instabilities within the network.
Our study provides a proof of concept to understand why perturbing some nodes of a network may cause dramatic changes in the network dynamics.
arXiv Detail & Related papers (2022-12-19T17:16:41Z)
- Recovering the Graph Underlying Networked Dynamical Systems under Partial Observability: A Deep Learning Approach [7.209528581296429]
We study the problem of graph structure identification, i.e., of recovering the graph of dependencies among time series.
We devise a new feature vector computed from the observed time series and prove that these features are linearly separable.
We use these features to train Convolutional Neural Networks (CNNs).
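The exact CNN architecture and training protocol are not given in this summary. The following is only a minimal PyTorch sketch of the idea of classifying each observed node pair as edge / no-edge from its feature vector, with synthetic placeholder data; names such as PairwiseEdgeCNN are illustrative, not from the paper.

```python
import torch
import torch.nn as nn

class PairwiseEdgeCNN(nn.Module):
    """Tiny 1-D CNN mapping a per-pair feature vector to an edge/no-edge logit."""
    def __init__(self, feat_len: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(8 * feat_len, 1),
        )

    def forward(self, x):            # x: (batch, feat_len)
        return self.net(x.unsqueeze(1)).squeeze(-1)

# Training sketch on synthetic placeholders (real inputs would be the pairwise
# features of observed nodes, labeled by the known edges of training graphs).
model = PairwiseEdgeCNN(feat_len=6)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
features = torch.randn(64, 6)                  # 64 node pairs, 6 features each
labels = torch.randint(0, 2, (64,)).float()    # 1 = edge present, 0 = absent
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    opt.step()
```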
arXiv Detail & Related papers (2022-08-08T20:32:28Z)
- Bayesian Inference of Stochastic Dynamical Networks [0.0]
This paper presents a novel method for learning network topology and internal dynamics.
Our method achieves state-of-the-art performance compared with group sparse Bayesian learning (GSBL), BINGO, kernel-based methods, dynGENIE3, GENIE3, and ARNI.
arXiv Detail & Related papers (2022-06-02T03:22:34Z)
- Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks [68.9026534589483]
RioGNN is a novel Reinforced, recursive and flexible neighborhood selection guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embedding with enhanced explainability due to the recognition of individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z)
- Node Similarity Preserving Graph Convolutional Networks [51.520749924844054]
Graph Neural Networks (GNNs) explore the graph structure and node features by aggregating and transforming information within node neighborhoods.
We propose SimP-GCN that can effectively and efficiently preserve node similarity while exploiting graph structure.
We validate the effectiveness of SimP-GCN on seven benchmark datasets including three assortative and four disassortative graphs.
arXiv Detail & Related papers (2020-11-19T04:18:01Z)
- Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
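The summary leaves both kernels unspecified; the sketch below is one possible reading of composing the neighbor-based kernel (the adjacency) with a learnable feature-similarity kernel, written in PyTorch as an illustration rather than the paper's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompositeKernelLayer(nn.Module):
    """One reading of 'neighbor kernel composed with a learnable feature kernel':
    aggregation weights = adjacency mask * RBF similarity of learned projections."""
    def __init__(self, in_dim: int, out_dim: int, proj_dim: int = 16):
        super().__init__()
        self.proj = nn.Linear(in_dim, proj_dim)   # learnable feature map for the kernel
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency (neighbor-based kernel)
        z = self.proj(x)
        sim = torch.exp(-torch.cdist(z, z) ** 2)              # learnable similarity kernel
        k = adj * sim                                          # compose the two kernels
        k = k / k.sum(dim=1, keepdim=True).clamp(min=1e-8)     # row-normalize the weights
        return F.relu(self.lin(k @ x))                         # aggregate, then transform
```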
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
- Feature-Attention Graph Convolutional Networks for Noise Resilient Learning [20.059242373860013]
We propose FA-GCN, a feature-attention graph convolution learning framework, to handle networks with noisy and sparse node content.
Experiments and validations, w.r.t. different noise levels, demonstrate that FA-GCN achieves better performance than state-of-the-art methods on both noise-free and noisy networks.
arXiv Detail & Related papers (2019-12-26T02:51:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.