Deep Dynamic Effective Connectivity Estimation from Multivariate Time Series
- URL: http://arxiv.org/abs/2202.02393v1
- Date: Fri, 4 Feb 2022 21:14:21 GMT
- Authors: Usman Mahmood, Zening Fu, Vince Calhoun, Sergey Plis
- Abstract summary: We develop dynamic effective connectivity estimation via neural network training (DECENNT)
DECENNT outperforms state-of-the-art (SOTA) methods on five different tasks and infers interpretable task-specific dynamic graphs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, methods that represent data as a graph, such as graph neural
networks (GNNs), have been successfully used to learn data representations and
structures to solve classification and link prediction problems. The
applications of such methods are vast and diverse, but most of the current work
relies on the assumption of a static graph. This assumption does not hold for
many highly dynamic systems, where the underlying connectivity structure is
non-stationary and is mostly unobserved. Using a static model in these
situations may result in sub-optimal performance. In contrast, modeling changes
in graph structure with time can provide information about the system whose
applications go beyond classification. Most work of this type does not learn
effective connectivity, focusing instead on cross-correlation between nodes to
generate undirected graphs. An undirected graph cannot capture the direction
of an interaction, which is vital in many fields, including neuroscience. To
bridge this gap, we developed dynamic effective connectivity estimation via
neural network training (DECENNT), a novel model to learn an interpretable
directed and dynamic graph induced by the downstream classification/prediction
task. DECENNT outperforms state-of-the-art (SOTA) methods on five different
tasks and infers interpretable task-specific dynamic graphs. The dynamic graphs
inferred from functional neuroimaging data align well with the existing
literature and provide additional information. Additionally, the temporal
attention module of DECENNT identifies the time intervals in multivariate time
series data that are most important for the downstream predictive task.
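The abstract's contrast between correlation-based connectivity and effective (directed) connectivity can be illustrated with a minimal sketch. This is not the paper's method; the toy data and array shapes are illustrative assumptions. The point it demonstrates: a cross-correlation matrix is symmetric by construction, so the graph it induces is necessarily undirected and cannot encode which node drives which.

```python
import numpy as np

# Toy multivariate time series: 5 nodes, 200 time points (hypothetical data,
# standing in for e.g. regional neuroimaging signals).
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 200))

# Pairwise Pearson correlation between node signals: a 5x5 adjacency matrix.
adj = np.corrcoef(x)

# corr(i, j) == corr(j, i) for every pair, so the induced graph is
# undirected; directionality of influence is lost by construction.
assert np.allclose(adj, adj.T)
print(adj.shape)  # (5, 5)
```

A directed (effective-connectivity) graph would instead require an asymmetric adjacency matrix, which is what DECENNT learns through its downstream classification/prediction task rather than computing from pairwise statistics.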
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling.
We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
arXiv Detail & Related papers (2024-05-22T19:10:24Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden temporal dependencies without a predefined graph structure.
The experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep learning-based methods on MTSC tasks.
arXiv Detail & Related papers (2023-04-11T09:21:28Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Continuous Temporal Graph Networks for Event-Based Graph Data [41.786721257905555]
We propose Continuous Temporal Graph Networks (CTGNs) to capture the continuous dynamics of temporal graph data.
The key idea is to use neural ordinary differential equations (ODEs) to characterize the continuous dynamics of node representations over dynamic graphs.
Experiment results on both transductive and inductive tasks demonstrate the effectiveness of our proposed approach.
arXiv Detail & Related papers (2022-05-31T16:17:02Z)
- Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received increasing attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
arXiv Detail & Related papers (2022-01-04T23:52:24Z)
- Dynamic Graph Learning-Neural Network for Multivariate Time Series Modeling [2.3022070933226217]
We propose a novel framework, namely the static- and dynamic-graph learning neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
arXiv Detail & Related papers (2021-12-06T08:19:15Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.