Brain dynamics via Cumulative Auto-Regressive Self-Attention
- URL: http://arxiv.org/abs/2111.01271v1
- Date: Mon, 1 Nov 2021 21:50:35 GMT
- Title: Brain dynamics via Cumulative Auto-Regressive Self-Attention
- Authors: Usman Mahmood, Zening Fu, Vince Calhoun, Sergey Plis
- Abstract summary: We present a model that is considerably shallower than deep graph neural networks (GNNs).
Our model learns the autoregressive structure of individual time series and estimates directed connectivity graphs.
We demonstrate our results on a functional neuroimaging dataset classifying schizophrenia patients and controls.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate dynamical processes can often be intuitively described by a
weighted connectivity graph between components representing each individual
time-series. Even a simple representation of this graph as a Pearson
correlation matrix may be informative and predictive as demonstrated in the
brain imaging literature. However, there is a consensus expectation that
powerful graph neural networks (GNNs) should perform better in similar
settings. In this work, we present a model that is considerably shallower than
deep GNNs, yet outperforms them in predictive accuracy in a brain imaging
application. Our model learns the autoregressive structure of individual time
series and estimates directed connectivity graphs between the learned
representations via a self-attention mechanism in an end-to-end fashion. The
supervised training of the model as a classifier between patients and controls
results in a model that generates directed connectivity graphs and highlights
the components of the time-series that are predictive for each subject. We
demonstrate our results on a functional neuroimaging dataset classifying
schizophrenia patients and controls.
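The architecture the abstract describes can be pictured with a short sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the authors' released implementation: it assumes a shared GRU as the per-region autoregressive encoder and a single-head self-attention layer whose row-normalized weights play the role of the directed connectivity graph; the layer sizes, mean pooling, and class count are all illustrative. (The simple Pearson-correlation baseline the abstract mentions would be numpy.corrcoef over the same per-region time series.)

```python
# Minimal sketch (assumptions, not the paper's code): a shared GRU stands in
# for the per-region autoregressive encoder, and one self-attention layer over
# the learned region embeddings produces a directed connectivity matrix that
# is pooled into a patient/control classifier.
import torch
import torch.nn as nn

class ARSelfAttentionClassifier(nn.Module):
    def __init__(self, d_model: int = 64, n_classes: int = 2):
        super().__init__()
        # Autoregressive encoder shared across regions (hypothetical choice).
        self.encoder = nn.GRU(input_size=1, hidden_size=d_model, batch_first=True)
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, regions, time), one time series per brain component.
        b, r, t = x.shape
        _, h_n = self.encoder(x.reshape(b * r, t, 1))   # final hidden state
        h = h_n.squeeze(0).reshape(b, r, -1)            # one embedding per region
        q, k, v = self.query(h), self.key(h), self.value(h)
        # Row-normalized attention acts as a directed graph between regions:
        # in general attn[:, i, j] != attn[:, j, i].
        attn = torch.softmax(q @ k.transpose(1, 2) / h.shape[-1] ** 0.5, dim=-1)
        pooled = (attn @ v).mean(dim=1)                 # graph-level representation
        return self.classifier(pooled), attn            # logits + connectivity

# Toy usage: 4 subjects, 53 components, 140 time points (shapes assumed).
model = ARSelfAttentionClassifier()
logits, graph = model(torch.randn(4, 53, 140))
print(logits.shape, graph.shape)  # torch.Size([4, 2]) torch.Size([4, 53, 53])
```

Training a model of this shape with cross-entropy on subject labels would, as the abstract notes, both classify subjects and expose a per-subject directed connectivity graph (here, the attention matrix).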
Related papers
- DBGDGM: Dynamic Brain Graph Deep Generative Model [63.23390833353625]
Graphs are a natural representation of brain activity derived from functional magnetic resonance imaging (fMRI) data.
It is well known that clusters of anatomical brain regions, known as functional connectivity networks (FCNs), encode temporal relationships which can serve as useful biomarkers for understanding brain function and dysfunction.
Previous works, however, ignore the temporal dynamics of the brain and focus on static graphs.
We propose a dynamic brain graph deep generative model (DBGDGM) which simultaneously clusters brain regions into temporally evolving communities and learns dynamic unsupervised node embeddings.
arXiv Detail & Related papers (2023-01-26T20:45:30Z) - Neural Graphical Models [2.6842860806280058]
We introduce Neural Graphical Models (NGMs) to represent complex feature dependencies with reasonable computational costs.
We capture the dependency structure between the features along with their complex function representations by using a neural network as a multi-task learning framework.
NGMs can fit generic graph structures including directed, undirected and mixed-edge graphs as well as support mixed input data types.
arXiv Detail & Related papers (2022-10-02T07:59:51Z) - DynDepNet: Learning Time-Varying Dependency Structures from fMRI Data
via Dynamic Graph Structure Learning [58.94034282469377]
We propose DynDepNet, a novel method for learning the optimal time-varying dependency structure of fMRI data induced by downstream prediction tasks.
Experiments on real-world fMRI datasets, for the task of sex classification, demonstrate that DynDepNet achieves state-of-the-art results.
arXiv Detail & Related papers (2022-09-27T16:32:11Z) - Contrastive Brain Network Learning via Hierarchical Signed Graph Pooling
Model [64.29487107585665]
Graph representation learning techniques on brain functional networks can facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
Here, we propose an interpretable hierarchical signed graph representation learning model to extract graph-level representations from brain functional networks.
In order to further improve the model performance, we also propose a new strategy to augment functional brain network data for contrastive learning.
arXiv Detail & Related papers (2022-07-14T20:03:52Z) - Learning the Evolutionary and Multi-scale Graph Structure for
Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolutions to capture scale-specific correlations.
A unified neural network integrates the above components to produce the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z) - Deep Dynamic Effective Connectivity Estimation from Multivariate Time
Series [0.0]
We develop dynamic effective connectivity estimation via neural network training (DECENNT).
DECENNT outperforms state-of-the-art (SOTA) methods on five different tasks and infers interpretable task-specific dynamic graphs.
arXiv Detail & Related papers (2022-02-04T21:14:21Z) - Dynamic Adaptive Spatio-temporal Graph Convolution for fMRI Modelling [0.0]
We propose a dynamic adaptive spatio-temporal graph convolution (DASTGCN) model to overcome the shortcomings of pre-defined static correlation-based graph structures.
The proposed approach allows end-to-end inference of dynamic connections between brain regions via a layer-wise graph structure learning module.
We evaluate our pipeline on the UKBiobank for age and gender classification tasks from resting-state functional scans.
arXiv Detail & Related papers (2021-09-26T07:19:47Z) - Learning Dynamic Graph Representation of Brain Connectome with
Spatio-Temporal Attention [33.049423523704824]
We propose STAGIN, a method for learning dynamic graph representations of the brain connectome with spatio-temporal attention.
Experiments on the HCP-Rest and the HCP-Task datasets demonstrate exceptional performance of our proposed method.
arXiv Detail & Related papers (2021-05-27T23:06:50Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.