A Graph Neural Network Framework for Causal Inference in Brain Networks
- URL: http://arxiv.org/abs/2010.07143v1
- Date: Wed, 14 Oct 2020 15:01:21 GMT
- Title: A Graph Neural Network Framework for Causal Inference in Brain Networks
- Authors: Simon Wein, Wilhelm Malloni, Ana Maria Tomé, Sebastian M. Frank,
Gina-Isabelle Henze, Stefan Wüst, Mark W. Greenlee, Elmar W. Lang
- Abstract summary: A central question in neuroscience is how self-organizing dynamic interactions in the brain emerge on its relatively static structural backbone.
We present a graph neural network (GNN) framework to describe functional interactions based on structural anatomical layout.
We show that GNNs are able to capture long-term dependencies in data and also scale up to the analysis of large-scale networks.
- Score: 0.3392372796177108
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A central question in neuroscience is how self-organizing dynamic
interactions in the brain emerge on their relatively static structural
backbone. Due to the complexity of spatial and temporal dependencies between
different brain areas, fully comprehending the interplay between structure and
function is still challenging and an area of intense research. In this paper we
present a graph neural network (GNN) framework to describe functional
interactions based on the structural anatomical layout. A GNN allows us to
process graph-structured spatio-temporal signals, making it possible to
combine structural information derived from diffusion tensor imaging (DTI) with
temporal neural activity profiles, as observed in functional magnetic
resonance imaging (fMRI). Moreover, dynamic interactions between different
brain regions learned by this data-driven approach can provide a multi-modal
measure of causal connectivity strength. We assess the proposed model's
accuracy by evaluating its ability to replicate empirically observed
neural activation profiles, and compare its performance to that of a vector
autoregression (VAR) model, as typically used in Granger causality analysis. We show that
GNNs are able to capture long-term dependencies in data and also
computationally scale up to the analysis of large-scale networks. Finally we
confirm that features learned by a GNN can generalize across MRI scanner types
and acquisition protocols, by demonstrating that the performance on small
datasets can be improved by pre-training the GNN on data from an earlier and
different study. We conclude that the proposed multi-modal GNN framework can
provide a novel perspective on the structure-function relationship in the
brain. This approach is therefore promising for characterizing information flow in brain networks.
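To make the modelling setup concrete, the following sketch illustrates one way such a multi-modal predictor could be assembled: a structural adjacency matrix (e.g., derived from DTI streamline counts) drives a graph convolution over regional fMRI activity, and a recurrent layer aggregates a temporal window to forecast the next sample of every region. This is a minimal illustration under assumed shapes and hyperparameters, not the authors' architecture; the class name, the single graph-convolution/GRU combination, and the synthetic data are placeholders.

```python
# Minimal sketch (not the paper's exact model): a graph convolution over a
# DTI-derived structural adjacency, followed by a GRU per region, trained to
# predict the next fMRI sample of every region from a window of past activity.
import torch
import torch.nn as nn

class StructuralGraphForecaster(nn.Module):
    def __init__(self, adj, hidden=32):
        super().__init__()
        n = adj.shape[0]
        # Symmetrically normalized structural adjacency with self-loops,
        # assumed here to come from DTI streamline counts.
        a = adj + torch.eye(n)
        d = a.sum(dim=1).pow(-0.5)
        self.register_buffer("a_norm", d[:, None] * a * d[None, :])
        self.mix = nn.Linear(1, hidden)       # lift scalar BOLD value to features
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, 1)   # predicted next BOLD value per region

    def forward(self, x):
        # x: (batch, time, regions) window of BOLD activity
        b, t, n = x.shape
        h = self.mix(x.unsqueeze(-1))                       # (b, t, n, hidden)
        h = torch.einsum("ij,btjh->btih", self.a_norm, h)   # graph convolution step
        h = h.permute(0, 2, 1, 3).reshape(b * n, t, -1)     # one sequence per region
        _, last = self.gru(h)                               # temporal aggregation
        return self.readout(last[-1]).view(b, n)            # next-step prediction

# Toy usage with a random "DTI" adjacency and synthetic "fMRI" windows.
torch.manual_seed(0)
n_regions, window = 10, 20
adj = torch.rand(n_regions, n_regions)
adj = (adj + adj.T) / 2
model = StructuralGraphForecaster(adj)
x = torch.randn(8, window, n_regions)    # batch of activity windows
target = torch.randn(8, n_regions)       # true next-step activity (synthetic here)
loss = nn.functional.mse_loss(model(x), target)
loss.backward()
print(float(loss))
```

For comparison, a VAR baseline of order p in this setting reduces to a single linear map from the stacked window to the next sample (one coefficient matrix per lag), so the evaluation in the paper essentially asks what the nonlinear, structure-aware model adds over that linear benchmark.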
Related papers
- Interpretable Spatio-Temporal Embedding for Brain Structural-Effective Network with Ordinary Differential Equation [56.34634121544929]
In this study, we first construct the brain effective network via a dynamic causal model.
We then introduce an interpretable graph learning framework termed Spatio-Temporal Embedding ODE (STE-ODE).
This framework incorporates specifically designed directed node embedding layers, aiming at capturing the dynamic interplay between structural and effective networks.
arXiv Detail & Related papers (2024-05-21T20:37:07Z) - JGAT: a joint spatio-temporal graph attention model for brain decoding [8.844033583141039]
Joint kernel Graph Attention Network (JGAT) is a new multi-modal temporal graph attention network framework.
It integrates data from functional Magnetic Resonance Imaging (fMRI) and Diffusion Weighted Imaging (DWI) while preserving the dynamic information.
We conduct brain-decoding tasks with our JGAT on four independent datasets.
arXiv Detail & Related papers (2023-06-03T02:45:03Z) - Transferability of coVariance Neural Networks and Application to
Interpretable Brain Age Prediction using Anatomical Features [119.45320143101381]
Graph convolutional networks (GCN) leverage topology-driven graph convolutional operations to combine information across the graph for inference tasks.
We have studied GCNs with covariance matrices as graphs in the form of coVariance neural networks (VNNs).
VNNs inherit the scale-free data processing architecture from GCNs and here, we show that VNNs exhibit transferability of performance over datasets whose covariance matrices converge to a limit object.
arXiv Detail & Related papers (2023-05-02T22:15:54Z) - DBGDGM: Dynamic Brain Graph Deep Generative Model [63.23390833353625]
Graphs are a natural representation of brain activity derived from functional magnetic resonance imaging (fMRI) data.
It is well known that clusters of anatomical brain regions, known as functional connectivity networks (FCNs), encode temporal relationships which can serve as useful biomarkers for understanding brain function and dysfunction.
Previous works, however, ignore the temporal dynamics of the brain and focus on static graphs.
We propose a dynamic brain graph deep generative model (DBGDGM) which simultaneously clusters brain regions into temporally evolving communities and learns dynamic unsupervised node embeddings.
arXiv Detail & Related papers (2023-01-26T20:45:30Z) - DynDepNet: Learning Time-Varying Dependency Structures from fMRI Data
via Dynamic Graph Structure Learning [58.94034282469377]
We propose DynDepNet, a novel method for learning the optimal time-varying dependency structure of fMRI data induced by downstream prediction tasks.
Experiments on real-world fMRI datasets, for the task of sex classification, demonstrate that DynDepNet achieves state-of-the-art results.
arXiv Detail & Related papers (2022-09-27T16:32:11Z) - Functional2Structural: Cross-Modality Brain Networks Representation
Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
arXiv Detail & Related papers (2022-05-06T03:45:36Z) - Modeling Spatio-Temporal Dynamics in Brain Networks: A Comparison of
Graph Neural Network Architectures [0.5033155053523041]
Graph neural networks (GNNs) provide a way to interpret new structured graph signals.
We show that by learning localized functional interactions on the substrate, GNN based approaches are able to robustly scale to large network studies.
arXiv Detail & Related papers (2021-12-08T12:57:13Z) - Dynamic Adaptive Spatio-temporal Graph Convolution for fMRI Modelling [0.0]
We propose a dynamic adaptive spatio-temporal graph convolution (DASTGCN) model to overcome the shortcomings of pre-defined static correlation-based graph structures.
The proposed approach allows end-to-end inference of dynamic connections between brain regions via a layer-wise graph structure learning module.
We evaluate our pipeline on the UKBiobank for age and gender classification tasks from resting-state functional scans.
arXiv Detail & Related papers (2021-09-26T07:19:47Z) - Learning Dynamic Graph Representation of Brain Connectome with
Spatio-Temporal Attention [33.049423523704824]
We propose STAGIN, a method for learning a dynamic graph representation of the brain connectome with spatio-temporal attention.
Experiments on the HCP-Rest and the HCP-Task datasets demonstrate exceptional performance of our proposed method.
arXiv Detail & Related papers (2021-05-27T23:06:50Z) - Spatio-Temporal Graph Convolution for Resting-State fMRI Analysis [11.85489505372321]
We train a spatio-temporal graph convolutional network (ST-GCN) on short sub-sequences of the BOLD time series to model the non-stationary nature of functional connectivity.
ST-GCN is significantly more accurate than common approaches in predicting gender and age based on BOLD signals.
arXiv Detail & Related papers (2020-03-24T01:56:50Z) - Understanding Graph Isomorphism Network for rs-fMRI Functional
Connectivity Analysis [49.05541693243502]
We develop a framework for analyzing fMRI data using the Graph Isomorphism Network (GIN).
One of the important contributions of this paper is the observation that the GIN is a dual representation of a convolutional neural network (CNN) in graph space.
We exploit CNN-based saliency map techniques for the GNN, which we tailor to the proposed GIN with one-hot encoding.
arXiv Detail & Related papers (2020-01-10T23:40:09Z)