Self-Clustering Graph Transformer Approach to Model Resting-State Functional Brain Activity
- URL: http://arxiv.org/abs/2501.16345v2
- Date: Fri, 07 Feb 2025 08:57:37 GMT
- Title: Self-Clustering Graph Transformer Approach to Model Resting-State Functional Brain Activity
- Authors: Bishal Thapaliya, Esra Akbas, Ram Sapkota, Bhaskar Ray, Vince Calhoun, Jingyu Liu
- Abstract summary: Self-Clustering Graph Transformer (SCGT) is designed to handle the issue of uniform node updates in graph transformers.
SCGT effectively captures the sub-network structure of the brain by performing cluster-specific updates to the nodes.
We validate our approach on the Adolescent Brain Cognitive Development dataset, comprising 7,957 participants.
- Score: 1.382819379097036
- Abstract: Resting-state functional magnetic resonance imaging (rs-fMRI) offers valuable insights into the human brain's functional organization and is a powerful tool for investigating the relationship between brain function and cognitive processes, as it captures the brain's functional organization without relying on a specific task or stimulus. In this study, we introduce a novel attention mechanism for graphs with subnetworks, named Self-Clustering Graph Transformer (SCGT), designed to address the issue of uniform node updates in graph transformers. Using static functional connectivity (FC) correlation features as input, SCGT captures the sub-network structure of the brain by performing cluster-specific node updates, unlike the uniform updates of vanilla graph transformers, which further allows us to learn and interpret the subclusters. We validate our approach on the Adolescent Brain Cognitive Development (ABCD) dataset, comprising 7,957 participants, for the prediction of total cognitive score and for gender classification. Our results demonstrate that SCGT outperforms the vanilla graph transformer and other recent models, offering a promising tool for modeling brain functional connectivity and interpreting the underlying subnetwork structures.
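To make the cluster-specific update concrete, here is a minimal PyTorch sketch of the idea, assuming a fixed hard assignment of ROIs to subnetworks; the class name `ClusterSpecificAttention` and the per-cluster projection layout are our illustration, not the released SCGT code, which learns the clustering itself.

```python
# Minimal sketch: cluster-specific node updates in a graph transformer.
# Assumes a hard assignment of ROIs to k subnetworks; the real SCGT learns
# the clustering, so this is illustrative only.
import torch
import torch.nn as nn

class ClusterSpecificAttention(nn.Module):
    def __init__(self, dim: int, num_clusters: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        # One update projection per cluster instead of a single shared one.
        self.cluster_proj = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(num_clusters)]
        )

    def forward(self, x: torch.Tensor, assign: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, dim); assign: (nodes,) cluster id per node
        h, _ = self.attn(x, x, x)              # global self-attention
        out = torch.zeros_like(h)
        for c, proj in enumerate(self.cluster_proj):
            mask = assign == c                 # nodes in cluster c
            out[:, mask] = proj(h[:, mask])    # cluster-specific update
        return out

# Toy usage: 100 ROIs, static FC row as node feature (dim 100), 7 subnetworks.
x = torch.randn(2, 100, 100)
assign = torch.randint(0, 7, (100,))
layer = ClusterSpecificAttention(dim=100, num_clusters=7)
print(layer(x, assign).shape)  # torch.Size([2, 100, 100])
```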
Related papers
- Cell Graph Transformer for Nuclei Classification [78.47566396839628]
We develop a cell graph transformer (CGT) that treats nodes and edges as input tokens to enable learnable adjacency and information exchange among all nodes.
Poor features can lead to noisy self-attention scores and inferior convergence.
We propose a novel topology-aware pretraining method that leverages a graph convolutional network (GCN) to learn a feature extractor.
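As a rough illustration of treating nodes and edges as input tokens, the following sketch concatenates node and edge features into one token sequence for a standard transformer encoder; the token construction and shapes are assumptions, not the authors' exact recipe.

```python
# Illustrative sketch: node and edge features become one token sequence, so
# attention can exchange information among all of them. Not the CGT itself.
import torch
import torch.nn as nn

num_nodes, num_edges, dim = 50, 120, 64
node_tokens = torch.randn(1, num_nodes, dim)   # e.g. nucleus appearance features
edge_tokens = torch.randn(1, num_edges, dim)   # e.g. features of cell-cell edges

tokens = torch.cat([node_tokens, edge_tokens], dim=1)  # (1, N+E, dim)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
    num_layers=2,
)
out = encoder(tokens)            # attention mixes node and edge tokens freely
node_out = out[:, :num_nodes]    # read node tokens back out for classification
print(node_out.shape)            # torch.Size([1, 50, 64])
```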
arXiv Detail & Related papers (2024-02-20T12:01:30Z)
- Large-scale Graph Representation Learning of Dynamic Brain Connectome with Transformers [18.304946718572516]
We propose a method for learning the representation of dynamic functional connectivity with Graph Transformers.
Specifically, we define the connectome embedding, which holds the position, structure, and time information of the functional connectivity graph.
We perform experiments with over 50,000 resting-state fMRI samples obtained from three datasets.
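A loose sketch of such a connectome embedding, assuming sliding-window correlation for the dynamic FC and a simple additive composition of structure, position, and time terms (our reading of the abstract, not the paper's exact formulation):

```python
# Rough sketch: token for ROI r in window t = structure (its FC row)
# + position (learned per-ROI vector) + time (learned per-window vector).
import torch
import torch.nn as nn

num_rois, T, win = 30, 200, 50
ts = torch.randn(num_rois, T)                      # ROI time series

# Sliding-window dynamic FC: one correlation matrix per window.
windows = [ts[:, s:s + win] for s in range(0, T - win + 1, win)]
fc = torch.stack([torch.corrcoef(w) for w in windows])   # (num_win, R, R)

dim = num_rois                                     # FC row doubles as feature
pos_emb = nn.Parameter(torch.randn(num_rois, dim))        # "position": which ROI
time_emb = nn.Parameter(torch.randn(len(windows), dim))   # "time": which window

tokens = fc + pos_emb.unsqueeze(0) + time_emb[:, None, :]
print(tokens.shape)  # (num_windows, num_rois, num_rois)
```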
arXiv Detail & Related papers (2023-12-04T16:08:44Z)
- Language Knowledge-Assisted Representation Learning for Skeleton-Based Action Recognition [71.35205097460124]
How humans understand and recognize the actions of others is a complex neuroscientific problem.
We propose LA-GCN, a graph convolutional network assisted by knowledge from large-scale language models (LLMs).
arXiv Detail & Related papers (2023-05-21T08:29:16Z)
- DBGDGM: Dynamic Brain Graph Deep Generative Model [63.23390833353625]
Graphs are a natural representation of brain activity derived from functional magnetic resonance imaging (fMRI) data.
It is well known that clusters of anatomical brain regions, known as functional connectivity networks (FCNs), encode temporal relationships which can serve as useful biomarkers for understanding brain function and dysfunction.
Previous works, however, ignore the temporal dynamics of the brain and focus on static graphs.
We propose a dynamic brain graph deep generative model (DBGDGM) which simultaneously clusters brain regions into temporally evolving communities and learns dynamic unsupervised node embeddings.
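A toy sketch of the core idea, temporally evolving node embeddings that induce soft community memberships; the GRU-plus-softmax parameterization below is our stand-in for the paper's generative model, not DBGDGM itself.

```python
# Toy sketch: node embeddings evolve over time; a linear head maps each
# embedding to soft, temporally evolving community memberships.
import torch
import torch.nn as nn

num_nodes, dim, num_comms, steps = 40, 32, 5, 10
gru = nn.GRUCell(dim, dim)                  # evolves each node's embedding
to_comm = nn.Linear(dim, num_comms)         # embedding -> community logits

z = torch.randn(num_nodes, dim)             # initial node embeddings
memberships = []
for t in range(steps):
    drive = torch.randn(num_nodes, dim)     # stand-in for per-step graph input
    z = gru(drive, z)                       # temporal update of embeddings
    memberships.append(torch.softmax(to_comm(z), dim=-1))

pi = torch.stack(memberships)               # (steps, nodes, communities)
print(pi.shape, pi[0].sum(-1)[:3])          # membership rows sum to 1
```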
arXiv Detail & Related papers (2023-01-26T20:45:30Z)
- Brain Network Transformer [13.239896897835191]
We study Transformer-based models for brain network analysis.
Driven by the unique properties of the data, we model brain networks as graphs with a fixed number and ordering of nodes.
We re-standardize the evaluation pipeline on the only publicly available large-scale brain network dataset, ABIDE.
arXiv Detail & Related papers (2022-10-13T02:30:06Z)
- Functional2Structural: Cross-Modality Brain Networks Representation Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
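A minimal sketch of sign-aware message passing, written from the abstract alone: positive and negative FC edges are aggregated through separate linear maps. This simplifies whatever the actual signed graph encoder does.

```python
# Minimal sign-aware message passing: split the signed FC matrix into its
# positive part and the magnitude of its negative part, and aggregate each
# through its own linear map.
import torch
import torch.nn as nn

class SignedGraphLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.w_pos = nn.Linear(dim, dim)
        self.w_neg = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (nodes, dim); adj: signed correlation matrix (nodes, nodes)
        a_pos = adj.clamp(min=0.0)           # keep positive correlations
        a_neg = (-adj).clamp(min=0.0)        # magnitude of negative ones
        msg = self.w_pos(a_pos @ x) + self.w_neg(a_neg @ x)
        return torch.relu(msg)

x = torch.randn(30, 16)
adj = torch.randn(30, 30)                    # stand-in for a signed FC matrix
print(SignedGraphLayer(16)(x, adj).shape)    # torch.Size([30, 16])
```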
arXiv Detail & Related papers (2022-05-06T03:45:36Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
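As a worked toy example of cross-frequency coupling, the snippet below builds a gamma-band carrier whose amplitude follows the phase of a slower theta rhythm (phase-amplitude coupling); the frequencies and modulation depth are arbitrary choices, not values from the paper.

```python
# Toy phase-amplitude coupling: a 40 Hz gamma carrier whose amplitude is
# modulated by the phase of a 6 Hz theta rhythm.
import numpy as np

fs = 1000.0                       # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)   # 2 s of signal
theta = np.sin(2 * np.pi * 6 * t)              # 6 Hz theta phase signal
envelope = 0.5 * (1 + theta)                   # amplitude follows theta phase
gamma = envelope * np.sin(2 * np.pi * 40 * t)  # theta-modulated 40 Hz gamma
signal = theta + gamma
print(signal.shape)  # (2000,)
```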
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Learning to Model the Relationship Between Brain Structural and Functional Connectomes [16.096428756895918]
We develop a graph representation learning framework to model the relationship between brain structural connectivity (SC) and functional connectivity (FC).
A trainable graph convolutional encoder captures interactions between brain regions-of-interest that mimic actual neural communications.
Experiments demonstrate that the learnt representations capture valuable information from the intrinsic properties of the subject's brain networks.
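A hedged sketch of the SC-to-FC direction, with a plain two-layer GCN standing in for the paper's trainable encoder: embeddings are computed over the structural adjacency and their inner products are read out as predicted FC.

```python
# Sketch: GCN over the structural adjacency -> node embeddings -> predicted
# FC as pairwise inner products. A generic stand-in, not the paper's model.
import torch
import torch.nn as nn

def normalize(adj: torch.Tensor) -> torch.Tensor:
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}
    a = adj + torch.eye(adj.size(0))
    d = a.sum(1).rsqrt()
    return d[:, None] * a * d[None, :]

class SC2FC(nn.Module):
    def __init__(self, in_dim: int, hid: int):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid)
        self.w2 = nn.Linear(hid, hid)

    def forward(self, x: torch.Tensor, sc: torch.Tensor) -> torch.Tensor:
        a = normalize(sc)
        z = torch.relu(self.w1(a @ x))      # GCN layer 1
        z = self.w2(a @ z)                  # GCN layer 2
        return torch.tanh(z @ z.T)          # predicted FC in [-1, 1]

sc = torch.rand(30, 30); sc = (sc + sc.T) / 2   # toy symmetric SC matrix
x = torch.eye(30)                                # identity node features
print(SC2FC(30, 16)(x, sc).shape)                # torch.Size([30, 30])
```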
arXiv Detail & Related papers (2021-12-18T11:23:55Z)
- Dynamic Adaptive Spatio-temporal Graph Convolution for fMRI Modelling [0.0]
We propose a dynamic adaptive spatio-temporal graph convolution (DASTGCN) model to overcome the shortcomings of pre-defined static correlation-based graph structures.
The proposed approach allows end-to-end inference of dynamic connections between brain regions via a layer-wise graph structure learning module.
We evaluate our pipeline on the UK Biobank for age and gender classification tasks from resting-state functional scans.
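A minimal sketch of layer-wise graph structure learning, where each layer infers its own adjacency from the current node features instead of a pre-defined correlation graph; the similarity-plus-softmax rule below is our assumption.

```python
# Sketch: each layer learns its own adjacency from the current node
# features, then convolves over that inferred graph.
import torch
import torch.nn as nn

class AdaptiveGraphConv(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (nodes, dim). Infer this layer's adjacency from x itself.
        scores = self.q(x) @ self.k(x).T / x.size(-1) ** 0.5
        adj = torch.softmax(scores, dim=-1)   # inferred, layer-specific graph
        return torch.relu(self.out(adj @ x))  # convolve over the learned graph

x = torch.randn(50, 32)
layer1, layer2 = AdaptiveGraphConv(32), AdaptiveGraphConv(32)
print(layer2(layer1(x)).shape)  # torch.Size([50, 32]); each layer re-infers adj
```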
arXiv Detail & Related papers (2021-09-26T07:19:47Z)
- Learning Dynamic Graph Representation of Brain Connectome with Spatio-Temporal Attention [33.049423523704824]
We propose STAGIN, a method for learning dynamic graph representation of the brain connectome with spatio-temporal attention.
Experiments on the HCP-Rest and the HCP-Task datasets demonstrate exceptional performance of our proposed method.
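A compressed sketch of the two attention stages, spatial attention over ROIs within each window followed by temporal attention over window summaries; shapes and readout here are ours and differ from STAGIN's actual design.

```python
# Sketch: spatial attention across ROIs per window, then temporal attention
# across window summaries, yielding one graph-level vector.
import torch
import torch.nn as nn

num_win, num_rois, dim = 12, 40, 64
snapshots = torch.randn(num_win, num_rois, dim)   # per-window node features

spatial = nn.MultiheadAttention(dim, 4, batch_first=True)
temporal = nn.MultiheadAttention(dim, 4, batch_first=True)

h, _ = spatial(snapshots, snapshots, snapshots)   # attend across ROIs
summary = h.mean(dim=1, keepdim=True).transpose(0, 1)  # (1, num_win, dim)
z, _ = temporal(summary, summary, summary)        # attend across time
print(z.squeeze(0).mean(0).shape)                 # graph-level vector: (64,)
```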
arXiv Detail & Related papers (2021-05-27T23:06:50Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with dynamically evolving graphs in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
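A minimal contrastive objective in the spirit described (InfoNCE over two views of the same nodes), with the dynamic-graph view construction stubbed out by a simple perturbation; none of this is the paper's exact loss or sampler.

```python
# Minimal InfoNCE: embeddings of two views of the same nodes are pulled
# together; other nodes in the batch act as negatives.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    # z1, z2: (nodes, dim) embeddings of two views of the same nodes
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / tau                 # scaled cosine similarities
    targets = torch.arange(z1.size(0))       # positive pair is the diagonal
    return F.cross_entropy(logits, targets)

z1 = torch.randn(64, 32)                     # view 1 (e.g. one temporal sample)
z2 = z1 + 0.1 * torch.randn(64, 32)          # view 2 (perturbed neighborhood)
print(info_nce(z1, z2).item())               # scalar loss
```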
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.