Brain Network Transformer
- URL: http://arxiv.org/abs/2210.06681v2
- Date: Sat, 15 Oct 2022 19:46:45 GMT
- Title: Brain Network Transformer
- Authors: Xuan Kan, Wei Dai, Hejie Cui, Zilong Zhang, Ying Guo, Carl Yang
- Abstract summary: We study Transformer-based models for brain network analysis.
Driven by the unique properties of the data, we model brain networks as graphs with a fixed number and ordering of nodes.
We re-standardize the evaluation pipeline on ABIDE, the only publicly available large-scale brain network dataset.
- Score: 13.239896897835191
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human brains are commonly modeled as networks of Regions of Interest (ROIs)
and their connections for the understanding of brain functions and mental
disorders. Recently, Transformer-based models have been studied over different
types of data, including graphs, and have been shown to bring broad performance gains. In
this work, we study Transformer-based models for brain network analysis. Driven
by the unique properties of the data, we model brain networks as graphs with a
fixed number and ordering of nodes, which allows us to (1) use connection profiles as node
features to provide natural and low-cost positional information and (2) learn
pair-wise connection strengths among ROIs with efficient attention weights
across individuals that are predictive for downstream analysis tasks.
Moreover, we propose an Orthonormal Clustering Readout operation based on
self-supervised soft clustering and orthonormal projection. This design
accounts for the underlying functional modules that determine similar behaviors
among groups of ROIs, leading to distinguishable cluster-aware node embeddings
and informative graph embeddings. Finally, we re-standardize the evaluation
pipeline on ABIDE, the only publicly available large-scale brain network
dataset, to enable meaningful comparison of different models. Experimental
results show clear improvements of our proposed Brain Network Transformer on
both the public ABIDE and our restricted ABCD datasets. The implementation is
available at https://github.com/Wayfear/BrainNetworkTransformer.
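To make the two design ideas concrete, here is a minimal PyTorch sketch of a transformer that (1) consumes each ROI's connection profile (a row of the connectivity matrix) as its node feature, so no extra positional encoding is needed, and (2) pools node embeddings with a soft clustering readout whose centers are initialized orthonormally. This is an illustrative sketch, not the authors' implementation (see the linked repository for that); the ROI count, embedding sizes, and classifier head are assumptions.

```python
import torch
import torch.nn as nn

class OrthonormalClusteringReadout(nn.Module):
    """Soft-assigns node embeddings to K clusters whose centers are
    initialized orthonormally, then pools per cluster. A simplified
    sketch of the readout idea, not the paper's exact recipe."""
    def __init__(self, dim, num_clusters):
        super().__init__()
        # QR decomposition yields orthonormal columns; use them as centers.
        q, _ = torch.linalg.qr(torch.randn(dim, num_clusters))
        self.centers = nn.Parameter(q.t())                     # (K, dim)

    def forward(self, h):                                      # h: (B, N, dim)
        assign = torch.softmax(h @ self.centers.t(), dim=-1)   # (B, N, K)
        pooled = assign.transpose(1, 2) @ h                    # (B, K, dim)
        return pooled.flatten(1)                               # (B, K*dim)

class BrainNetworkTransformerSketch(nn.Module):
    def __init__(self, num_rois=200, dim=128, num_clusters=10, num_classes=2):
        super().__init__()
        # Each ROI's connection profile (a row of the N x N connectivity
        # matrix) is the node feature; its position in the fixed node
        # order is implicit, so no positional encoding is added.
        self.embed = nn.Linear(num_rois, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.readout = OrthonormalClusteringReadout(dim, num_clusters)
        self.classifier = nn.Linear(num_clusters * dim, num_classes)

    def forward(self, conn):                            # conn: (B, N, N)
        h = self.encoder(self.embed(conn))              # attention over ROI pairs
        return self.classifier(self.readout(h))

model = BrainNetworkTransformerSketch()
print(model(torch.randn(4, 200, 200)).shape)            # torch.Size([4, 2])
```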
Related papers
- Cognitive Networks and Performance Drive fMRI-Based State Classification Using DNN Models [0.0]
We employ two structurally different and complementary DNN-based models to classify individual cognitive states.
We show that despite the architectural differences, both models consistently produce a robust relationship between prediction accuracy and individual cognitive performance.
arXiv Detail & Related papers (2024-08-14T15:25:51Z)
- Predicting Infant Brain Connectivity with Federated Multi-Trajectory GNNs using Scarce Data [54.55126643084341]
Existing deep learning solutions suffer from three major limitations.
We introduce FedGmTE-Net++, a federated graph-based multi-trajectory evolution network.
Using the power of federation, we aggregate local learning across diverse hospitals with limited datasets.
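As a rough illustration of the federation step, the sketch below performs a FedAvg-style weighted parameter average across hospitals; this is a generic assumption about the aggregation rule, not FedGmTE-Net++'s exact protocol.

```python
import numpy as np

def federated_average(local_weights, sample_counts):
    """FedAvg-style aggregation: average each hospital's locally trained
    parameters, weighted by its number of training samples. A generic
    federation sketch, not FedGmTE-Net++'s exact protocol."""
    total = sum(sample_counts)
    return [
        sum(w[i] * (n / total) for w, n in zip(local_weights, sample_counts))
        for i in range(len(local_weights[0]))
    ]

# Three hospitals with scarce, differently sized datasets share only weights.
hospitals = [[np.random.randn(4, 4), np.random.randn(4)] for _ in range(3)]
global_weights = federated_average(hospitals, sample_counts=[20, 35, 15])
print([w.shape for w in global_weights])                # [(4, 4), (4,)]
```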
arXiv Detail & Related papers (2024-01-01T10:20:01Z)
- PTGB: Pre-Train Graph Neural Networks for Brain Network Analysis [39.16619345610152]
We propose PTGB, a GNN pre-training framework that captures intrinsic brain network structures, regardless of clinical outcomes, and is easily adaptable to various downstream tasks.
PTGB comprises two key components: (1) an unsupervised pre-training technique designed specifically for brain networks, which enables learning from large-scale datasets without task-specific labels; (2) a data-driven parcellation atlas mapping pipeline that facilitates knowledge transfer across datasets with different ROI systems.
arXiv Detail & Related papers (2023-05-20T21:07:47Z)
- Multi-Head Graph Convolutional Network for Structural Connectome Classification [8.658134276685404]
We propose a machine-learning model inspired by graph convolutional networks (GCNs).
The proposed network has a simple design that employs separate heads of graph convolutions focused on edges and nodes.
To test the ability of our model to extract complementary and representative features from brain connectivity data, we chose the task of sex classification.
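A hedged PyTorch sketch of the two-head idea: one head runs a basic (unnormalized) graph convolution over node features, the other reads edge weights directly, and their concatenation feeds a classifier. The layer shapes and head designs are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class TwoHeadGCNSketch(nn.Module):
    """Two heads over a structural connectome: a node head that propagates
    node features along the adjacency (a basic, unnormalized graph
    convolution) and an edge head that reads edge weights directly.
    Layer shapes and head designs are illustrative assumptions."""
    def __init__(self, num_nodes, dim=32, num_classes=2):
        super().__init__()
        self.node_head = nn.Linear(num_nodes, dim)      # acts on A @ X
        self.edge_head = nn.Linear(num_nodes, dim)      # acts on rows of A
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, adj, x):          # adj: (B, N, N), x: (B, N, N)
        node_feat = torch.relu(self.node_head(adj @ x)).mean(dim=1)
        edge_feat = torch.relu(self.edge_head(adj)).mean(dim=1)
        return self.classifier(torch.cat([node_feat, edge_feat], dim=-1))

model = TwoHeadGCNSketch(num_nodes=90)
conn = torch.randn(2, 90, 90)           # 2 subjects, 90 regions
print(model(conn, conn).shape)          # torch.Size([2, 2]), e.g. sex logits
```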
arXiv Detail & Related papers (2023-05-02T15:04:30Z)
- Functional Neural Networks: Shift invariant models for functional data with applications to EEG classification [0.0]
We introduce a new class of neural networks that are shift invariant and preserve the smoothness of the data: functional neural networks (FNNs).
For this, we use methods from functional data analysis (FDA) to extend multi-layer perceptrons and convolutional neural networks to functional data.
We show that the models outperform a benchmark model from FDA in terms of accuracy and successfully use FNNs to classify electroencephalography (EEG) data.
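One way to picture the FDA-style first layer: project each discretely sampled signal onto a smooth basis by numerical integration, so later layers operate on basis coefficients rather than raw samples. The sine basis and grid below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def functional_layer(signals, t, num_basis=8):
    """Project discretely observed signals onto a smooth basis by numerical
    integration, so downstream layers see basis coefficients rather than
    raw samples. The sine basis is an illustrative assumption."""
    basis = np.stack([np.sin((k + 1) * np.pi * t) for k in range(num_basis)])
    dt = t[1] - t[0]
    # Riemann-sum approximation of the inner product <signal, basis_k>.
    return (signals[:, None, :] * basis[None, :, :]).sum(axis=-1) * dt

t = np.linspace(0.0, 1.0, 256)          # common sampling grid
eeg = np.random.randn(10, 256)          # 10 single-channel EEG trials
print(functional_layer(eeg, t).shape)   # (10, 8) smooth coefficient features
```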
arXiv Detail & Related papers (2023-01-14T09:41:21Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Simplifying approach to Node Classification in Graph Neural Networks [7.057970273958933]
We decouple the node feature aggregation step from the depth of the graph neural network and empirically analyze how different aggregated features contribute to prediction performance.
We show that not all features generated via aggregation steps are useful, and that using these less informative features can be detrimental to the performance of the GNN model.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN), and show empirically that the proposed model achieves comparable or even higher accuracy than state-of-the-art GNN models.
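A minimal sketch of the decoupling idea: aggregated features for every hop are precomputed by repeated propagation (adding no layers), and a learned softmax score can down-weight uninformative hops. The combination rule is a simplified assumption, not FSGNN's exact design.

```python
import torch
import torch.nn as nn

class FSGNNSketch(nn.Module):
    """Aggregation decoupled from depth: hop features come from repeated
    propagation (no extra layers), and a learned softmax score can
    down-weight uninformative hops. The combination rule is a simplified
    assumption, not FSGNN's exact design."""
    def __init__(self, in_dim, hidden, num_classes, num_hops=3):
        super().__init__()
        self.proj = nn.ModuleList(
            nn.Linear(in_dim, hidden) for _ in range(num_hops + 1))
        self.score = nn.Parameter(torch.zeros(num_hops + 1))  # per-hop weight
        self.out = nn.Linear(hidden, num_classes)

    def forward(self, x, adj):          # x: (N, in_dim), adj: (N, N)
        feats, h = [], x
        for proj in self.proj:          # hop-0 ... hop-K features
            feats.append(proj(h))
            h = adj @ h                 # one more propagation, no new layer
        w = torch.softmax(self.score, dim=0)
        return self.out(sum(wi * f for wi, f in zip(w, feats)))

model = FSGNNSketch(in_dim=16, hidden=32, num_classes=4)
print(model(torch.randn(50, 16), torch.rand(50, 50)).shape)  # (50, 4)
```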
arXiv Detail & Related papers (2021-11-12T14:53:22Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which shows that our model can effectively improve the performance for semi-supervised node classification on graphs.
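To illustrate the output-output modelling, the sketch below treats backbone logits as unary factors and runs a mean-field-style refinement in which neighbors pull each other's label distributions together; the update rule is a generic assumption, not EPFGNN's exact inference.

```python
import numpy as np

def pairwise_refine(unary_logits, adj, coupling=0.5, steps=10):
    """Mean-field-style refinement: backbone logits act as unary factors,
    and a pairwise term nudges neighboring nodes toward agreeing labels.
    The update rule is a generic assumption, not EPFGNN's inference."""
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    q = softmax(unary_logits)                    # initial beliefs from GNN
    for _ in range(steps):
        q = softmax(unary_logits + coupling * adj @ q)
    return q

adj = (np.random.rand(30, 30) > 0.8).astype(float)   # toy graph
logits = np.random.randn(30, 3)                      # GNN backbone output
print(pairwise_refine(logits, adj).sum(axis=1))      # rows sum to 1
```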
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Pre-Trained Models for Heterogeneous Information Networks [57.78194356302626]
We propose a self-supervised pre-training and fine-tuning framework, PF-HIN, to capture the features of a heterogeneous information network.
PF-HIN consistently and significantly outperforms state-of-the-art alternatives on each of these tasks, on four datasets.
arXiv Detail & Related papers (2020-07-07T03:36:28Z)
- The Heterogeneity Hypothesis: Finding Layer-Wise Differentiated Network Architectures [179.66117325866585]
We investigate a design space that is usually overlooked, i.e. adjusting the channel configurations of predefined networks.
We find that this adjustment can be achieved by shrinking widened baseline networks and leads to superior performance.
Experiments are conducted on various networks and datasets for image classification, visual tracking and image restoration.
arXiv Detail & Related papers (2020-06-29T17:59:26Z)
- Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
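A toy version of layer-wise fusion: align the second model's neurons to the first's with a hard optimal assignment on weight similarity, then average the aligned layers. Real optimal-transport fusion uses soft transport plans; the hard matching here is a simplifying assumption.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def fuse_layer(w_a, w_b):
    """Match model B's neurons to model A's with a hard optimal assignment
    on weight similarity, then average the aligned layers. Real OT fusion
    uses soft transport plans; hard matching is a simplifying assumption."""
    cost = -w_a @ w_b.T                          # negative similarity as cost
    _, cols = linear_sum_assignment(cost)        # permutation of B's neurons
    return 0.5 * (w_a + w_b[cols])

# Two layers (neurons x inputs) from models trained on non-i.i.d. data.
layer_a, layer_b = np.random.randn(16, 8), np.random.randn(16, 8)
print(fuse_layer(layer_a, layer_b).shape)        # (16, 8) fused layer
```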
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.