Disentangling Spatial-Temporal Functional Brain Networks via
Twin-Transformers
- URL: http://arxiv.org/abs/2204.09225v1
- Date: Wed, 20 Apr 2022 04:57:53 GMT
- Title: Disentangling Spatial-Temporal Functional Brain Networks via
Twin-Transformers
- Authors: Xiaowei Yu, Lu Zhang, Lin Zhao, Yanjun Lyu, Tianming Liu, Dajiang Zhu
- Abstract summary: How to identify and characterize functional brain networks (BN) is fundamental to gaining system-level insights into the mechanisms of brain organizational architecture.
We propose a novel Twin-Transformers framework to simultaneously infer common and individual functional networks in both spatial and temporal space.
- Score: 12.137308815848717
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: How to identify and characterize functional brain networks (BN) is
fundamental to gaining system-level insights into the mechanisms of brain
organizational architecture. Current functional magnetic resonance imaging
(fMRI) analysis relies heavily on prior knowledge of specific patterns in
either the spatial (e.g., resting-state networks) or temporal (e.g., task
stimulus) domain. In addition, while most approaches aim to find group-wise
common functional networks, individual-specific functional networks have
rarely been studied. In this work, we propose a novel Twin-Transformers
framework to simultaneously infer common and individual functional networks in
both spatial and temporal space, in a self-supervised manner. The first
transformer takes space-divided information as input and generates spatial
features, while the second transformer takes time-related information as input
and outputs temporal features. The spatial and temporal features are further
separated into common and individual ones via interactions (weight sharing)
and constraints between the two transformers. We applied our Twin-Transformers
to the Human Connectome Project (HCP) motor task-fMRI dataset and identified
multiple common brain networks, including both task-related and resting-state
networks (e.g., the default mode network). Interestingly, we also successfully
recovered a set of individual-specific networks that are not related to task
stimulus and exist only at the individual level.
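The abstract describes factorizing fMRI signals into paired spatial and temporal components, then splitting those into group-common and subject-specific parts. The paper's transformer architecture is not reproduced here; as a toy illustration of that idea only, the following sketch uses per-subject SVD in place of the two transformers (all array names and sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "fMRI" data: subjects x timepoints x brain regions.
n_subj, n_t, n_reg, k = 5, 120, 60, 4
data = rng.standard_normal((n_subj, n_t, n_reg))

# Factor each subject's time-by-region matrix into temporal (U) and
# spatial (Vt) components -- the role the two transformers play in the paper.
temporal, spatial = [], []
for s in range(n_subj):
    U, S, Vt = np.linalg.svd(data[s], full_matrices=False)
    temporal.append(U[:, :k] * S[:k])   # k temporal time courses
    spatial.append(Vt[:k])              # k spatial maps over regions

spatial = np.stack(spatial)             # shape (n_subj, k, n_reg)

# "Common" networks: the group-mean spatial maps; "individual" networks:
# each subject's residual after the common part is removed.
common = spatial.mean(axis=0)           # shape (k, n_reg)
individual = spatial - common           # shape (n_subj, k, n_reg)

print(common.shape, individual.shape)
```

Unlike the self-supervised, weight-sharing scheme in the paper, this sketch ignores component sign/order ambiguity across subjects; it is meant only to make the common-versus-individual decomposition concrete.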
Related papers
- Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
In neuromorphic computing, spiking neural networks (SNNs) perform inference tasks, offering significant efficiency gains for workloads involving sequential data.
Recent advances in hardware and software have demonstrated that embedding a few bits of payload in each spike exchanged between the spiking neurons can further enhance inference accuracy.
This paper investigates a wireless neuromorphic split computing architecture employing multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z)
- Interpretable Spatio-Temporal Embedding for Brain Structural-Effective Network with Ordinary Differential Equation [56.34634121544929]
In this study, we first construct the brain-effective network via the dynamic causal model.
We then introduce an interpretable graph learning framework termed Spatio-Temporal Embedding ODE (STE-ODE).
This framework incorporates specifically designed directed node embedding layers, aiming at capturing the dynamic interplay between structural and effective networks.
arXiv Detail & Related papers (2024-05-21T20:37:07Z)
- Volume-Preserving Transformers for Learning Time Series Data with Structure [0.0]
We develop a transformer-inspired neural network and use it to learn a dynamical system.
We change the activation function of the attention layer to imbue the transformer with structure-preserving properties.
This is shown to be of great advantage when applying the neural network to learning the trajectory of a rigid body.
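The summary above says the attention activation is changed to make the transformer structure-preserving, without naming the construction. One classical way to obtain a volume-preserving (determinant-one, orthogonal) map from raw attention scores — a sketch of the general idea, not necessarily the paper's actual activation — is the Cayley transform of a skew-symmetrized score matrix:

```python
import numpy as np

def cayley_attention(scores):
    # Skew-symmetrize the raw attention scores, then apply the Cayley
    # transform (I - W)^{-1}(I + W). For skew-symmetric W the result is
    # orthogonal with det = +1, so applying it to the value vectors
    # preserves volume.
    W = 0.5 * (scores - scores.T)
    I = np.eye(scores.shape[0])
    return np.linalg.solve(I - W, I + W)

rng = np.random.default_rng(1)
scores = rng.standard_normal((6, 6))
A = cayley_attention(scores)
print(np.linalg.det(A))  # ~1.0: volume-preserving
```

Softmax attention, by contrast, is merely row-stochastic and generally contracts volume, which is why a structure-preserving substitute matters for rigid-body trajectories.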
arXiv Detail & Related papers (2023-12-18T13:09:55Z)
- Seeing double with a multifunctional reservoir computer [0.0]
Multifunctional biological neural networks exploit multistability in order to perform multiple tasks without changing any network properties.
We study how a reservoir computer (RC) reconstructs a coexistence of attractors when there is an overlap between them.
A bifurcation analysis reveals how multifunctionality emerges and is destroyed as the RC enters a chaotic regime.
arXiv Detail & Related papers (2023-05-09T23:10:29Z)
- Deeply-Coupled Convolution-Transformer with Spatial-temporal Complementary Learning for Video-based Person Re-identification [91.56939957189505]
We propose a novel spatial-temporal complementary learning framework named Deeply-Coupled Convolution-Transformer (DCCT) for high-performance video-based person Re-ID.
Our framework attains better performance than most state-of-the-art methods.
arXiv Detail & Related papers (2023-04-27T12:16:44Z)
- An intertwined neural network model for EEG classification in brain-computer interfaces [0.6696153817334769]
The brain-computer interface (BCI) is a nonstimulatory, direct, and occasionally bidirectional communication link between the brain and a computer or an external device.
We present a deep neural network architecture specifically engineered to provide state-of-the-art performance in multiclass motor imagery classification.
arXiv Detail & Related papers (2022-08-04T09:00:34Z)
- Functional2Structural: Cross-Modality Brain Networks Representation Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
arXiv Detail & Related papers (2022-05-06T03:45:36Z)
- Spatio-Temporal Representation Factorization for Video-based Person Re-Identification [55.01276167336187]
We propose a Spatio-Temporal Representation Factorization (STRF) module for re-ID.
STRF is a flexible new computational unit that can be used in conjunction with most existing 3D convolutional neural network architectures for re-ID.
We empirically show that STRF improves performance of various existing baseline architectures while demonstrating new state-of-the-art results.
arXiv Detail & Related papers (2021-07-25T19:29:37Z)
- Slow manifolds in recurrent networks encode working memory efficiently and robustly [0.0]
Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time.
We use a top-down modeling approach to examine network-level mechanisms of working memory.
arXiv Detail & Related papers (2021-01-08T18:47:02Z)
- Co-evolution of Functional Brain Network at Multiple Scales during Early Infancy [52.4179778122852]
This paper leveraged a longitudinal infant resting-state functional magnetic resonance imaging dataset from birth to 2 years of age.
By applying our proposed methodological framework on the collected longitudinal infant dataset, we provided the first evidence that, in the first 2 years of life, the brain functional network is co-evolved at different scales.
arXiv Detail & Related papers (2020-09-15T07:21:04Z)
- Detecting Dynamic Community Structure in Functional Brain Networks Across Individuals: A Multilayer Approach [12.923521418531655]
We present a unified statistical framework for characterizing community structure of brain functional networks.
We propose a multi-subject, Markov-switching block model (MSS-SBM) to identify changes in brain organization over a group of individuals.
arXiv Detail & Related papers (2020-04-09T04:23:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.