Reconstructing high-order sequence features of dynamic functional
connectivity networks based on diversified covert attention patterns for
Alzheimer's disease classification
- URL: http://arxiv.org/abs/2211.11750v2
- Date: Mon, 4 Sep 2023 12:05:36 GMT
- Title: Reconstructing high-order sequence features of dynamic functional
connectivity networks based on diversified covert attention patterns for
Alzheimer's disease classification
- Authors: Zhixiang Zhang, Biao Jie, Zhengdong Wang, Jie Zhou, Yang Yang
- Abstract summary: We introduce the self-attention mechanism, a core module of Transformers, to model diversified covert attention patterns and apply these patterns to reconstruct high-order sequence features of dFCNs.
We propose a CRN method based on diversified covert attention patterns, DCA-CRN, which combines the advantages of CRNs in capturing local spatio-temporal features and sequence change patterns with those of Transformers in learning global and high-order correlation features.
- Score: 22.57052592437276
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent studies have applied deep learning methods such as convolutional
recurrent neural networks (CRNs) and Transformers to the classification of brain
diseases, such as Alzheimer's disease (AD), based on dynamic functional
connectivity networks (dFCNs), achieving better performance than traditional
machine learning methods. However, in CRNs, the continuous convolution
operations used to obtain high-order aggregation features may overlook the
non-linear correlation between different brain regions due to the essence of
convolution being the linear weighted sum of local elements. Inspired by modern
neuroscience research on covert attention in the nervous system, we
introduce the self-attention mechanism, a core module of Transformers, to model
diversified covert attention patterns and apply these patterns to reconstruct
high-order sequence features of dFCNs in order to learn complex dynamic changes
in brain information flow. Therefore, we propose a novel CRN method based on
diversified covert attention patterns, DCA-CRN, which combines the advantages
of CRNs in capturing local spatio-temporal features and sequence change
patterns, as well as Transformers in learning global and high-order correlation
features. Experimental results on the ADNI and ADHD-200 datasets demonstrate
the prediction performance and generalization ability of our proposed method.
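To make the described architecture concrete, below is a minimal PyTorch sketch of the idea: multi-head self-attention reconstructs high-order, non-linear correlations across the sequence of windowed dFCN matrices, and a recurrent head then models sequence change patterns. This is not the authors' implementation; the GRU standing in for the convolutional recurrent component, all layer sizes, and names such as DCACRNSketch and n_rois are illustrative assumptions.

```python
# Hedged sketch of a DCA-CRN-style model (illustrative, not the paper's code):
# self-attention reconstructs high-order sequence features of dFCNs,
# a recurrent module captures sequence change patterns.
import torch
import torch.nn as nn

class DCACRNSketch(nn.Module):
    def __init__(self, n_rois: int = 90, d_model: int = 128,
                 n_heads: int = 4, n_classes: int = 2):
        super().__init__()
        # Embed each time window's flattened functional connectivity matrix.
        self.embed = nn.Linear(n_rois * n_rois, d_model)
        # Multi-head self-attention: each head learns one "covert attention
        # pattern" over the window sequence (diversified patterns via heads).
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # GRU stand-in for the convolutional recurrent component (assumption).
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.cls = nn.Linear(d_model, n_classes)

    def forward(self, dfcn: torch.Tensor) -> torch.Tensor:
        # dfcn: (batch, windows, n_rois, n_rois) sliding-window FC matrices
        b, t, r, _ = dfcn.shape
        x = self.embed(dfcn.reshape(b, t, r * r))
        # Reconstruct high-order sequence features: attention output mixes
        # windows via learned, non-linear (softmax) weights.
        x, _ = self.attn(x, x, x)
        # Aggregate local temporal dynamics; last hidden state feeds the classifier.
        _, h = self.rnn(x)
        return self.cls(h[-1])

model = DCACRNSketch()
logits = model(torch.randn(8, 30, 90, 90))  # 8 subjects, 30 windows, 90 ROIs
```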
Related papers
- Core-Periphery Principle Guided State Space Model for Functional Connectome Classification [30.044545011553172]
Core-Periphery State-Space Model (CP-SSM) is an innovative framework for functional connectome classification.
CP-SSM leverages Mamba, a selective state-space model with linear complexity, to effectively capture long-range dependencies in functional brain networks.
CP-SSM surpasses Transformer-based models in classification performance while significantly reducing computational complexity.
arXiv Detail & Related papers (2025-03-18T19:03:27Z) - Trainable Adaptive Activation Function Structure (TAAFS) Enhances Neural Network Force Field Performance with Only Dozens of Additional Parameters [0.0]
We introduce the Trainable Adaptive Activation Function Structure (TAAFS), a method that selects distinct mathematical formulations for non-linear activations.
In this study, we integrate TAAFS into a variety of neural network models, resulting in observed accuracy improvements.
arXiv Detail & Related papers (2024-12-19T09:06:39Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Generative forecasting of brain activity enhances Alzheimer's classification and interpretation [16.09844316281377]
Resting-state functional magnetic resonance imaging (rs-fMRI) offers a non-invasive method to monitor neural activity.
Deep learning has shown promise in capturing these representations.
In this study, we focus on time series forecasting of independent component networks derived from rs-fMRI as a form of data augmentation.
arXiv Detail & Related papers (2024-10-30T23:51:31Z) - TAVRNN: Temporal Attention-enhanced Variational Graph RNN Captures Neural Dynamics and Behavior [2.5282283486446757]
We introduce the Temporal Attention-enhanced Variational Graph Recurrent Neural Network (TAVRNN).
TAVRNN captures temporal changes in network structure by modeling sequential snapshots of neuronal activity.
We show that TAVRNN outperforms previous baseline models in classification, clustering tasks and computational efficiency.
arXiv Detail & Related papers (2024-10-01T13:19:51Z) - Neural Dynamics Model of Visual Decision-Making: Learning from Human Experts [28.340344705437758]
We implement a comprehensive visual decision-making model that spans from visual input to behavioral output.
Our model aligns closely with human behavior and reflects neural activities in primates.
A neuroimaging-informed fine-tuning approach was introduced and applied to the model, leading to performance improvements.
arXiv Detail & Related papers (2024-09-04T02:38:52Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture compatible and scalable with deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - An Association Test Based on Kernel-Based Neural Networks for Complex
Genetic Association Analysis [0.8221435109014762]
We develop a kernel-based neural network model (KNN) that synergizes the strengths of linear mixed models with conventional neural networks.
We propose a MINQUE-based test to assess the joint association of genetic variants with the phenotype.
We also propose two additional tests to evaluate and interpret linear and non-linear/non-additive genetic effects.
arXiv Detail & Related papers (2023-12-06T05:02:28Z) - Neural oscillators for magnetic hysteresis modeling [0.7444373636055321]
Hysteresis is a ubiquitous phenomenon in science and engineering.
We develop an ordinary differential equation-based recurrent neural network (RNN) approach to model and quantify the phenomenon.
arXiv Detail & Related papers (2023-08-23T08:41:24Z) - Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural
Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Provably Efficient Neural Estimation of Structural Equation Model: An
Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z) - Recurrent Neural Network Learning of Performance and Intrinsic
Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.