A Generative Self-Supervised Framework using Functional Connectivity in
fMRI Data
- URL: http://arxiv.org/abs/2312.01994v1
- Date: Mon, 4 Dec 2023 16:14:43 GMT
- Title: A Generative Self-Supervised Framework using Functional Connectivity in
fMRI Data
- Authors: Jungwon Choi, Seongho Keum, EungGu Yun, Byung-Hoon Kim, Juho Lee
- Abstract summary: Deep neural networks trained on Functional Connectivity (FC) networks extracted from functional Magnetic Resonance Imaging (fMRI) data have gained popularity.
Recent research on the application of Graph Neural Networks (GNNs) to FC suggests that exploiting the time-varying properties of the FC could significantly improve the accuracy and interpretability of the model prediction.
The high cost of acquiring high-quality fMRI data and corresponding labels poses a hurdle to their application in real-world settings.
We propose a generative SSL approach that is tailored to effectively harness spatio-temporal information within dynamic FC.
- Score: 15.211387244155725
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks trained on Functional Connectivity (FC) networks
extracted from functional Magnetic Resonance Imaging (fMRI) data have gained
popularity due to the increasing availability of data and advances in model
architectures, including Graph Neural Network (GNN). Recent research on the
application of GNN to FC suggests that exploiting the time-varying properties
of the FC could significantly improve the accuracy and interpretability of the
model prediction. However, the high cost of acquiring high-quality fMRI data
and corresponding phenotypic labels poses a hurdle to their application in
real-world settings, such that a model naïvely trained in a supervised
fashion can suffer from insufficient performance or a lack of generalization
with a small amount of data. In addition, most Self-Supervised Learning (SSL)
approaches for GNNs to date adopt a contrastive strategy, which tends to lose
appropriate semantic information when the graph structure is perturbed or does
not leverage both spatial and temporal information simultaneously. In light of
these challenges, we propose a generative SSL approach that is tailored to
effectively harness spatio-temporal information within dynamic FC. Our
empirical results on large-scale (>50,000) fMRI datasets
demonstrate that our approach learns valuable representations and enables the
construction of accurate and robust models when fine-tuned for downstream
tasks.
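This listing does not detail the model itself, but the two ingredients named in the abstract, dynamic FC and a generative (reconstruction-style) pretext task, can be illustrated with a short, hypothetical sketch. Sliding-window Pearson correlation is one common way to obtain time-varying FC matrices from ROI time series, and the masked-reconstruction objective below stands in for a generic generative SSL loss; the window length, mask ratio, and the encoder/decoder modules are illustrative placeholders, not the authors' design.
```python
# Illustrative sketch only: sliding-window dynamic FC plus a generic
# masked-reconstruction SSL objective. Window size, mask ratio, and the
# encoder/decoder are hypothetical placeholders, not the paper's model.
import numpy as np
import torch
import torch.nn as nn


def dynamic_fc(roi_ts: np.ndarray, window: int = 30, stride: int = 5) -> np.ndarray:
    """roi_ts: (T, N) ROI time series -> (W, N, N) windowed correlation matrices."""
    T, N = roi_ts.shape
    mats = []
    for start in range(0, T - window + 1, stride):
        seg = roi_ts[start:start + window]            # (window, N)
        mats.append(np.corrcoef(seg, rowvar=False))   # (N, N) Pearson FC
    return np.stack(mats)                             # (W, N, N)


class MaskedFCReconstructor(nn.Module):
    """Toy generative SSL head: reconstruct masked FC entries from the rest."""

    def __init__(self, n_rois: int, hidden: int = 128):
        super().__init__()
        d = n_rois * n_rois
        self.encoder = nn.Sequential(nn.Linear(d, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, d)

    def forward(self, fc: torch.Tensor, mask_ratio: float = 0.3) -> torch.Tensor:
        flat = fc.reshape(fc.shape[0], -1)             # (batch, N*N)
        mask = torch.rand_like(flat) < mask_ratio      # entries to hide
        recon = self.decoder(self.encoder(flat.masked_fill(mask, 0.0)))
        # Loss only on masked entries: the generative pretext task.
        return ((recon - flat) ** 2)[mask].mean()


# Usage: 4 subjects, 200 time points, 116 ROIs (sizes chosen for illustration).
ts = np.random.randn(4, 200, 116)
dfc = np.stack([dynamic_fc(x) for x in ts])            # (4, W, 116, 116)
model = MaskedFCReconstructor(n_rois=116)
loss = model(torch.tensor(dfc[:, 0], dtype=torch.float32))  # first window per subject
print(float(loss))
```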
Related papers
- Sampling-guided Heterogeneous Graph Neural Network with Temporal Smoothing for Scalable Longitudinal Data Imputation [17.81217890585335]
We propose a novel framework, the Sampling-guided Heterogeneous Graph Neural Network (SHT-GNN), to tackle the challenge of missing data imputation.
By leveraging subject-wise mini-batch sampling and a multi-layer temporal smoothing mechanism, SHT-GNN efficiently scales to large datasets.
Experiments on both synthetic and real-world datasets, including the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset, demonstrate that SHT-GNN significantly outperforms existing imputation methods.
arXiv Detail & Related papers (2024-11-07T17:41:07Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
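The summary only says that data geometry is used to compute these information-theoretic measures reliably. A minimal sketch of that idea, assuming a Gaussian-affinity diffusion operator whose eigenvalue spectrum is treated as a probability distribution, is given below; the kernel bandwidth and normalization are illustrative and may differ from the paper's estimator.
```python
# Minimal sketch, not the paper's exact estimator: entropy of the eigenvalue
# spectrum of a diffusion operator built from pairwise data affinities.
import numpy as np


def diffusion_spectral_entropy(x: np.ndarray, sigma: float = 1.0, t: int = 1) -> float:
    """x: (n_samples, n_features); returns entropy of the diffused spectrum."""
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    k = np.exp(-d2 / (2 * sigma ** 2))                     # Gaussian affinities
    p = k / k.sum(axis=1, keepdims=True)                   # row-stochastic diffusion operator
    eigvals = np.abs(np.linalg.eigvals(np.linalg.matrix_power(p, t)))
    probs = eigvals / eigvals.sum()                        # normalize spectrum to a distribution
    probs = probs[probs > 1e-12]
    return float(-(probs * np.log(probs)).sum())           # Shannon entropy of the spectrum


print(diffusion_spectral_entropy(np.random.randn(100, 16)))
```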
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- Graph-enabled Reinforcement Learning for Time Series Forecasting with Adaptive Intelligence [11.249626785206003]
We propose a novel approach for predicting time-series data using Graph Neural Networks (GNNs) and monitoring with Reinforcement Learning (RL).
GNNs are able to explicitly incorporate the graph structure of the data into the model, allowing them to capture temporal dependencies in a more natural way.
This approach allows for more accurate predictions in complex temporal structures, such as those found in healthcare, traffic and weather forecasting.
arXiv Detail & Related papers (2023-09-18T22:25:12Z)
- Benchmarking Graph Neural Networks for FMRI analysis [0.0]
Graph Neural Networks (GNNs) have emerged as a powerful tool to learn from graph-structured data.
We study and evaluate the performance of five popular GNN architectures in diagnosing major depressive disorder and autism spectrum disorder.
We highlight that creating optimal graph structures for functional brain data is a major bottleneck hindering the performance of GNNs.
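The graph-construction bottleneck the authors highlight is, in practice, often addressed by thresholding or top-k sparsification of the FC (correlation) matrix. The sketch below shows the top-k variant; the value of k is an arbitrary illustrative choice, not a recommendation from the paper.
```python
# Illustrative sketch of one common way to build a graph from an FC matrix:
# keep the k strongest connections per node (top-k sparsification). The choice
# of k (or of a threshold) is exactly the design decision the benchmark flags.
import numpy as np


def fc_to_edges(fc: np.ndarray, k: int = 10):
    """fc: (N, N) correlation matrix -> list of (i, j, weight) edges."""
    n = fc.shape[0]
    a = np.abs(fc.copy())
    np.fill_diagonal(a, 0.0)                 # drop self-connections
    edges = []
    for i in range(n):
        for j in np.argsort(a[i])[-k:]:      # k strongest neighbours of node i
            edges.append((i, int(j), float(fc[i, j])))
    return edges


fc = np.corrcoef(np.random.randn(116, 200))  # toy FC from random ROI time series
print(len(fc_to_edges(fc, k=10)))            # 116 * 10 directed edges
```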
arXiv Detail & Related papers (2022-11-16T14:16:54Z)
- DynDepNet: Learning Time-Varying Dependency Structures from fMRI Data via Dynamic Graph Structure Learning [58.94034282469377]
We propose DynDepNet, a novel method for learning the optimal time-varying dependency structure of fMRI data induced by downstream prediction tasks.
Experiments on real-world fMRI datasets, for the task of sex classification, demonstrate that DynDepNet achieves state-of-the-art results.
arXiv Detail & Related papers (2022-09-27T16:32:11Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning [9.88508686848173]
Biological spiking neurons with intrinsic dynamics underlie the powerful representation and learning capabilities of the brain.
Despite recent tremendous progress in spiking neural networks (SNNs) for handling Euclidean-space tasks, it still remains challenging to exploit SNNs in processing non-Euclidean-space data.
Here we present a general spike-based modeling framework that enables the direct training of SNNs for graph learning.
arXiv Detail & Related papers (2021-06-30T11:20:16Z)
- Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
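The summary describes weights constrained by a Canonical/Polyadic (CP) decomposition acting directly on multilinear inputs. A hypothetical minimal version of that idea for a single hidden unit is sketched below; the rank R, the input shape, and the nonlinearity are illustrative assumptions rather than the paper's exact model.
```python
# Hypothetical minimal sketch of a CP-constrained (rank-R) unit acting on a
# multilinear input, in the spirit of the summary; not the paper's exact model.
import numpy as np


def rank_r_unit(x: np.ndarray, factors) -> float:
    """x: (I, J, K) input tensor; factors: [a (R, I), b (R, J), c (R, K)].

    Computes <x, W> where W = sum_r a_r (outer) b_r (outer) c_r, i.e. a
    CP-decomposed weight tensor, without ever materializing W.
    """
    a, b, c = factors
    out = 0.0
    for r in range(a.shape[0]):
        out += np.einsum("ijk,i,j,k->", x, a[r], b[r], c[r])
    return float(np.tanh(out))               # nonlinearity on the multilinear projection


# Toy hyperspectral-style patch: 5x5 spatial window with 32 spectral bands.
x = np.random.randn(5, 5, 32)
R = 3
factors = [np.random.randn(R, 5), np.random.randn(R, 5), np.random.randn(R, 32)]
print(rank_r_unit(x, factors))
```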
arXiv Detail & Related papers (2021-04-11T16:37:32Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Graph Neural Networks for Leveraging Industrial Equipment Structure: An application to Remaining Useful Life Estimation [21.297461316329453]
We propose to capture the structure of a complex equipment in the form of a graph, and use graph neural networks (GNNs) to model multi-sensor time-series data.
We observe that the proposed GNN-based RUL estimation model compares favorably to several strong baselines from literature such as those based on RNNs and CNNs.
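The summary gives the overall recipe, sensors as graph nodes carrying time-series features, rather than a concrete layer. A generic, hypothetical message-passing step of that kind is sketched below; the mean aggregation rule and feature sizes are illustrative and not taken from the paper.
```python
# Generic, illustrative message-passing step over a sensor graph: each node
# (sensor) carries features summarizing its time series, and neighbours are
# averaged in. Not the paper's architecture; feature sizes are arbitrary.
import numpy as np


def gnn_layer(h: np.ndarray, adj: np.ndarray, w: np.ndarray) -> np.ndarray:
    """h: (N, F) node features; adj: (N, N) adjacency; w: (F, F_out) weights."""
    deg = adj.sum(axis=1, keepdims=True) + 1.0          # +1 for the self-loop
    agg = (adj @ h + h) / deg                           # mean over neighbours and self
    return np.maximum(agg @ w, 0.0)                     # linear transform + ReLU


# Toy equipment graph: 8 sensors, each summarized by 16 time-series features.
adj = (np.random.rand(8, 8) > 0.7).astype(float)
adj = np.maximum(adj, adj.T)                            # make it undirected
np.fill_diagonal(adj, 0.0)
h = np.random.randn(8, 16)
print(gnn_layer(h, adj, np.random.randn(16, 32)).shape)  # (8, 32)
```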
arXiv Detail & Related papers (2020-06-30T06:38:08Z)