Predicting cognitive scores with graph neural networks through sample
selection learning
- URL: http://arxiv.org/abs/2106.09408v1
- Date: Thu, 17 Jun 2021 11:45:39 GMT
- Title: Predicting cognitive scores with graph neural networks through sample
selection learning
- Authors: Martin Hanik, Mehmet Arif Demirtaş, Mohammed Amine Gharsallaoui,
Islem Rekik
- Abstract summary: Functional brain connectomes are used to predict cognitive measures such as intelligence quotient (IQ) scores.
We design a novel regression GNN model (namely RegGNN) for predicting IQ scores from brain connectivity.
We also propose a learning-based sample selection method that learns how to choose the training samples with the highest expected predictive power.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Analyzing the relation between intelligence and neural activity is of the
utmost importance in understanding the working principles of the human brain in
health and disease. In existing literature, functional brain connectomes have
been used successfully to predict cognitive measures such as intelligence
quotient (IQ) scores in both healthy and disordered cohorts using machine
learning models. However, existing methods resort to flattening the brain
connectome (i.e., graph) through vectorization which overlooks its topological
properties. To address this limitation, and inspired by emerging graph
neural networks (GNNs), we design a novel regression GNN model (namely RegGNN)
for predicting IQ scores from brain connectivity. On top of that, we introduce
a novel, fully modular sample selection method to select the best samples to
learn from for our target prediction task. However, since such deep learning
architectures are computationally expensive to train, we further propose a
\emph{learning-based sample selection} method that learns how to choose the
training samples with the highest expected predictive power on unseen samples.
For this, we capitalize on the fact that connectomes (i.e., their adjacency
matrices) lie in the symmetric positive definite (SPD) matrix cone. Our results
on full-scale and verbal IQ prediction outperform comparison methods in autism
spectrum disorder cohorts and achieve competitive performance for
neurotypical subjects using 3-fold cross-validation. Furthermore, we show that
our sample selection approach generalizes to other learning-based methods,
demonstrating its usefulness beyond our GNN architecture.
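The abstract combines two ingredients: a regression GNN that operates directly on the connectome graph, and a sample selection step that exploits the fact that connectivity matrices lie in the SPD cone. The sketch below is a minimal, illustrative pairing of the two in plain PyTorch; it is not the authors' RegGNN implementation, and the names (SimpleRegGNN, log_euclidean_distance), the identity node features, the mean-pool readout, and the distance-to-cohort selection heuristic are assumptions made for exposition.

```python
# Illustrative sketch only (not the authors' RegGNN code): a tiny graph-convolution
# regressor over a connectome adjacency matrix, plus a log-Euclidean SPD distance
# that a sample-selection heuristic could use.
import torch
import torch.nn as nn


class SimpleRegGNN(nn.Module):
    """Map one weighted connectome matrix (n_rois x n_rois) to a scalar score."""

    def __init__(self, n_rois: int, hidden: int = 64):
        super().__init__()
        self.lin1 = nn.Linear(n_rois, hidden)  # node features are identity rows
        self.lin2 = nn.Linear(hidden, hidden)
        self.readout = nn.Linear(hidden, 1)    # graph-level prediction (e.g. IQ)

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.abs().sum(dim=-1).clamp(min=1e-6)       # weighted node degrees
        d = deg.pow(-0.5)
        a_norm = d.unsqueeze(-1) * adj * d.unsqueeze(-2)  # symmetric normalisation
        x = torch.eye(adj.shape[-1])                      # identity node features
        x = torch.relu(a_norm @ self.lin1(x))             # propagate + transform
        x = torch.relu(a_norm @ self.lin2(x))
        return self.readout(x.mean(dim=-2)).squeeze(-1)   # mean-pool over nodes


def log_euclidean_distance(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Distance between two SPD connectomes under the log-Euclidean metric."""
    def spd_log(m: torch.Tensor) -> torch.Tensor:
        vals, vecs = torch.linalg.eigh(m)
        return vecs @ torch.diag(vals.clamp(min=1e-6).log()) @ vecs.T
    return torch.linalg.norm(spd_log(a) - spd_log(b))


if __name__ == "__main__":
    n_rois, n_subjects = 35, 10
    mats = [m @ m.T + 1e-3 * torch.eye(n_rois)            # random SPD "connectomes"
            for m in torch.randn(n_subjects, n_rois, n_rois)]
    model = SimpleRegGNN(n_rois)
    print("predicted score:", model(mats[0]).item())
    # Crude selection heuristic: rank subjects by average distance to the cohort.
    avg_dist = torch.stack([sum(log_euclidean_distance(m, o) for o in mats) / n_subjects
                            for m in mats])
    print("most central subject:", int(avg_dist.argmin()))
```

In the paper the selection rule is itself learned rather than fixed; the log-Euclidean distance above only illustrates the SPD geometry that the abstract refers to.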
Related papers
- Towards a Foundation Model for Brain Age Prediction using coVariance
Neural Networks [102.75954614946258]
Increasing brain age with respect to chronological age can reflect increased vulnerability to neurodegeneration and cognitive decline.
NeuroVNN is pre-trained as a regression model on a healthy population to predict chronological age.
NeuroVNN adds anatomical interpretability to brain age and has a 'scale-free' characteristic that allows its transfer to datasets curated according to any arbitrary brain atlas.
arXiv Detail & Related papers (2024-02-12T14:46:31Z) - Neuroformer: Multimodal and Multitask Generative Pretraining for Brain Data [3.46029409929709]
State-of-the-art systems neuroscience experiments yield large-scale multimodal data, and these data sets require new tools for analysis.
Inspired by the success of large pretrained models in vision and language domains, we reframe the analysis of large-scale, cellular-resolution neuronal spiking data into an autoregressive generation problem.
We first trained Neuroformer on simulated datasets and found that it both accurately predicted simulated neuronal circuit activity and intrinsically inferred the underlying neural circuit connectivity, including direction.
arXiv Detail & Related papers (2023-10-31T20:17:32Z) - MBrain: A Multi-channel Self-Supervised Learning Framework for Brain
Signals [7.682832730967219]
We study the self-supervised learning framework for brain signals that can be applied to pre-train either SEEG or EEG data.
Inspired by this, we propose MBrain to learn implicit spatial and temporal correlations between different channels.
Our model outperforms several state-of-the-art time series SSL and unsupervised models, and has the ability to be deployed to clinical practice.
arXiv Detail & Related papers (2023-06-15T09:14:26Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Meta-RegGNN: Predicting Verbal and Full-Scale Intelligence Scores using
Graph Neural Networks and Meta-Learning [0.9137554315375922]
We propose a novel regression graph neural network trained through meta-learning, namely Meta-RegGNN, for predicting behavioral scores from brain connectomes.
Our results on verbal and full-scale intelligence quotient (IQ) prediction outperform existing methods in both neurotypical and autism spectrum disorder cohorts.
arXiv Detail & Related papers (2022-09-14T07:19:03Z) - Knowledge Enhanced Neural Networks for relational domains [83.9217787335878]
We focus on a specific method, KENN, a Neural-Symbolic architecture that injects prior logical knowledge into a neural network.
In this paper, we propose an extension of KENN for relational data.
arXiv Detail & Related papers (2022-05-31T13:00:34Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - One Representative-Shot Learning Using a Population-Driven Template with
Application to Brain Connectivity Classification and Evolution Prediction [0.0]
Graph neural networks (GNNs) have been introduced to the field of network neuroscience.
We take a very different approach in training GNNs, where we aim to learn with one sample and achieve the best performance.
We present the first one-shot paradigm where a GNN is trained on a single population-driven template.
arXiv Detail & Related papers (2021-10-06T08:36:00Z) - The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do and adjust their parameters based on how well the predictions matched reality; a toy version of this update is sketched at the end of this list.
arXiv Detail & Related papers (2020-12-07T01:20:38Z) - Neural Networks Enhancement with Logical Knowledge [83.9217787335878]
We propose an extension of KENN for relational data.
The results show that KENN is capable of improving the performance of the underlying neural network even in the presence of relational data.
arXiv Detail & Related papers (2020-09-13T21:12:20Z)
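As a rough illustration of the predictive-processing mechanism summarized in "The Neural Coding Framework for Learning Generative Models" above, the toy sketch below has one set of units predict the activity of another and nudges its weights in proportion to the prediction error. The two-layer setup, sizes, learning rate, and delta-rule update are illustrative assumptions, not details taken from that paper.

```python
# Toy "predict your neighbours and learn from the error" update (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
W_true = rng.normal(size=(16, 8))   # unknown process generating the observed activity
W = np.zeros((16, 8))               # model weights: top layer predicts bottom layer
lr = 0.01

for step in range(2000):
    top = rng.normal(size=8)        # activity of the predicting (latent) units
    bottom = W_true @ top           # observed activity of neighbouring units
    error = bottom - W @ top        # prediction error signal
    W += lr * np.outer(error, top)  # adjust parameters based on how well it predicted

print("remaining weight error:", np.linalg.norm(W - W_true))
```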