Machine Learning Based Multimodal Neuroimaging Genomics Dementia Score
for Predicting Future Conversion to Alzheimer's Disease
- URL: http://arxiv.org/abs/2203.05707v1
- Date: Fri, 11 Mar 2022 01:35:30 GMT
- Title: Machine Learning Based Multimodal Neuroimaging Genomics Dementia Score
for Predicting Future Conversion to Alzheimer's Disease
- Authors: Ghazal Mirabnahrazam, Da Ma, Sieun Lee, Karteek Popuri, Hyunwoo Lee,
Jiguo Cao, Lei Wang, James E Galvin, Mirza Faisal Beg, and the Alzheimer's
Disease Neuroimaging Initiative
- Abstract summary: We developed an image/genotype-based DAT score that represents a subject's likelihood of developing DAT in the future.
Using a pre-defined 0.5 threshold on DAT scores, we predicted whether or not a subject would develop DAT in the future.
- Score: 2.914776804701307
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Background: The increasing availability of databases containing both magnetic
resonance imaging (MRI) and genetic data allows researchers to utilize
multimodal data to better understand the characteristics of dementia of
Alzheimer's type (DAT). Objective: The goal of this study was to develop and
analyze novel biomarkers that can help predict the development and progression
of DAT. Methods: We used feature selection and an ensemble learning classifier to
develop an image/genotype-based DAT score that represents a subject's
likelihood of developing DAT in the future. Three feature types were used: MRI
only, genetic only, and combined multimodal data. We used a novel data
stratification method to better represent different stages of DAT. Using a
pre-defined 0.5 threshold on DAT scores, we predicted whether or not a subject
would develop DAT in the future. Results: Our results on the Alzheimer's Disease
Neuroimaging Initiative (ADNI) database showed that dementia scores using
genetic data could better predict future DAT progression for currently normal
control subjects (Accuracy=0.857) compared to MRI (Accuracy=0.143), while MRI
can better characterize subjects with stable mild cognitive impairment
(Accuracy=0.614) compared to genetics (Accuracy=0.356). Combining MRI and
genetic data showed improved classification performance in the remaining
stratified groups. Conclusion: MRI and genetic data can contribute to DAT
prediction in different ways. MRI data reflects anatomical changes in the
brain, while genetic data can detect the risk of DAT progression prior to
symptomatic onset. Combining information from multimodal data appropriately can
improve prediction performance.
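As a rough illustration of the Methods described above, the sketch below performs feature selection and fits an ensemble classifier whose predicted probability serves as the DAT score, thresholded at the pre-defined 0.5. The scikit-learn components, feature counts, and randomly generated placeholder data are assumptions for illustration only, not the authors' pipeline.

```python
# Hypothetical sketch: feature selection + ensemble classifier producing a
# DAT score (predicted probability of future conversion), thresholded at 0.5.
# Feature counts and placeholder data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_subjects, n_mri, n_genetic = 400, 60, 40

X_mri = rng.normal(size=(n_subjects, n_mri))      # stand-in MRI features
X_gen = rng.normal(size=(n_subjects, n_genetic))  # stand-in genetic features
X = np.hstack([X_mri, X_gen])                     # combined multimodal features
y = rng.integers(0, 2, size=n_subjects)           # 1 = develops DAT in the future

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=30),                              # feature selection
    RandomForestClassifier(n_estimators=200, random_state=0),  # ensemble learner
)
model.fit(X_tr, y_tr)

dat_score = model.predict_proba(X_te)[:, 1]  # likelihood of developing DAT
will_convert = dat_score >= 0.5              # pre-defined 0.5 threshold
print(will_convert[:10])
```

Restricting the columns to the MRI block or the genetic block reproduces the three feature configurations (MRI only, genetic only, combined) compared in the abstract.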
Related papers
- Individualized multi-horizon MRI trajectory prediction for Alzheimer's Disease [0.0]
We train a novel architecture to build a latent space distribution that can be sampled to generate future predictions of changing anatomy.
By comparing to several alternatives, we show that our model produces more individualized images with higher resolution.
arXiv Detail & Related papers (2024-08-04T13:09:06Z)
- GFE-Mamba: Mamba-based AD Multi-modal Progression Assessment via Generative Feature Extraction from MCI [5.355943545567233]
Alzheimer's Disease (AD) is an irreversible neurodegenerative disorder that often progresses from Mild Cognitive Impairment (MCI).
We introduce GFE-Mamba, a classifier based on Generative Feature Extraction (GFE).
It integrates data from assessment scales, MRI, and PET, enabling deeper multimodal fusion.
Our experimental results demonstrate that the GFE-Mamba model is effective in predicting the conversion from MCI to AD.
arXiv Detail & Related papers (2024-07-22T15:22:33Z)
- A Demographic-Conditioned Variational Autoencoder for fMRI Distribution Sampling and Removal of Confounds [49.34500499203579]
We create a variational autoencoder (VAE)-based model, DemoVAE, to decorrelate fMRI features from demographics.
We generate high-quality synthetic fMRI data based on user-supplied demographics (a minimal conditional-VAE sketch appears after this list).
arXiv Detail & Related papers (2024-05-13T17:49:20Z)
- Genetic InfoMax: Exploring Mutual Information Maximization in High-Dimensional Imaging Genetics Studies [50.11449968854487]
Genome-wide association studies (GWAS) are used to identify relationships between genetic variations and specific traits.
Representation learning for imaging genetics is largely under-explored due to the unique challenges posed by GWAS.
We introduce a trans-modal learning framework Genetic InfoMax (GIM) to address the specific challenges of GWAS.
arXiv Detail & Related papers (2023-09-26T03:59:21Z)
- Source-Free Collaborative Domain Adaptation via Multi-Perspective Feature Enrichment for Functional MRI Analysis [55.03872260158717]
Resting-state functional MRI (rs-fMRI) is increasingly employed in multi-site research to aid neurological disorder analysis.
Many methods have been proposed to reduce fMRI heterogeneity between source and target domains, but acquiring source data is challenging due to privacy concerns and/or data storage burdens in multi-site studies.
We design a source-free collaborative domain adaptation framework for fMRI analysis, where only a pretrained source model and unlabeled target data are accessible.
arXiv Detail & Related papers (2023-08-24T01:30:18Z)
- Copy Number Variation Informs fMRI-based Prediction of Autism Spectrum Disorder [9.544191399458954]
We develop a more integrative model for combining genetic, demographic, and neuroimaging data.
Inspired by the influence of genotype on phenotype, we propose using an attention-based approach.
We evaluate the proposed approach on ASD classification and severity prediction tasks, using a sex-balanced dataset of 228 ASD (a genotype-guided attention sketch appears after this list).
arXiv Detail & Related papers (2023-08-08T19:53:43Z)
- Incomplete Multimodal Learning for Complex Brain Disorders Prediction [65.95783479249745]
We propose a new incomplete multimodal data integration approach that employs transformers and generative adversarial networks.
We apply our new method to predict cognitive degeneration and disease outcomes using the multimodal imaging genetic data from Alzheimer's Disease Neuroimaging Initiative cohort.
arXiv Detail & Related papers (2023-05-25T16:29:16Z)
- Predicting Time-to-conversion for Dementia of Alzheimer's Type using Multi-modal Deep Survival Analysis [2.914776804701307]
We used 401 subjects with 63 features from MRI, genetic, and CDC data modalities in the Alzheimer's Disease Neuroimaging Initiative database.
Our findings showed that genetic features contributed the least to survival analysis, while CDC features contributed the most.
arXiv Detail & Related papers (2022-05-02T20:10:10Z)
- G-MIND: An End-to-End Multimodal Imaging-Genetics Framework for Biomarker Identification and Disease Classification [49.53651166356737]
We propose a novel deep neural network architecture to integrate imaging and genetics data, as guided by diagnosis, that provides interpretable biomarkers.
We have evaluated our model on a population study of schizophrenia that includes two functional MRI (fMRI) paradigms and Single Nucleotide Polymorphism (SNP) data.
arXiv Detail & Related papers (2021-01-27T19:28:04Z)
- Fader Networks for domain adaptation on fMRI: ABIDE-II study [68.5481471934606]
We use 3D convolutional autoencoders to build a domain-irrelevant latent space image representation and demonstrate that this method outperforms existing approaches on ABIDE data.
arXiv Detail & Related papers (2020-10-14T16:50:50Z)
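The DemoVAE entry above describes decorrelating fMRI features from demographics with a VAE and sampling synthetic data for user-supplied demographics. The following PyTorch sketch shows a generic demographic-conditioned VAE under assumed feature, demographic, and latent dimensions; it is not the authors' DemoVAE implementation.

```python
# Minimal sketch of a demographic-conditioned VAE: the decoder receives
# demographic covariates concatenated to the latent code, so sampling with
# chosen demographics yields synthetic feature vectors. Dimensions, losses,
# and data are illustrative assumptions only.
import torch
import torch.nn as nn

class CondVAE(nn.Module):
    def __init__(self, n_features=100, n_demo=3, n_latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        self.mu = nn.Linear(64, n_latent)
        self.logvar = nn.Linear(64, n_latent)
        # Decoder conditioned on demographics concatenated to the latent code.
        self.dec = nn.Sequential(
            nn.Linear(n_latent + n_demo, 64), nn.ReLU(), nn.Linear(64, n_features)
        )

    def forward(self, x, demo):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        return self.dec(torch.cat([z, demo], dim=1)), mu, logvar

# One training step on random stand-in data (fMRI-derived features + demographics).
model = CondVAE()
x = torch.randn(32, 100)   # placeholder fMRI-derived features
demo = torch.randn(32, 3)  # placeholder demographic covariates (e.g. age, sex)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
opt.zero_grad()
recon, mu, logvar = model(x, demo)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
loss = nn.functional.mse_loss(recon, x) + 1e-2 * kl
loss.backward()
opt.step()

# Sampling synthetic data for user-supplied demographics.
with torch.no_grad():
    z = torch.randn(5, 16)
    synthetic = model.dec(torch.cat([z, torch.randn(5, 3)], dim=1))
```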
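The copy number variation entry above proposes an attention-based combination of genetic and neuroimaging data, motivated by genotype influencing phenotype. Below is a hypothetical genotype-as-query attention module in PyTorch; the dimensions, head count, and classification head are illustrative assumptions rather than that paper's architecture.

```python
# Hypothetical genotype-guided attention over imaging features: the genotype
# embedding acts as the attention query, weighting imaging feature tokens.
import torch
import torch.nn as nn

class GenotypeAttentionFusion(nn.Module):
    def __init__(self, n_img=200, n_gen=32, d=64):
        super().__init__()
        self.img_proj = nn.Linear(1, d)      # embed each imaging feature as a token
        self.gen_proj = nn.Linear(n_gen, d)  # embed the genotype vector as the query
        self.attn = nn.MultiheadAttention(d, num_heads=4, batch_first=True)
        self.head = nn.Linear(d, 1)          # e.g. ASD vs. control logit

    def forward(self, img, gen):
        tokens = self.img_proj(img.unsqueeze(-1))  # (B, n_img, d)
        query = self.gen_proj(gen).unsqueeze(1)    # (B, 1, d)
        fused, weights = self.attn(query, tokens, tokens)
        return self.head(fused.squeeze(1)), weights

model = GenotypeAttentionFusion()
logit, attn_weights = model(torch.randn(8, 200), torch.randn(8, 32))
print(logit.shape, attn_weights.shape)  # (8, 1) and (8, 1, 200)
```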