Temporal Dynamic Model for Resting State fMRI Data: A Neural Ordinary
Differential Equation approach
- URL: http://arxiv.org/abs/2011.08146v1
- Date: Mon, 16 Nov 2020 18:16:19 GMT
- Title: Temporal Dynamic Model for Resting State fMRI Data: A Neural Ordinary
Differential Equation approach
- Authors: Zheyu Wen
- Abstract summary: The objective of this paper is to provide a temporal dynamic model for resting state functional Magnetic Resonance Imaging (fMRI) trajectories, predicting future brain images from a given sequence.
To this end, we propose a model that combines representation learning with a Neural Ordinary Differential Equation (Neural ODE): it compresses the fMRI image data into a latent representation and learns to predict the trajectory by following a differential equation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The objective of this paper is to provide a temporal dynamic model for
resting state functional Magnetic Resonance Imaging (fMRI) trajectories that
predicts future brain images from a given sequence. To this end, we propose a
model that combines representation learning with a Neural Ordinary Differential
Equation (Neural ODE): it compresses the fMRI image data into a latent
representation and learns to predict the trajectory by following a differential
equation. The latent space was analyzed with a Gaussian Mixture Model. The
learned fMRI trajectory embedding can be used to explain the variance of the
trajectory and to predict human traits for each subject. The method achieves an
average spatial correlation of 0.5 over the whole predicted trajectory and
provides trained ODE parameters for further analysis.
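The pipeline the abstract describes (encode frames into a latent space, integrate a learned ODE forward in time, decode each latent state back into an image) can be sketched as follows. All names and dimensions are hypothetical, and random linear maps stand in for the trained encoder, decoder, and Neural ODE vector field; this is an illustration of the mechanism, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: flattened fMRI frame -> low-dimensional latent.
IMG_DIM, LATENT_DIM = 64, 4

# Stand-ins for a trained encoder/decoder (linear here for illustration;
# the paper learns these with a representation network).
W_enc = rng.standard_normal((LATENT_DIM, IMG_DIM)) / np.sqrt(IMG_DIM)
W_dec = rng.standard_normal((IMG_DIM, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def encode(x):   # image -> latent representation
    return W_enc @ x

def decode(z):   # latent representation -> image
    return W_dec @ z

# Latent vector field f(z) = A z, a hypothetical stand-in for the
# trained Neural ODE network.
A = -0.1 * np.eye(LATENT_DIM)

def ode_rhs(z):
    return A @ z

def predict_trajectory(x0, n_steps, dt=0.1):
    """Encode the first frame, integrate the latent ODE forward with
    simple Euler steps, and decode each latent state into an image."""
    z = encode(x0)
    frames = []
    for _ in range(n_steps):
        z = z + dt * ode_rhs(z)   # Euler step of dz/dt = f(z)
        frames.append(decode(z))
    return np.stack(frames)

x0 = rng.standard_normal(IMG_DIM)
traj = predict_trajectory(x0, n_steps=10)
print(traj.shape)  # (10, 64)
```

In practice the Euler loop would be replaced by an adaptive ODE solver and the linear maps by trained networks; the contracting choice of `A` just makes the predicted frames decay smoothly for the demonstration.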
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Neural Ordinary Differential Equation based Sequential Image Registration for Dynamic Characterization [13.492983263194636]
This extension work discusses how this framework can aid in the characterization of sequential biological processes.
Our framework considers voxels as particles within a dynamic system, defining deformation fields through the integration of neural differential equations.
We evaluated our framework on two clinical datasets: one for cardiac motion tracking and another for longitudinal brain MRI analysis.
arXiv Detail & Related papers (2024-04-02T17:04:45Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories rather than only next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Neural Spherical Harmonics for structurally coherent continuous
representation of diffusion MRI signal [0.3277163122167433]
We present a novel way to model diffusion magnetic resonance imaging (dMRI) datasets that benefits from the structural coherence of the human brain.
Current methods model the dMRI signal in individual voxels, disregarding the intervoxel coherence that is present.
We use a neural network to parameterize a spherical harmonics series to represent the dMRI signal of a single subject from the Human Connectome Project dataset.
arXiv Detail & Related papers (2023-08-16T08:28:01Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
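The idea of recovering unknown parameters by differentiating through a fitted forward model can be illustrated with a toy example. Here a one-parameter Gaussian profile stands in for the trained differentiable surrogate, and an analytic gradient plays the role of automatic differentiation; the model form, the parameter `w`, and the learning rate are all hypothetical choices for the sketch, not the paper's setup.

```python
import numpy as np

# Hypothetical forward model: intensity I(q) = exp(-q^2 / (2 w^2)),
# a stand-in for a differentiable model with one unknown parameter w.
def model(q, w):
    return np.exp(-q**2 / (2 * w**2))

q = np.linspace(0.1, 3.0, 200)
w_true = 0.8
data = model(q, w_true)          # "experimental" data (noise-free here)

# Analytic gradient of the mean squared error w.r.t. w, playing the
# role of automatic differentiation through the trained model.
def loss_grad(w):
    r = model(q, w) - data
    dI_dw = model(q, w) * q**2 / w**3
    return 2 * np.mean(r * dI_dw)

w = 1.5                          # initial guess
for _ in range(500):
    w -= 0.1 * loss_grad(w)      # plain gradient descent
print(round(w, 3))
```

With an autodiff framework the gradient would be computed automatically from the model definition, which is what makes the "build and train once, apply in real time" workflow possible.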
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Lower Bounds (ELBOs) for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Moment evolution equations and moment matching for stochastic image
EPDiff [68.97335984455059]
Models of image deformation allow study of time-continuous effects transforming images by deforming the image domain.
Applications include medical image analysis with both population trends and random subject specific variation.
We use moment approximations of the corresponding Ito diffusion to construct estimators for statistical inference of the parameters of the full model.
arXiv Detail & Related papers (2021-10-07T11:08:11Z) - Latent linear dynamics in spatiotemporal medical data [0.0]
We present an unsupervised model that identifies the underlying dynamics of the system based only on the sequential images.
The model maps the input to a low-dimensional latent space wherein a linear relationship holds between a hidden state process and the observed latent process.
Knowledge of the system dynamics enables denoising, imputation of missing values and extrapolation of future image frames.
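A minimal sketch of such a linear latent model: given a latent state sequence, the transition matrix can be estimated by least squares on consecutive states and then iterated to extrapolate future frames. The dynamics matrix and dimensions below are hypothetical stand-ins, and the mapping from images to latent states (which the paper learns) is omitted.

```python
import numpy as np

# Hypothetical ground-truth linear latent dynamics z_{t+1} = A z_t
# (a damped rotation), standing in for the hidden-state process.
A_true = np.array([[0.9, -0.2],
                   [0.2,  0.9]])

# Simulate an observed latent sequence; in the paper this would come
# from mapping sequential images into the low-dimensional latent space.
T = 50
Z = np.zeros((T, 2))
Z[0] = [1.0, 0.0]
for t in range(T - 1):
    Z[t + 1] = A_true @ Z[t]

# Estimate the transition matrix by least squares on consecutive pairs:
# solve Z[1:] ~= Z[:-1] @ A^T.
A_hat, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)
A_hat = A_hat.T

# Extrapolate future latent states beyond the observed sequence.
z = Z[-1]
future = []
for _ in range(5):
    z = A_hat @ z
    future.append(z)
future = np.stack(future)
```

Because the learned dynamics are linear, the same machinery supports denoising (projecting observations onto the model), imputation (filling gaps by propagating the state), and extrapolation, as the summary notes.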
arXiv Detail & Related papers (2021-03-01T11:42:21Z) - Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both the deterministic and stochastic versions of the same model.
However, the improvements from data augmentation completely eliminate the empirical regularization gains, making the performance difference between neural ODE and neural SDE negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.