NPBDREG: A Non-parametric Bayesian Deep-Learning Based Approach for
Diffeomorphic Brain MRI Registration
- URL: http://arxiv.org/abs/2108.06771v1
- Date: Sun, 15 Aug 2021 16:00:06 GMT
- Title: NPBDREG: A Non-parametric Bayesian Deep-Learning Based Approach for
Diffeomorphic Brain MRI Registration
- Authors: Samah Khawaled, Moti Freiman
- Abstract summary: NPBDREG is a non-parametric Bayesian framework for unsupervised deformable image registration.
It provides improved uncertainty estimates and confidence measures in a theoretically well-founded and computationally efficient way.
It shows a slight improvement in the registration accuracy compared to PrVXM.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantification of uncertainty in deep-neural-networks (DNN) based image
registration algorithms plays an important role in the safe deployment of
real-world medical applications and research-oriented processing pipelines, and
in improving generalization capabilities. Currently available approaches for
uncertainty estimation, including the variational encoder-decoder architecture
and the inference-time dropout approach, require specific network architectures
and assume parametric distribution of the latent space which may result in
sub-optimal characterization of the posterior distribution for the predicted
deformation-fields. We introduce the NPBDREG, a fully non-parametric Bayesian
framework for unsupervised DNN-based deformable image registration by combining
an \texttt{Adam} optimizer with stochastic gradient Langevin dynamics (SGLD) to
characterize the true posterior distribution through posterior sampling. The
NPBDREG provides a principled non-parametric way to characterize the true
posterior distribution, thus providing improved uncertainty estimates and
confidence measures in a theoretically well-founded and computationally
efficient way. We demonstrated the added-value of NPBDREG, compared to the
baseline probabilistic \texttt{VoxelMorph} unsupervised model (PrVXM), on brain
MRI image registration using $390$ image pairs from four publicly available
databases: MGH10, CUMC12, IBSR18 and LPBA40. The NPBDREG shows a slight
improvement in the registration accuracy compared to PrVXM (Dice score of
$0.73$ vs. $0.68$, $p \ll 0.01$), a better generalization capability for data
corrupted by a mixed structure noise (e.g., Dice score of $0.729$ vs. $0.686$ for
$\alpha=0.2$) and, most importantly, a significantly better correlation of the
predicted uncertainty with out-of-distribution data ($r>0.95$ vs. $r<0.5$).
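The core idea, coupling an Adam-style preconditioned update with SGLD noise injection so that training iterates sample the posterior over network weights, can be sketched as follows. This is a minimal illustration under assumed details (the `temperature` knob and the exact preconditioning of the noise are my assumptions), not the authors' implementation:

```python
import numpy as np

def sgld_adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                   eps=1e-8, temperature=1.0):
    """One Adam-preconditioned SGLD update (hypothetical sketch).

    Standard Adam moment estimates drive the descent direction; the added
    Gaussian noise, scaled by the step size and preconditioner, turns the
    optimizer into an (approximate) posterior sampler instead of a point
    estimator."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    precond = 1.0 / (np.sqrt(v_hat) + eps)
    # SGLD noise: variance proportional to 2 * lr * temperature * preconditioner
    noise = np.random.randn(*theta.shape) * np.sqrt(2 * lr * temperature * precond)
    theta = theta - lr * precond * m_hat + noise
    return theta, m, v
```

Collecting the iterates `theta` over the late phase of training yields posterior samples; their pixel-wise spread over predicted deformation fields gives the non-parametric uncertainty estimate described in the abstract.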
Related papers
- Rényi Neural Processes [14.11793373584558]
We propose Rényi Neural Processes (RNP) to ameliorate the impacts of prior misspecification.
We scale the density ratio $\frac{p}{q}$ by the power of $(1-\alpha)$ in the divergence gradients with respect to the posterior.
Our experiments show consistent log-likelihood improvements over state-of-the-art NP family models.
arXiv Detail & Related papers (2024-05-25T00:14:55Z) - NPB-REC: A Non-parametric Bayesian Deep-learning Approach for Undersampled MRI Reconstruction with Uncertainty Estimation [2.6089354079273512]
"NPB-REC" is a non-parametric framework for MRI reconstruction from undersampled data with uncertainty estimation.
We use stochastic gradient Langevin dynamics (SGLD) during training to characterize the posterior distribution of the network parameters.
Our approach outperforms the baseline in terms of reconstruction accuracy by means of PSNR and SSIM.
arXiv Detail & Related papers (2024-04-06T08:25:33Z) - Constraining cosmological parameters from N-body simulations with
Variational Bayesian Neural Networks [0.0]
Multiplicative normalizing flows (MNFs) are a family of approximate posteriors for the parameters of BNNs.
We have compared MNFs with respect to the standard BNNs, and the flipout estimator.
MNFs provide more realistic predictive distributions, closer to the true posterior, mitigating the bias introduced by the variational approximation.
arXiv Detail & Related papers (2023-01-09T16:07:48Z) - DR-DSGD: A Distributionally Robust Decentralized Learning Algorithm over
Graphs [54.08445874064361]
We propose to solve a regularized distributionally robust learning problem in the decentralized setting.
By adding a Kullback-Leibler regularization function to the robust min-max optimization problem, the learning problem can be reduced to a modified robust problem.
We show that our proposed algorithm can improve the worst distribution test accuracy by up to $10\%$.
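A useful property behind this reduction is that a KL-regularized worst-case objective over group losses has a closed form: it becomes a smoothed (log-sum-exp) maximum of the per-group losses. A small sketch of that identity, with the regularization weight `lam` as an assumed parameter:

```python
import numpy as np

def kl_dro_loss(losses, lam=1.0):
    """KL-regularized distributionally robust objective (sketch).

    max_q  sum_i q_i * L_i - lam * KL(q || uniform)
    has the closed form  lam * log(mean_i exp(L_i / lam)),
    a smoothed maximum of the per-group losses. Small lam -> hard max,
    large lam -> plain average."""
    z = losses / lam
    z_max = z.max()  # subtract the max for numerical stability
    return lam * (z_max + np.log(np.mean(np.exp(z - z_max))))
```

As `lam` shrinks the objective concentrates on the worst group, which is the mechanism by which such methods improve worst-distribution accuracy.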
arXiv Detail & Related papers (2022-08-29T18:01:42Z) - NPB-REC: Non-parametric Assessment of Uncertainty in Deep-learning-based
MRI Reconstruction from Undersampled Data [0.0]
Uncertainty quantification in deep-learning (DL) based image reconstruction models is critical for reliable clinical decision making.
We introduce "NPB-REC", a non-parametric framework for uncertainty assessment in MRI reconstruction from undersampled "k-space" data.
arXiv Detail & Related papers (2022-08-08T08:22:25Z) - Normalized/Clipped SGD with Perturbation for Differentially Private
Non-Convex Optimization [94.06564567766475]
DP-SGD and DP-NSGD mitigate the risk of large models memorizing sensitive training data.
We show that these two algorithms achieve similar best accuracy while DP-NSGD is comparatively easier to tune than DP-SGD.
arXiv Detail & Related papers (2022-06-27T03:45:02Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Test-time Batch Statistics Calibration for Covariate Shift [66.7044675981449]
We propose to adapt the deep models to the novel environment during inference.
We present a general formulation, $\alpha$-BN, to calibrate the batch statistics.
We also present a novel loss function to form a unified test time adaptation framework Core.
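The batch-statistics calibration idea can be sketched as a convex blend of stored source-domain normalization statistics with the current test batch's statistics. This is an illustrative sketch of the general idea, not the paper's exact formulation; the blending weight `alpha` and per-feature layout are assumptions:

```python
import numpy as np

def alpha_bn(x, source_mean, source_var, alpha=0.9, eps=1e-5):
    """Hypothetical alpha-BN sketch: normalize a test batch with a blend of
    source-domain batch-norm statistics and the test batch's own statistics.

    alpha=1 recovers standard inference with frozen statistics;
    alpha=0 re-normalizes purely with the test batch (full adaptation)."""
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)
    mean = alpha * source_mean + (1 - alpha) * batch_mean
    var = alpha * source_var + (1 - alpha) * batch_var
    return (x - mean) / np.sqrt(var + eps)
```

Intermediate values of `alpha` trade off stability on in-distribution data against adaptation under covariate shift.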
arXiv Detail & Related papers (2021-10-06T08:45:03Z) - What Are Bayesian Neural Network Posteriors Really Like? [63.950151520585024]
We show that Hamiltonian Monte Carlo can achieve significant performance gains over standard and deep ensembles.
We also show that deep ensemble distributions are similarly close to HMC as standard SGLD, and closer than standard variational inference.
arXiv Detail & Related papers (2021-04-29T15:38:46Z) - Unsupervised Deep-Learning Based Deformable Image Registration: A
Bayesian Framework [0.0]
We introduce a fully Bayesian framework for unsupervised DL-based deformable image registration.
Our method provides a way to characterize the true posterior distribution, thus, avoiding potential over-fitting.
Our approach provides better estimates of the deformation field by means of improved mean-squared-error.
arXiv Detail & Related papers (2020-08-10T08:15:49Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.