Uncertainty quantification in non-rigid image registration via
stochastic gradient Markov chain Monte Carlo
- URL: http://arxiv.org/abs/2110.13289v1
- Date: Mon, 25 Oct 2021 22:05:20 GMT
- Authors: Daniel Grzech, Mohammad Farid Azampour, Huaqi Qiu, Ben Glocker,
Bernhard Kainz, Loïc Le Folgoc
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop a new Bayesian model for non-rigid registration of
three-dimensional medical images, with a focus on uncertainty quantification.
Probabilistic registration of large images with calibrated uncertainty
estimates is difficult for both computational and modelling reasons. To address
the computational issues, we explore connections between the Markov chain Monte
Carlo by backpropagation and the variational inference by backpropagation
frameworks, in order to efficiently draw samples from the posterior
distribution of transformation parameters. To address the modelling issues, we
formulate a Bayesian model for image registration that overcomes the existing
barriers when using a dense, high-dimensional, and diffeomorphic transformation
parametrisation. This results in improved calibration of uncertainty estimates.
We compare the model in terms of both image registration accuracy and
uncertainty quantification to VoxelMorph, a state-of-the-art image registration
model based on deep learning.
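The "Markov chain Monte Carlo by backpropagation" framework the abstract refers to is built around stochastic gradient Langevin dynamics (SGLD): a gradient step on the log-posterior plus injected Gaussian noise scaled to the step size. A minimal NumPy sketch on a toy 1-D posterior, not the paper's transformation parameters; the function names and step size here are illustrative:

```python
import numpy as np

def sgld_sample(grad_log_post, theta0, step=1e-2, n_steps=2000, rng=None):
    """Draw approximate posterior samples via stochastic gradient Langevin
    dynamics: half a gradient step on the log-posterior, then Gaussian
    noise with variance equal to the step size."""
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float).copy()
    samples = []
    for _ in range(n_steps):
        theta += 0.5 * step * grad_log_post(theta)   # drift toward high posterior density
        theta += np.sqrt(step) * rng.standard_normal(theta.shape)  # exploration noise
        samples.append(theta.copy())
    return np.array(samples)

# Toy target: a Gaussian posterior N(2, 0.5^2), whose log-density
# gradient is -(theta - 2) / 0.5**2.
chain = sgld_sample(lambda t: -(t - 2.0) / 0.25, theta0=np.zeros(1), rng=0)
burn_in = chain[500:]  # discard early iterates before the chain mixes
```

After burn-in, the sample mean and standard deviation approach the target's (2 and 0.5); in the paper's setting, `theta` would instead be a high-dimensional transformation parametrisation and the gradient would come from backpropagation through the registration model.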
Related papers
- Hierarchical uncertainty estimation for learning-based registration in neuroimaging [10.964653898591413]
We propose a principled way to propagate uncertainties (epistemic or aleatoric) estimated at the level of individual spatial locations.
Experiments show that uncertainty-aware fitting of transformations improves the registration accuracy of brain MRI scans.
arXiv Detail & Related papers (2024-10-11T23:12:16Z) - Calibrated Cache Model for Few-Shot Vision-Language Model Adaptation [36.45488536471859]
A similarity term refines the image-image similarity by using unlabeled images.
A weight term introduces a precision matrix into the weight function to adequately model the relations between training samples.
To reduce the high complexity of Gaussian processes (GPs), we propose a group-based learning strategy.
arXiv Detail & Related papers (2024-10-11T15:12:30Z) - On the Quantification of Image Reconstruction Uncertainty without
Training Data [5.057039869893053]
We propose a deep variational framework that leverages a deep generative model to learn an approximate posterior distribution.
We parameterize the target posterior using a flow-based model and minimize their Kullback-Leibler (KL) divergence to achieve accurate uncertainty estimation.
Our results indicate that our method provides reliable and high-quality image reconstruction with robust uncertainty estimation.
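The entry above fits a flow-based approximate posterior by minimizing its Kullback-Leibler divergence from the target. As a hedged illustration (using closed-form Gaussians as a stand-in for the flow model; all names here are illustrative), the Monte Carlo estimator of that reverse KL can be sketched as:

```python
import numpy as np

def reverse_kl_estimate(sample_q, log_q, log_p, n=10_000, rng=None):
    """Monte Carlo estimate of KL(q || p) = E_q[log q(z) - log p(z)],
    the objective minimized when fitting an approximate posterior q."""
    rng = np.random.default_rng(rng)
    z = sample_q(n, rng)
    return np.mean(log_q(z) - log_p(z))

# Check against the closed form for two 1-D Gaussians:
# KL(N(m1,s1^2) || N(m2,s2^2)) = log(s2/s1) + (s1^2 + (m1-m2)^2)/(2 s2^2) - 1/2
m1, s1, m2, s2 = 0.0, 1.0, 1.0, 2.0
log_norm = lambda z, m, s: -0.5 * ((z - m) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))
est = reverse_kl_estimate(
    lambda n, rng: m1 + s1 * rng.standard_normal(n),
    lambda z: log_norm(z, m1, s1),
    lambda z: log_norm(z, m2, s2),
    rng=0,
)
closed = np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5
```

In the paper's framework, `log_q` would come from the flow's change-of-variables formula and the estimate would be differentiated through the reparameterized samples.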
arXiv Detail & Related papers (2023-11-16T07:46:47Z) - Regularized Vector Quantization for Tokenized Image Synthesis [126.96880843754066]
Quantizing images into discrete representations has been a fundamental problem in unified generative modeling.
Deterministic quantization suffers from severe codebook collapse and misalignment with the inference stage, while stochastic quantization suffers from low codebook utilization and a perturbed reconstruction objective.
This paper presents a regularized vector quantization framework that mitigates the above issues effectively by applying regularization from two perspectives.
arXiv Detail & Related papers (2023-03-11T15:20:54Z) - Masked Images Are Counterfactual Samples for Robust Fine-tuning [77.82348472169335]
Fine-tuning deep learning models can lead to a trade-off between in-distribution (ID) performance and out-of-distribution (OOD) robustness.
We propose a novel fine-tuning method, which uses masked images as counterfactual samples that help improve the robustness of the fine-tuning model.
arXiv Detail & Related papers (2023-03-06T11:51:28Z) - Theoretical characterization of uncertainty in high-dimensional linear
classification [24.073221004661427]
We show that uncertainty in learning from a limited number of samples of high-dimensional input data and labels can be obtained by the approximate message passing algorithm.
We discuss how over-confidence can be mitigated by appropriately regularising, and show that cross-validating with respect to the loss leads to better calibration than with the 0/1 error.
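The calibration comparison above can be made concrete with the expected calibration error (ECE), a standard metric for the over-confidence being discussed; this binary-classification sketch is illustrative, not the authors' code:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE for binary classification: bin predictions by confidence and
    average |accuracy - confidence| per bin, weighted by bin size."""
    probs, labels = np.asarray(probs), np.asarray(labels)
    conf = np.maximum(probs, 1 - probs)          # confidence in the predicted class
    pred = (probs >= 0.5).astype(int)
    correct = (pred == labels).astype(float)
    bins = np.minimum((conf * n_bins).astype(int), n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece

rng = np.random.default_rng(0)
# Calibrated predictor: labels actually follow the predicted probabilities.
p_cal = rng.uniform(size=5000)
y_cal = (rng.uniform(size=5000) < p_cal).astype(int)
ece_cal = expected_calibration_error(p_cal, y_cal)
# Over-confident predictor: 99% confidence but only 70% accuracy.
p_over = np.full(5000, 0.99)
y_over = (rng.uniform(size=5000) < 0.7).astype(int)
ece_over = expected_calibration_error(p_over, y_over)
```

The calibrated predictor's ECE is near zero while the over-confident one's is large, which is the gap that regularisation and loss-based cross-validation aim to close.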
arXiv Detail & Related papers (2022-02-07T15:32:07Z) - A Model for Multi-View Residual Covariances based on Perspective
Deformation [88.21738020902411]
We derive a model for the covariance of the visual residuals in multi-view SfM, odometry and SLAM setups.
We validate our model with synthetic and real data and integrate it into photometric and feature-based Bundle Adjustment.
arXiv Detail & Related papers (2022-02-01T21:21:56Z) - Moment evolution equations and moment matching for stochastic image
EPDiff [68.97335984455059]
Models of image deformation allow study of time-continuous effects transforming images by deforming the image domain.
Applications include medical image analysis with both population trends and random subject specific variation.
We use moment approximations of the corresponding Itô diffusion to construct estimators for statistical inference of the parameters of the full model.
arXiv Detail & Related papers (2021-10-07T11:08:11Z) - PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present the Enhanced Probabilistic Dense Correspondence Network, PDC-Net+, capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z) - Training on Test Data with Bayesian Adaptation for Covariate Shift [96.3250517412545]
Deep neural networks often make inaccurate predictions with unreliable uncertainty estimates.
We derive a Bayesian model that provides a well-defined relationship between unlabeled inputs under distributional shift and model parameters.
We show that our method improves both accuracy and uncertainty estimation.
arXiv Detail & Related papers (2021-09-27T01:09:08Z) - Uncertainty-aware Generalized Adaptive CycleGAN [44.34422859532988]
Unpaired image-to-image translation refers to learning a mapping between image domains in an unsupervised manner.
Existing methods often learn deterministic mappings without explicitly modelling robustness to outliers or predictive uncertainty.
We propose a novel probabilistic method called Uncertainty-aware Generalized Adaptive Cycle Consistency (UGAC).
arXiv Detail & Related papers (2021-02-23T15:22:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.