Deep Implicit Optimization for Robust and Flexible Image Registration
- URL: http://arxiv.org/abs/2406.07361v2
- Date: Fri, 18 Oct 2024 14:38:03 GMT
- Title: Deep Implicit Optimization for Robust and Flexible Image Registration
- Authors: Rohit Jena, Pratik Chaudhari, James C. Gee
- Abstract summary: We bridge the gap between classical and learning methods by incorporating optimization as a layer in a deep network.
By implicitly differentiating end-to-end through an iterative optimization solver, our learned features are registration and label-aware.
Our framework shows excellent performance on in-domain datasets, and is agnostic to domain shift.
- Score: 20.34181966545357
- Abstract: Deep Learning in Image Registration (DLIR) methods have been tremendously successful in image registration due to their speed and ability to incorporate weak label supervision at training time. However, DLIR methods forgo many of the benefits of classical optimization-based methods: the functional nature of deep networks does not guarantee that the predicted transformation is a local minimum of the registration objective, the representation of the transformation (displacement/velocity field/affine) is fixed, and the networks are not robust to domain shift. Our method bridges this gap between classical and learning methods by incorporating optimization as a layer in a deep network. A deep network is trained to predict multi-scale dense feature images that are registered using a black-box iterative optimization solver. This optimal warp is then used to minimize image and label alignment errors. By implicitly differentiating end-to-end through the iterative optimization solver, our learned features are registration- and label-aware, and the warp functions are guaranteed to be local minima of the registration objective in the feature space. Our framework shows excellent performance on in-domain datasets and is agnostic to domain shifts such as anisotropy and varying intensity profiles. For the first time, our method allows switching between arbitrary transformation representations (free-form to diffeomorphic) at test time with zero retraining. End-to-end feature learning also facilitates interpretability of features and out-of-the-box promptability using additional label-fidelity terms at inference.
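The abstract's key mechanism, differentiating end-to-end through a black-box inner solver via the implicit function theorem, can be illustrated with a toy sketch. The energy, inner solver, and dimensions below are illustrative stand-ins, not the authors' implementation:

```python
# Toy sketch of implicit differentiation through an inner registration
# solver (illustrative stand-ins only, not the authors' code).
# Inner problem: theta* = argmin_theta E(theta, feats); outer gradients
# reach `feats` through the implicit function theorem (IFT):
#   d theta* / d feats = -[d2E/dtheta2]^{-1} [d2E/(dtheta dfeats)]
import torch

D = 3  # toy warp-parameter dimension

def energy(theta, feats):
    # Stand-in for a feature-space registration objective.
    return ((theta - feats.mean(dim=0)) ** 2).sum() + 0.1 * (theta ** 4).sum()

def solve(feats, steps=200, lr=0.1):
    # Black-box inner optimizer; no gradients are tracked through the loop.
    theta = torch.zeros(D)
    for _ in range(steps):
        theta = theta.detach().requires_grad_(True)
        g, = torch.autograd.grad(energy(theta, feats.detach()), theta)
        theta = (theta - lr * g).detach()
    return theta

feats = torch.randn(8, D, requires_grad=True)   # pretend network features
theta_star = solve(feats)

# Outer loss on the optimal warp (stand-in for label alignment):
# L = ||theta* - 1||^2, so dL/dtheta* = 2 (theta* - 1).
outer_grad = 2 * (theta_star - torch.ones(D))

# Rebuild dE/dtheta at the optimum with a graph that reaches `feats`.
theta = theta_star.clone().requires_grad_(True)
g, = torch.autograd.grad(energy(theta, feats), theta, create_graph=True)

# Hessian d2E/dtheta2, assembled row by row (fine at toy scale; real
# systems would use a matrix-free conjugate-gradient solve instead).
H = torch.stack([torch.autograd.grad(g[i], theta, retain_graph=True)[0]
                 for i in range(D)])
v = torch.linalg.solve(H, outer_grad)           # H^{-1} dL/dtheta*
# dL/dfeats = -v^T d2E/(dtheta dfeats), as a vector-Jacobian product.
dL_dfeats, = torch.autograd.grad(g, feats, grad_outputs=-v)
```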
Related papers
- Improving Instance Optimization in Deformable Image Registration with Gradient Projection [7.6061804149819885]
Deformable image registration is inherently a multi-objective optimization problem, balancing image similarity against deformation regularity.
These conflicting objectives often lead to poor optimization outcomes.
Deep learning methods have recently gained popularity in this domain due to their efficiency in processing large datasets (a generic projection sketch follows this entry).
arXiv Detail & Related papers (2024-10-21T08:27:13Z)
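The projection idea can be sketched generically: when the two gradients conflict (negative dot product), one is projected onto the normal plane of the other. This is a PCGrad-style rule, not necessarily the cited paper's exact update; `similarity` and `regularity` in the usage comment are hypothetical callables:

```python
# Hedged sketch of gradient projection for two conflicting registration
# objectives (similarity vs. regularity).
import torch

def project_conflicting(g_sim: torch.Tensor, g_reg: torch.Tensor) -> torch.Tensor:
    # If the two gradients conflict (negative dot product), strip from
    # g_sim its component along g_reg before summing.
    dot = torch.dot(g_sim.flatten(), g_reg.flatten())
    if dot < 0:
        g_sim = g_sim - dot / (g_reg.norm() ** 2 + 1e-12) * g_reg
    return g_sim + g_reg

# Illustrative use inside an instance-optimization loop:
# g_sim, = torch.autograd.grad(similarity(warp), warp, retain_graph=True)
# g_reg, = torch.autograd.grad(regularity(warp), warp)
# warp = (warp - lr * project_conflicting(g_sim, g_reg)).detach().requires_grad_(True)
```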
- LeRF: Learning Resampling Function for Adaptive and Efficient Image Interpolation [64.34935748707673]
Recent deep neural networks (DNNs) have made impressive progress in performance by introducing learned data priors.
We propose a novel method of Learning Resampling (termed LeRF) which takes advantage of both the structural priors learned by DNNs and the locally continuous assumption.
LeRF assigns spatially varying resampling functions to input image pixels and learns to predict the shapes of these resampling functions with a neural network.
arXiv Detail & Related papers (2024-07-13T16:09:45Z)
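To make the LeRF entry concrete, here is a hedged sketch of spatially varying resampling with a hypothetical per-sample anisotropic-Gaussian parameterization; the paper's actual resampling-function family and prediction network may differ:

```python
# Sketch of spatially varying resampling: each output sample gets its
# own kernel shape (here, per-axis Gaussian widths predicted upstream).
import torch

def gaussian_resample(img, coords, log_sigma, radius=2):
    # img: (H, W); coords: (N, 2) continuous sample locations (y, x);
    # log_sigma: (N, 2) per-sample anisotropic Gaussian widths.
    H, W = img.shape
    sigma = log_sigma.exp()
    out = torch.zeros(coords.shape[0])
    weight_sum = torch.zeros(coords.shape[0])
    base = coords.floor().long()
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny = (base[:, 0] + dy).clamp(0, H - 1)
            nx = (base[:, 1] + dx).clamp(0, W - 1)
            d = torch.stack([ny.float() - coords[:, 0],
                             nx.float() - coords[:, 1]], dim=1)
            w = torch.exp(-0.5 * ((d / sigma) ** 2).sum(dim=1))
            out = out + w * img[ny, nx]
            weight_sum = weight_sum + w
    return out / (weight_sum + 1e-12)  # normalized, shape-weighted sum
```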
- Parameter Hierarchical Optimization for Visible-Infrared Person Re-Identification [0.6675805308519986]
Visible-infrared person re-identification (VI-ReID) aims at matching cross-modality pedestrian images captured by disjoint visible or infrared cameras.
We propose a novel parameter optimizing paradigm, parameter hierarchical optimization (PHO) method, for the task of VI-ReID.
It allows some parameters to be optimized directly without any training, which narrows the parameter search space and makes the whole network easier to train.
arXiv Detail & Related papers (2024-04-11T17:27:39Z)
- End-to-End Diffusion Latent Optimization Improves Classifier Guidance [81.27364542975235]
Direct Optimization of Diffusion Latents (DOODL) is a novel guidance method.
It enables plug-and-play guidance by optimizing diffusion latents.
It outperforms one-step classifier guidance on computational and human evaluation metrics.
arXiv Detail & Related papers (2023-03-23T22:43:52Z)
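A minimal picture of direct latent optimization for guidance is gradient descent on the initial latent through a differentiable sampling chain. DOODL itself backpropagates through an invertible sampler to keep memory constant, which this toy loop omits; `sample_fn`, `guidance_loss`, and the norm control are assumptions:

```python
# Toy sketch of optimizing a diffusion latent against a guidance loss.
import torch

def optimize_latents(sample_fn, guidance_loss, z0, steps=20, lr=0.05):
    # sample_fn: differentiable map from latent z to image x (assumed).
    # guidance_loss: scalar loss on x, e.g. a classifier's negative logit.
    z = z0.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        x = sample_fn(z)             # whole sampling chain stays on the graph
        guidance_loss(x).backward()
        opt.step()
        with torch.no_grad():        # crude norm control to stay near the
            z.mul_(z0.norm() / (z.norm() + 1e-12))  # Gaussian shell (assumption)
    return z.detach()
```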
- Object Representations as Fixed Points: Training Iterative Refinement Algorithms with Implicit Differentiation [88.14365009076907]
Iterative refinement is a useful paradigm for representation learning.
We develop an implicit differentiation approach that improves the stability and tractability of training.
arXiv Detail & Related papers (2022-07-02T10:00:35Z)
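The implicit-differentiation idea for iterative refinement is commonly realized as a first-order approximation: iterate to a (near) fixed point without tracking gradients, then differentiate only a single final step. A generic sketch, not lifted from the cited paper:

```python
# First-order implicit differentiation for iterative refinement.
import torch

def fixed_point_refine(step_fn, z_init, inputs, iters=30):
    # Roll the refinement forward without building a graph.
    z = z_init
    with torch.no_grad():
        for _ in range(iters):
            z = step_fn(z, inputs)
    # One differentiable application at the approximate fixed point:
    # gradients reach step_fn's parameters and `inputs` without
    # unrolling the whole iteration, improving stability and memory use.
    return step_fn(z.detach(), inputs)
```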
- Non-iterative Coarse-to-fine Registration based on Single-pass Deep Cumulative Learning [11.795108660250843]
We propose a Non-Iterative Coarse-to-finE registration network (NICE-Net) for deformable image registration.
NICE-Net can outperform state-of-the-art iterative deep registration methods while only requiring similar runtime to non-iterative methods.
arXiv Detail & Related papers (2022-06-25T08:34:59Z)
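Single-pass coarse-to-fine registration reduces to composing per-level flows in one forward sweep. Below is a generic composition sketch assuming pixel-unit flows and a factor-2 pyramid; the NICE-Net decoder itself differs in detail:

```python
# Generic coarse-to-fine flow composition in a single forward pass.
import torch
import torch.nn.functional as F

def warp(img, flow):
    # Warp `img` (B, C, H, W) by `flow` (B, 2, H, W), flow in pixel units
    # with channel 0 = x displacement, channel 1 = y displacement.
    B, _, H, W = img.shape
    ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    grid = torch.stack([xs, ys], dim=0).float().to(img)        # (2, H, W)
    coords = grid.unsqueeze(0) + flow
    coords_x = 2 * coords[:, 0] / (W - 1) - 1                  # normalize to [-1, 1]
    coords_y = 2 * coords[:, 1] / (H - 1) - 1
    sample_grid = torch.stack([coords_x, coords_y], dim=-1)    # (B, H, W, 2)
    return F.grid_sample(img, sample_grid, align_corners=True)

def compose(coarse_flow, residual_flow):
    # Total displacement u(x) = u_res(x) + u_coarse(x + u_res(x)):
    # upsample the coarser flow (rescaling its pixel units), warp it by
    # the finer residual, and add the residual.
    up = 2 * F.interpolate(coarse_flow, scale_factor=2,
                           mode="bilinear", align_corners=True)
    return warp(up, residual_flow) + residual_flow
```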
- Large-scale Optimization of Partial AUC in a Range of False Positive Rates [51.12047280149546]
The area under the ROC curve (AUC) is one of the most widely used performance measures for classification models in machine learning.
We develop an efficient approximated gradient descent method based on a recent practical envelope-smoothing technique.
Our proposed algorithm can also be used to minimize the sum of ranked-range losses, which likewise lacks efficient solvers.
arXiv Detail & Related papers (2022-03-03T03:46:18Z)
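For intuition, a plain mini-batch estimator of partial AUC restricted to a false-positive-rate range can be written as below; the cited paper replaces this non-smooth top-ranked selection with an envelope-smoothed objective that admits convergence guarantees:

```python
# Hedged sketch of an empirical partial-AUC surrogate over FPR in
# [alpha, beta] (a plain estimator, not the paper's smoothed method).
import torch

def partial_auc_loss(pos_scores, neg_scores, alpha=0.0, beta=0.3):
    n = neg_scores.numel()
    lo, hi = int(alpha * n), max(int(beta * n), 1)
    # FPR in [alpha, beta] corresponds to thresholds that admit the
    # negatives ranked lo..hi among the highest-scored ones.
    top_negs = neg_scores.sort(descending=True).values[lo:hi]
    # Pairwise squared-hinge surrogate for the indicator s_neg >= s_pos.
    margins = 1.0 - (pos_scores.view(-1, 1) - top_negs.view(1, -1))
    return margins.clamp_min(0).pow(2).mean()
```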
- Implicit Optimizer for Diffeomorphic Image Registration [3.1970342304563037]
We propose a rapid and accurate Implicit Optimizer for Diffeomorphic Image Registration (IDIR), which utilizes a Deep Implicit Function as the neural velocity field.
We evaluate our proposed method on two large-scale 3D MR brain scan datasets; the results show that our proposed method provides faster and better registration results than conventional image registration approaches.
arXiv Detail & Related papers (2022-02-25T05:04:29Z)
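A generic construction in IDIR's spirit pairs a coordinate MLP (the "deep implicit function") acting as a stationary velocity field with numerical integration to obtain a diffeomorphic map; the architecture and integrator below are assumptions, not the paper's exact design:

```python
# Coordinate-MLP stationary velocity field + Euler integration.
import torch
import torch.nn as nn

class VelocityField(nn.Module):
    def __init__(self, dim=3, width=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.SiLU(),
            nn.Linear(width, width), nn.SiLU(),
            nn.Linear(width, dim),
        )

    def forward(self, x):           # x: (N, dim) coordinates in [-1, 1]^dim
        return self.net(x)          # velocity at each coordinate

def integrate(vfield, x, steps=8):
    # Euler integration of dx/dt = v(x); enough small steps keep the
    # resulting map invertible in practice (stationary-velocity setup).
    h = 1.0 / steps
    for _ in range(steps):
        x = x + h * vfield(x)
    return x                        # warped coordinates phi(x)
```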
- Domain Adversarial Training: A Game Perspective [80.3821370633883]
This paper defines optimal solutions in domain-adversarial training from a game-theoretic perspective.
We show that gradient descent in domain-adversarial training can violate the optimizer's convergence guarantees, oftentimes hindering transfer performance.
Our proposed alternatives are easy to implement, free of additional parameters, and can be plugged into any domain-adversarial framework.
arXiv Detail & Related papers (2022-02-10T22:17:30Z)
- Neural Non-Rigid Tracking [26.41847163649205]
We introduce a novel, end-to-end learnable, differentiable non-rigid tracker.
We employ a convolutional neural network to predict dense correspondences and their confidences.
Compared to state-of-the-art approaches, our algorithm shows improved reconstruction performance.
arXiv Detail & Related papers (2020-06-23T18:00:39Z)
- Learning Deformable Image Registration from Optimization: Perspective, Modules, Bilevel Training and Beyond [62.730497582218284]
We develop a new deep learning based framework to optimize a diffeomorphic model via multi-scale propagation.
We conduct two groups of image registration experiments on 3D volume datasets including image-to-atlas registration on brain MRI data and image-to-image registration on liver CT data.
arXiv Detail & Related papers (2020-04-30T03:23:45Z)