Learning the Effect of Registration Hyperparameters with HyperMorph
- URL: http://arxiv.org/abs/2203.16680v1
- Date: Wed, 30 Mar 2022 21:30:06 GMT
- Title: Learning the Effect of Registration Hyperparameters with HyperMorph
- Authors: Andrew Hoopes, Malte Hoffmann, Douglas N. Greve, Bruce Fischl, John
Guttag, Adrian V. Dalca
- Abstract summary: We introduce HyperMorph, a framework that facilitates efficient hyperparameter tuning in learning-based deformable image registration.
We show that it enables fast, high-resolution hyperparameter search at test-time, reducing the inefficiency of traditional approaches.
- Score: 7.313453912494172
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce HyperMorph, a framework that facilitates efficient
hyperparameter tuning in learning-based deformable image registration.
Classical registration algorithms perform an iterative pair-wise optimization
to compute a deformation field that aligns two images. Recent learning-based
approaches leverage large image datasets to learn a function that rapidly
estimates a deformation for a given image pair. In both strategies, the
accuracy of the resulting spatial correspondences is strongly influenced by the
choice of certain hyperparameter values. However, an effective hyperparameter
search consumes substantial time and human effort as it often involves training
multiple models for different fixed hyperparameter values and may lead to
suboptimal registration. We propose an amortized hyperparameter learning
strategy to alleviate this burden by learning the impact of hyperparameters on
deformation fields. We design a meta network, or hypernetwork, that predicts
the parameters of a registration network for input hyperparameters, thereby
comprising a single model that generates the optimal deformation field
corresponding to given hyperparameter values. This strategy enables fast,
high-resolution hyperparameter search at test-time, reducing the inefficiency
of traditional approaches while increasing flexibility. We also demonstrate
additional benefits of HyperMorph, including enhanced robustness to model
initialization and the ability to rapidly identify optimal hyperparameter
values specific to a dataset, image contrast, task, or even anatomical region,
all without the need to retrain models. We make our code publicly available at
http://hypermorph.voxelmorph.net.
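Concretely, the amortization can be pictured as a hypernetwork that maps a sampled hyperparameter value (e.g., a regularization weight lambda) to the weights of the registration network, trained end to end with a lambda-weighted registration loss. Below is a minimal, hypothetical PyTorch sketch of that pattern, not the released implementation (which lives at the URL above); the toy single-conv "registration network" and the (1 - lambda) * similarity + lambda * smoothness loss are illustrative stand-ins.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperRegNet(nn.Module):
    """Hypernetwork predicts the weights of a toy registration conv layer."""
    def __init__(self, hidden=64):
        super().__init__()
        self.conv_shape = (2, 2, 3, 3)                 # (out, in, kh, kw) -> 2-channel flow
        n_params = 2 * 2 * 3 * 3 + 2                   # kernel weights + bias
        self.hyper = nn.Sequential(                    # maps lambda -> conv parameters
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, n_params),
        )

    def forward(self, moving, fixed, lam):
        theta = self.hyper(lam.view(1, 1)).squeeze(0)  # predicted network parameters
        w, b = theta[:-2].view(self.conv_shape), theta[-2:]
        x = torch.cat([moving, fixed], dim=1)          # stack the image pair
        return F.conv2d(x, w, b, padding=1)            # dense displacement field

def warp(moving, flow):
    """Warp an image with a displacement field via bilinear resampling."""
    B, _, H, W = moving.shape
    ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    base = torch.stack([xs, ys]).float().unsqueeze(0)  # identity grid, (1, 2, H, W)
    new = base + flow                                  # displaced coordinates
    gx = 2 * new[:, 0] / (W - 1) - 1                   # normalize to [-1, 1]
    gy = 2 * new[:, 1] / (H - 1) - 1
    return F.grid_sample(moving, torch.stack([gx, gy], dim=-1), align_corners=True)

def smoothness(flow):
    """Mean squared spatial gradient of the flow (regularization term)."""
    dx = flow[:, :, :, 1:] - flow[:, :, :, :-1]
    dy = flow[:, :, 1:, :] - flow[:, :, :-1, :]
    return (dx ** 2).mean() + (dy ** 2).mean()

net = HyperRegNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(200):                                # training: sample lambda each step
    moving, fixed = torch.rand(4, 1, 32, 32), torch.rand(4, 1, 32, 32)
    lam = torch.rand(1)
    flow = net(moving, fixed, lam)
    sim = F.mse_loss(warp(moving, flow), fixed)
    loss = (1 - lam) * sim + lam * smoothness(flow)
    opt.zero_grad(); loss.backward(); opt.step()

# Test-time hyperparameter search reduces to forward passes:
with torch.no_grad():
    for lam in (0.1, 0.5, 0.9):                        # one forward pass per value
        flow = net(moving, fixed, torch.tensor([lam]))
```

Because the hypernetwork is conditioned on lambda, a single trained model covers the whole hyperparameter range, which is what makes the high-resolution test-time search described in the abstract cheap.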
Related papers
- Efficient Hyperparameter Importance Assessment for CNNs [1.7778609937758323]
This paper aims to quantify the importance weights of some hyperparameters in Convolutional Neural Networks (CNNs) with an algorithm called N-RReliefF.
We conduct an extensive study by training over ten thousand CNN models across ten popular image classification datasets.
arXiv Detail & Related papers (2024-10-11T15:47:46Z)
- Optimization Hyper-parameter Laws for Large Language Models [56.322914260197734]
We present Opt-Laws, a framework that captures the relationship between hyper-parameters and training outcomes.
Our validation across diverse model sizes and data scales demonstrates Opt-Laws' ability to accurately predict training loss.
This approach significantly reduces computational costs while enhancing overall model performance.
arXiv Detail & Related papers (2024-09-07T09:37:19Z)
- ETHER: Efficient Finetuning of Large-Scale Models with Hyperplane Reflections [59.839926875976225]
We propose the ETHER transformation family, which performs Efficient fineTuning via HypErplane Reflections.
In particular, we introduce ETHER and its relaxation ETHER+, which match or outperform existing PEFT methods with significantly fewer parameters.
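The title points at Householder-style transformations. Below is a hedged sketch of the core idea as the title and summary describe it, not the authors' parameterization: freeze a pretrained weight matrix W and train only a vector u whose normalized form defines the hyperplane reflection H = I - 2 u u^T applied to W, giving d trainable parameters per layer instead of d^2. Since H is orthogonal, the transformed weights keep the pretrained weights' spectrum, which is one plausible reading of why the method is non-destructive; note also that a reflection can never be the identity, which may be what the ETHER+ relaxation addresses.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EtherLinear(nn.Module):
    """Frozen pretrained linear layer finetuned through a hyperplane reflection.
    Hypothetical sketch in the spirit of ETHER, not the authors' code."""
    def __init__(self, pretrained: nn.Linear):
        super().__init__()
        self.register_buffer("weight", pretrained.weight.detach())  # frozen (out, in)
        bias = pretrained.bias.detach() if pretrained.bias is not None else None
        self.register_buffer("bias", bias)
        d = self.weight.shape[0]
        self.u = nn.Parameter(torch.randn(d))          # the only trainable parameters

    def forward(self, x):
        u = self.u / self.u.norm()                     # unit normal of the hyperplane
        # Apply H = I - 2 u u^T to W without materializing H; H is orthogonal,
        # so H @ W has the same singular values as the pretrained W.
        w = self.weight - 2.0 * torch.outer(u, u @ self.weight)
        return F.linear(x, w, self.bias)

layer = EtherLinear(nn.Linear(16, 16))                 # wrap a "pretrained" layer
out = layer(torch.randn(4, 16))                        # only layer.u receives gradients
```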
arXiv Detail & Related papers (2024-05-30T17:26:02Z)
- HyperPredict: Estimating Hyperparameter Effects for Instance-Specific Regularization in Deformable Image Registration [2.2252684361733293]
Methods for medical image registration infer geometric transformations that align pairs/groups of images by maximising an image similarity metric.
Regularization terms are essential to obtain meaningful registration results.
We propose a method for evaluating the influence of hyperparameters and subsequently selecting an optimal value for given image pairs.
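In this setting, the hyperparameter of interest is typically the weight lambda that trades off similarity against regularization; the standard objective, written here for illustration, is

\hat{\phi} = \arg\min_{\phi} \; \mathcal{L}_{\mathrm{sim}}\!\left(I_f,\; I_m \circ \phi\right) + \lambda\, \mathcal{L}_{\mathrm{reg}}(\phi)

where I_f is the fixed image, I_m the moving image, \phi the spatial transformation, and \mathcal{L}_{\mathrm{sim}} a dissimilarity term (the negative of the similarity metric being maximized). Larger lambda favors smoother but less tightly aligned deformations, which is why its optimal value varies per image pair.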
arXiv Detail & Related papers (2024-03-04T14:17:30Z)
- AUTOMATA: Gradient Based Data Subset Selection for Compute-Efficient Hyper-parameter Tuning [72.54359545547904]
We propose a gradient-based subset selection framework for hyper-parameter tuning.
We show that using gradient-based data subsets for hyper-parameter tuning achieves significantly faster turnaround times, with speedups of 3x-30x.
arXiv Detail & Related papers (2022-03-15T19:25:01Z)
- HyperNP: Interactive Visual Exploration of Multidimensional Projection Hyperparameters [61.354362652006834]
HyperNP is a scalable method that allows for real-time interactive exploration of projection methods by training neural network approximations.
We evaluate HyperNP across three datasets in terms of accuracy and speed.
arXiv Detail & Related papers (2021-06-25T17:28:14Z)
- Conditional Deformable Image Registration with Convolutional Neural Network [15.83842747998493]
We propose a conditional image registration method and a new self-supervised learning paradigm for deep deformable image registration.
Our proposed method enables the precise control of the smoothness of the deformation field without sacrificing the runtime advantage or registration accuracy.
arXiv Detail & Related papers (2021-06-23T22:25:28Z)
- Optimizing Large-Scale Hyperparameters via Automated Learning Algorithm [97.66038345864095]
We propose a new hyperparameter optimization method with zeroth-order hyper-gradients (HOZOG).
Specifically, we first formulate hyperparameter optimization as an A-based constrained optimization problem.
Then, we use the average zeroth-order hyper-gradients to update hyperparameters.
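A generic form of such an estimator (an illustrative sketch, not the authors' exact HOZOG procedure) averages finite-difference probes along random directions; here f stands for the black-box mapping from hyperparameters to final validation loss, so no derivatives of f are needed:

```python
import torch

def zeroth_order_grad(f, lam, mu=1e-2, n=8):
    """Average zeroth-order gradient estimate of f at lam.

    f evaluates a black-box objective (e.g., validation loss after training
    with hyperparameters lam); mu is the probe radius, n the number of probes.
    """
    g = torch.zeros_like(lam)
    f0 = f(lam)
    for _ in range(n):
        u = torch.randn_like(lam)                  # random probe direction
        g += (f(lam + mu * u) - f0) / mu * u       # finite-difference slope
    return g / n

# Hypothetical usage: gradient descent on a toy objective standing in for
# "validation loss as a function of two hyperparameters".
lam = torch.tensor([0.5, 0.1])
for _ in range(50):
    lam = lam - 0.1 * zeroth_order_grad(lambda l: ((l - 0.3) ** 2).sum(), lam)
```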
arXiv Detail & Related papers (2021-02-17T21:03:05Z)
- HyperMorph: Amortized Hyperparameter Learning for Image Registration [8.13669868327082]
HyperMorph is a learning-based strategy for deformable image registration.
We show that it can be used to optimize multiple hyperparameters considerably faster than existing search strategies.
arXiv Detail & Related papers (2021-01-04T15:39:16Z)
- Automatic Hyper-Parameter Optimization Based on Mapping Discovery from Data to Hyper-Parameters [3.37314595161109]
We propose an efficient automatic hyper-parameter optimization approach, based on learning the mapping from data to the corresponding hyper-parameters.
We show that the proposed approach significantly outperforms the state-of-the-art approaches.
arXiv Detail & Related papers (2020-03-03T19:26:23Z)
- Rethinking the Hyperparameters for Fine-tuning [78.15505286781293]
Fine-tuning from pre-trained ImageNet models has become the de-facto standard for various computer vision tasks.
Current practices for fine-tuning typically involve an ad-hoc choice of hyperparameters.
This paper re-examines several common practices of setting hyperparameters for fine-tuning.
arXiv Detail & Related papers (2020-02-19T18:59:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.