HyperTendril: Visual Analytics for User-Driven Hyperparameter
Optimization of Deep Neural Networks
- URL: http://arxiv.org/abs/2009.02078v2
- Date: Fri, 18 Sep 2020 08:29:56 GMT
- Title: HyperTendril: Visual Analytics for User-Driven Hyperparameter
Optimization of Deep Neural Networks
- Authors: Heungseok Park, Yoonsoo Nam, Ji-Hoon Kim, Jaegul Choo
- Abstract summary: HyperTendril is a web-based visual analytics system that supports user-driven hyperparameter tuning processes.
We show how HyperTendril helps users steer their tuning processes via a longitudinal user study based on the analysis of interaction logs and in-depth interviews while we deploy our system in a professional industrial environment.
- Score: 36.047441272704205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To mitigate the pain of manually tuning hyperparameters of deep neural
networks, automated machine learning (AutoML) methods have been developed to
search for an optimal set of hyperparameters in large combinatorial search
spaces. However, the search results of AutoML methods significantly depend on
initial configurations, making it a non-trivial task to find a proper
configuration. Therefore, human intervention via a visual analytic approach
bears huge potential in this task. In response, we propose HyperTendril, a
web-based visual analytics system that supports user-driven hyperparameter
tuning processes in a model-agnostic environment. HyperTendril takes a novel
approach to effectively steering hyperparameter optimization through an
iterative, interactive tuning procedure that allows users to refine the search
spaces and the configuration of the AutoML method based on their own insights
from given results. Using HyperTendril, users can obtain insights into the
complex behaviors of various hyperparameter search algorithms and diagnose
their configurations. In addition, HyperTendril supports variable importance
analysis to help the users refine their search spaces based on the analysis of
relative importance of different hyperparameters and their interaction effects.
We present the evaluation demonstrating how HyperTendril helps users steer
their tuning processes via a longitudinal user study based on the analysis of
interaction logs and in-depth interviews while we deploy our system in a
professional industrial environment.
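The iterative, user-driven loop the abstract describes (run a search, inspect results, narrow the search space, repeat) can be sketched with plain random search. This is a minimal illustration of the workflow, not HyperTendril itself; the objective function, hyperparameter names, and refinement rule are hypothetical stand-ins.

```python
import math
import random

def objective(lr, batch_size):
    # Hypothetical stand-in for validation accuracy; peaks near lr = 1e-2.
    return 1.0 / (1.0 + (math.log10(lr) + 2.0) ** 2) - 0.001 * abs(batch_size - 64)

def random_search(space, n_trials, seed=0):
    """Sample configurations uniformly from the given search space."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(*space["log10_lr"])
        bs = rng.choice(space["batch_size"])
        trials.append({"lr": lr, "batch_size": bs, "score": objective(lr, bs)})
    return trials

# Round 1: broad initial search space.
space = {"log10_lr": (-5.0, 0.0), "batch_size": [16, 32, 64, 128, 256]}
round1 = random_search(space, n_trials=50)
best1 = max(round1, key=lambda t: t["score"])

# User-driven refinement: narrow the space around the best configuration,
# mimicking the insight-driven loop the paper describes.
center = math.log10(best1["lr"])
space = {"log10_lr": (center - 0.5, center + 0.5),
         "batch_size": [best1["batch_size"]]}
round2 = random_search(space, n_trials=50, seed=1)

best = max(round1 + round2, key=lambda t: t["score"])
```

In HyperTendril the refinement step is guided by visual analytics and variable-importance views rather than by a single best trial, but the structure of the loop is the same.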
Related papers
- Hyperparameter Optimization in Machine Learning [34.356747514732966]
Hyperparameters are configuration variables controlling the behavior of machine learning algorithms.
The choice of their values determines the effectiveness of systems based on these technologies.
We present a unified treatment of hyperparameter optimization, providing the reader with examples and insights into the state-of-the-art.
arXiv Detail & Related papers (2024-10-30T09:39:22Z) - Efficient Hyperparameter Importance Assessment for CNNs [1.7778609937758323]
This paper aims to quantify the importance weights of some hyperparameters in Convolutional Neural Networks (CNNs) with an algorithm called N-RReliefF.
We conduct an extensive study by training over ten thousand CNN models across ten popular image classification datasets.
arXiv Detail & Related papers (2024-10-11T15:47:46Z) - AutoRL Hyperparameter Landscapes [69.15927869840918]
Reinforcement Learning (RL) has been shown to produce impressive results, but its use is limited by the sensitivity of its performance to hyperparameters.
We propose an approach to build and analyze these hyperparameter landscapes not just for one point in time but at multiple points in time throughout training.
This supports the theory that hyperparameters should be dynamically adjusted during training and shows the potential for more insights on AutoRL problems that can be gained through landscape analyses.
arXiv Detail & Related papers (2023-04-05T12:14:41Z) - Hyper-Parameter Auto-Tuning for Sparse Bayesian Learning [72.83293818245978]
We design and learn a neural network (NN)-based auto-tuner for hyper-parameter tuning in sparse Bayesian learning.
We show that considerable improvement in convergence rate and recovery performance can be achieved.
arXiv Detail & Related papers (2022-11-09T12:34:59Z) - Goal-Oriented Sensitivity Analysis of Hyperparameters in Deep Learning [0.0]
We study the use of goal-oriented sensitivity analysis, based on the Hilbert-Schmidt Independence Criterion (HSIC), for hyperparameter analysis and optimization.
We derive an HSIC-based optimization algorithm that we apply to MNIST and CIFAR, classical machine learning datasets of interest for scientific machine learning.
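The Hilbert-Schmidt Independence Criterion mentioned above measures statistical dependence between a hyperparameter and an objective via kernel matrices. A minimal sketch of the biased empirical estimator, (1/n²)·trace(HKH·L) with Gaussian kernels, is shown below; the data and bandwidth choice are illustrative assumptions, not taken from the paper.

```python
import math
import random

def gaussian_kernel_matrix(xs, sigma=1.0):
    """Pairwise Gaussian kernel matrix for a list of scalars."""
    n = len(xs)
    return [[math.exp(-((xs[i] - xs[j]) ** 2) / (2 * sigma ** 2))
             for j in range(n)] for i in range(n)]

def center(K):
    """Double-center a kernel matrix: Kc = H K H with H = I - (1/n) 11^T."""
    n = len(K)
    row = [sum(K[i]) / n for i in range(n)]
    col = [sum(K[i][j] for i in range(n)) / n for j in range(n)]
    tot = sum(row) / n
    return [[K[i][j] - row[i] - col[j] + tot for j in range(n)] for i in range(n)]

def hsic(xs, ys, sigma=1.0):
    """Biased empirical HSIC estimate: (1/n^2) * trace(Kc @ L)."""
    n = len(xs)
    Kc = center(gaussian_kernel_matrix(xs, sigma))
    L = gaussian_kernel_matrix(ys, sigma)
    return sum(Kc[i][j] * L[j][i] for i in range(n) for j in range(n)) / n ** 2

# A hyperparameter that strongly drives the objective scores higher
# HSIC than one with no relationship to it.
xs = [i / 10 for i in range(20)]
ys_dependent = [x ** 2 for x in xs]           # deterministic dependence
rng = random.Random(0)
ys_independent = [rng.random() for _ in xs]   # no dependence
```

Ranking hyperparameters by such a dependence score is the basic idea behind HSIC-driven sensitivity analysis; the cited paper builds a goal-oriented variant on top of it.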
arXiv Detail & Related papers (2022-07-13T14:21:12Z) - AUTOMATA: Gradient Based Data Subset Selection for Compute-Efficient
Hyper-parameter Tuning [72.54359545547904]
We propose a gradient-based subset selection framework for hyper-parameter tuning.
We show that using gradient-based data subsets for hyper-parameter tuning achieves significantly faster turnaround times and speedups of 3×-30×.
arXiv Detail & Related papers (2022-03-15T19:25:01Z) - HyperNP: Interactive Visual Exploration of Multidimensional Projection
Hyperparameters [61.354362652006834]
HyperNP is a scalable method that allows for real-time interactive exploration of projection methods by training neural network approximations.
We evaluate HyperNP across three datasets in terms of performance and speed.
arXiv Detail & Related papers (2021-06-25T17:28:14Z) - Guided Hyperparameter Tuning Through Visualization and Inference [12.035299005299306]
We present a streamlined visualization system enabling deep learning practitioners to more efficiently explore, tune, and optimize hyperparameters.
A key idea is to directly suggest better hyperparameters using a predictive mechanism.
We evaluate the tool with a user study on deep learning model builders, finding that our participants have little issue adopting the tool and working with it as part of their workflow.
arXiv Detail & Related papers (2021-05-24T19:55:24Z) - HyperSTAR: Task-Aware Hyperparameters for Deep Networks [52.50861379908611]
HyperSTAR is a task-aware method to warm-start HPO for deep neural networks.
It learns a dataset (task) representation along with the performance predictor directly from raw images.
It evaluates 50% fewer configurations than existing methods to achieve the best performance.
arXiv Detail & Related papers (2020-05-21T08:56:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.