PyTorch Hyperparameter Tuning - A Tutorial for spotPython
- URL: http://arxiv.org/abs/2305.11930v2
- Date: Wed, 7 Jun 2023 14:25:19 GMT
- Title: PyTorch Hyperparameter Tuning - A Tutorial for spotPython
- Authors: Thomas Bartz-Beielstein
- Abstract summary: This tutorial includes a brief comparison with Ray Tune, a Python library for running experiments and tuning hyperparameters.
We show that spotPython achieves similar or even better results while being more flexible and transparent than Ray Tune.
- Score: 0.20305676256390928
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The goal of hyperparameter tuning (or hyperparameter optimization) is to
optimize the hyperparameters to improve the performance of the machine or deep
learning model. spotPython ("Sequential Parameter Optimization Toolbox in
Python") is the Python version of the well-known hyperparameter tuner SPOT,
which has been developed in the R programming environment for statistical
analysis for over a decade. PyTorch is an optimized tensor library for deep
learning using GPUs and CPUs. This document shows how to integrate the
spotPython hyperparameter tuner into the PyTorch training workflow. As an
example, the results of tuning a CIFAR10 image classifier are used. In addition to
an introduction to spotPython, this tutorial also includes a brief comparison
with Ray Tune, a Python library for running experiments and tuning
hyperparameters. This comparison is based on the PyTorch hyperparameter tuning
tutorial. The advantages and disadvantages of both approaches are discussed. We
show that spotPython achieves similar or even better results while being more
flexible and transparent than Ray Tune.
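To make the integration concrete: the workflow described here boils down to wrapping the PyTorch training loop in an objective function that maps a hyperparameter vector to a validation loss, and handing that function to the tuner. The following minimal sketch illustrates this pattern; the spot.Spot(fun=..., lower=..., upper=...) entry point and the fun(X, fun_control) signature are assumptions based on the spotPython interface described in this tutorial and may differ between versions, and the tiny linear model is a stand-in for the tutorial's CIFAR10 network.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, random_split
import torchvision
import torchvision.transforms as transforms

# Assumed spotPython entry point; module path and Spot signature follow the
# tutorial but may differ between spotPython versions.
from spotPython.spot import spot


def train_cifar10(lr: float, batch_size: int, epochs: int = 1) -> float:
    """Train a small CIFAR10 classifier and return its validation loss."""
    transform = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
    ])
    data = torchvision.datasets.CIFAR10(
        root="./data", train=True, download=True, transform=transform)
    n_train = int(0.8 * len(data))
    train_set, val_set = random_split(data, [n_train, len(data) - n_train])
    train_loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
    val_loader = DataLoader(val_set, batch_size=batch_size)

    # A deliberately tiny stand-in for the tutorial's CIFAR10 network.
    model = nn.Sequential(nn.Flatten(),
                          nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
                          nn.Linear(256, 10))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)

    model.train()
    for _ in range(epochs):
        for x, y in train_loader:
            optimizer.zero_grad()
            F.cross_entropy(model(x), y).backward()
            optimizer.step()

    model.eval()
    total, n = 0.0, 0
    with torch.no_grad():
        for x, y in val_loader:
            total += F.cross_entropy(model(x), y, reduction="sum").item()
            n += y.size(0)
    return total / n


def fun(X: np.ndarray, fun_control=None) -> np.ndarray:
    """Objective in the (assumed) spotPython convention: each row of X is one
    hyperparameter configuration; the result is one loss value per row."""
    return np.array([
        train_cifar10(lr=float(lr), batch_size=2 ** int(round(log2_bs)))
        for lr, log2_bs in X
    ])


# Hand the objective and box constraints for (lr, log2 batch size) to the
# tuner and start the run.
tuner = spot.Spot(fun=fun,
                  lower=np.array([1e-4, 4.0]),
                  upper=np.array([1e-1, 8.0]))
tuner.run()

The important design point is that the tuner only ever sees the fun(X) mapping from configurations to losses, so an existing PyTorch training loop can be tuned without restructuring it.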
Related papers
- depyf: Open the Opaque Box of PyTorch Compiler for Machine Learning Researchers [92.13613958373628]
depyf is a tool designed to demystify the inner workings of the PyTorch compiler.
depyf decompiles bytecode generated by PyTorch back into equivalent source code.
arXiv Detail & Related papers (2024-03-14T16:17:14Z)
- Hyperparameter Tuning Cookbook: A guide for scikit-learn, PyTorch, river, and spotPython [0.20305676256390928]
This document provides a guide to hyperparameter tuning using spotPython for scikit-learn, PyTorch, and river.
With a hands-on approach and step-by-step explanations, this cookbook serves as a practical starting point.
arXiv Detail & Related papers (2023-07-17T16:20:27Z)
- Python Tool for Visualizing Variability of Pareto Fronts over Multiple Runs [1.370633147306388]
We develop a Python package for empirical attainment surface.
The package is available at https://github.com/nabenabe0928/empirical-attainment-func.
arXiv Detail & Related papers (2023-05-15T17:59:34Z)
- PyHopper -- Hyperparameter optimization [51.40201315676902]
We present PyHopper, a black-box optimization platform for machine learning researchers.
PyHopper's goal is to integrate with existing code with minimal effort and run the optimization process with minimal necessary manual oversight.
With simplicity as the primary theme, PyHopper is powered by a single robust Markov-chain Monte-Carlo optimization algorithm.
arXiv Detail & Related papers (2022-10-10T14:35:01Z)
- PyHHMM: A Python Library for Heterogeneous Hidden Markov Models [63.01207205641885]
PyHHMM is an object-oriented Python implementation of Heterogeneous Hidden Markov Models (HHMMs).
PyHHMM emphasizes features not supported in similar available frameworks: a heterogeneous observation model, missing data inference, different model order selection criteria, and semi-supervised training.
PyHHMM relies on the numpy, scipy, scikit-learn, and seaborn Python packages, and is distributed under the Apache-2.0 License.
arXiv Detail & Related papers (2022-01-12T07:32:36Z)
- HyperNP: Interactive Visual Exploration of Multidimensional Projection Hyperparameters [61.354362652006834]
HyperNP is a scalable method that allows for real-time interactive exploration of projection methods by training neural network approximations.
We evaluate HyperNP across three datasets in terms of performance and speed.
arXiv Detail & Related papers (2021-06-25T17:28:14Z)
- Surrogate Model Based Hyperparameter Tuning for Deep Learning with SPOT [0.40611352512781856]
This article demonstrates how the architecture-level parameters of deep learning models implemented in Keras/TensorFlow can be optimized.
The implementation of the tuning procedure is 100% based on R, the software environment for statistical computing.
arXiv Detail & Related papers (2021-05-30T21:16:51Z)
- Using Python for Model Inference in Deep Learning [0.6027358520885614]
We show how it is possible to meet performance and packaging constraints while performing inference in Python.
We present a way of using multiple Python interpreters within a single process to achieve scalable inference.
arXiv Detail & Related papers (2021-04-01T04:48:52Z)
- PHS: A Toolbox for Parallel Hyperparameter Search [2.0305676256390934]
We introduce an open-source Python framework named PHS (Parallel Hyperparameter Search).
It enables hyperparameter optimization of arbitrary Python functions across numerous compute instances.
arXiv Detail & Related papers (2020-02-26T12:17:54Z)
- MOGPTK: The Multi-Output Gaussian Process Toolkit [71.08576457371433]
We present MOGPTK, a Python package for multi-channel data modelling using Gaussian processes (GPs).
The aim of this toolkit is to make multi-output GP (MOGP) models accessible to researchers, data scientists, and practitioners alike.
arXiv Detail & Related papers (2020-02-09T23:34:49Z)
- OPFython: A Python-Inspired Optimum-Path Forest Classifier [68.8204255655161]
This paper proposes a Python-based Optimum-Path Forest framework, denoted as OPFython.
As OPFython is a Python-based library, it provides a friendlier environment and a faster prototyping workspace than the C language.
arXiv Detail & Related papers (2020-01-28T15:46:19Z)