Python Tool for Visualizing Variability of Pareto Fronts over Multiple Runs
- URL: http://arxiv.org/abs/2305.08852v1
- Date: Mon, 15 May 2023 17:59:34 GMT
- Title: Python Tool for Visualizing Variability of Pareto Fronts over Multiple Runs
- Authors: Shuhei Watanabe
- Abstract summary: We develop a Python package for the empirical attainment surface.
The package is available at https://github.com/nabenabe0928/empirical-attainment-func.
- Score: 1.370633147306388
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperparameter optimization is crucial to achieving high performance in deep
learning. Beyond predictive performance, other criteria such as inference time or
memory requirements often need to be optimized for practical reasons.
This motivates research on multi-objective optimization (MOO). However, Pareto
fronts of MOO methods are often shown without considering the variability
caused by random seeds, which makes it difficult to evaluate performance
stability. Although the empirical attainment surface enables visualization
with uncertainty over multiple runs, no major Python package implements it.
We therefore develop a Python package for this purpose and describe its usage.
The package is available at https://github.com/nabenabe0928/empirical-attainment-func.
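To make the underlying concept concrete: for n independent runs, the k-th empirical attainment surface bounds the objective region attained by at least k of the runs (k = 1 gives the best case, k = n the worst case, and k ≈ n/2 the median surface). The sketch below is a minimal, self-contained illustration of this definition for bi-objective minimization; it does not reproduce the package's actual API, and the function name is chosen here for illustration only.

```python
import math

def attainment_surface(runs, k):
    """k-th empirical attainment surface for bi-objective minimization.

    runs: list of lists of (f1, f2) pairs, one list per independent run.
    k: attainment level (1 = best-case surface, len(runs) = worst-case;
       k = (len(runs) + 1) // 2 gives the median surface).
    Returns the knots (f1, f2) of a step function; every point weakly
    dominated by some knot is attained by at least k of the runs.
    """
    # Candidate f1 breakpoints: all f1 values observed in any run.
    xs = sorted({f1 for run in runs for f1, _ in run})
    surface, prev = [], None
    for x in xs:
        # Best f2 each run reaches using only its solutions with f1 <= x.
        bests = sorted(
            min((f2 for f1, f2 in run if f1 <= x), default=math.inf)
            for run in runs
        )
        y = bests[k - 1]  # k-th smallest across runs
        if y != math.inf and y != prev:
            surface.append((x, y))
            prev = y
    return surface

# Three toy runs of a bi-objective minimization problem:
runs = [[(1.0, 3.0), (2.0, 1.0)],
        [(1.0, 4.0), (3.0, 2.0)],
        [(2.0, 2.0)]]
print(attainment_surface(runs, 1))  # best-case surface: [(1.0, 3.0), (2.0, 1.0)]
print(attainment_surface(runs, 2))  # median surface:    [(1.0, 4.0), (2.0, 2.0)]
```

Plotting the region between the best- and worst-case surfaces as a shaded band (as the package does over many seeds) then conveys the variability that a single Pareto front plot hides.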
Related papers
- Python is Not Always the Best Choice: Embracing Multilingual Program of Thoughts [51.49688654641581]
We propose a task- and model-agnostic approach called MultiPoT, which harnesses the strengths and diversity of various languages.
Experimental results reveal that it significantly outperforms Python Self-Consistency.
In particular, MultiPoT achieves more than a 4.6% improvement on average on ChatGPT (gpt-3.5-turbo-0701).
arXiv Detail & Related papers (2024-02-16T13:48:06Z) - PyBADS: Fast and robust black-box optimization in Python [11.4219428942199]
PyBADS is an implementation of the Bayesian Adaptive Direct Search (BADS) algorithm for fast and robust black-box optimization.
It comes with an easy-to-use Python interface for running the algorithm and inspecting its results.
arXiv Detail & Related papers (2023-06-27T15:54:44Z) - Parameter-efficient Tuning of Large-scale Multimodal Foundation Model [68.24510810095802]
We propose a graceful prompt framework for cross-modal transfer (Aurora) to overcome these challenges.
Considering the redundancy in existing architectures, we first utilize the mode approximation to generate 0.1M trainable parameters to implement the multimodal prompt tuning.
A thorough evaluation on six cross-modal benchmarks shows that it not only outperforms the state-of-the-art but even outperforms the full fine-tuning approach.
arXiv Detail & Related papers (2023-05-15T06:40:56Z) - Pre-training helps Bayesian optimization too [49.28382118032923]
We seek an alternative practice for setting functional priors.
In particular, we consider the scenario where we have data from similar functions that allow us to pre-train a tighter distribution a priori.
Our results show that our method is able to locate good hyperparameters at least 3 times more efficiently than the best competing methods.
arXiv Detail & Related papers (2022-07-07T04:42:54Z) - PyEPO: A PyTorch-based End-to-End Predict-then-Optimize Library for
Linear and Integer Programming [9.764407462807588]
We present the PyEPO package, a PyTorch-based end-to-end predict-then-optimize library in Python.
PyEPO is the first such generic tool for linear and integer programming with predicted objective function coefficients.
arXiv Detail & Related papers (2022-06-28T18:33:55Z) - DADApy: Distance-based Analysis of DAta-manifolds in Python [51.37841707191944]
DADApy is a Python software package for analysing and characterising high-dimensional data.
It provides methods for estimating the intrinsic dimension and the probability density, for performing density-based clustering and for comparing different distance metrics.
arXiv Detail & Related papers (2022-05-04T08:41:59Z) - pysamoo: Surrogate-Assisted Multi-Objective Optimization in Python [7.8140593450932965]
pysamoo is a proposed framework for solving computationally expensive optimization problems.
pysamoo provides multiple optimization methods for handling problems involving time-consuming evaluation functions.
For more information about pysamoo, readers are encouraged to visit: anyoptimization.com/projects/pysamoo.
arXiv Detail & Related papers (2022-04-12T14:55:57Z) - Rethinking the Hyperparameters for Fine-tuning [78.15505286781293]
Fine-tuning from pre-trained ImageNet models has become the de-facto standard for various computer vision tasks.
Current practices for fine-tuning typically involve making an ad-hoc choice of hyperparameters.
This paper re-examines several common practices of setting hyperparameters for fine-tuning.
arXiv Detail & Related papers (2020-02-19T18:59:52Z) - OPFython: A Python-Inspired Optimum-Path Forest Classifier [68.8204255655161]
This paper proposes a Python-based Optimum-Path Forest framework, denoted as OPFython.
As OPFython is a Python-based library, it provides a friendlier environment and a faster prototyping workspace than the C language.
arXiv Detail & Related papers (2020-01-28T15:46:19Z) - pymoo: Multi-objective Optimization in Python [7.8140593450932965]
We have developed pymoo, a multi-objective optimization framework in Python.
We provide a guide to getting started with our framework by demonstrating the implementation of an exemplary constrained multi-objective optimization scenario.
The implementations in our framework are customizable and algorithms can be modified/extended by supplying custom operators.
arXiv Detail & Related papers (2020-01-22T16:04:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.