The ensmallen library for flexible numerical optimization
- URL: http://arxiv.org/abs/2108.12981v2
- Date: Fri, 9 Feb 2024 13:07:00 GMT
- Title: The ensmallen library for flexible numerical optimization
- Authors: Ryan R. Curtin, Marcus Edel, Rahul Ganesh Prabhu, Suryoday Basak,
Zhihao Lou, Conrad Sanderson
- Abstract summary: We overview the ensmallen numerical optimization library, which provides a flexible C++ framework for mathematical optimization of user-supplied objective functions.
Many types of objective functions are supported, including general, differentiable, separable, constrained, and categorical.
- Score: 15.78308411537254
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We overview the ensmallen numerical optimization library, which provides a
flexible C++ framework for mathematical optimization of user-supplied objective
functions. Many types of objective functions are supported, including general,
differentiable, separable, constrained, and categorical. A diverse set of
pre-built optimizers is provided, including Quasi-Newton optimizers and many
variants of Stochastic Gradient Descent. The underlying framework facilitates
the implementation of new optimizers. Optimization of an objective function
typically requires supplying only one or two C++ functions. Custom behavior can
be easily specified via callback functions. Empirical comparisons show that
ensmallen outperforms other frameworks while providing more functionality. The
library is available at https://ensmallen.org and is distributed under the
permissive BSD license.
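As a concrete illustration of the one-or-two-function interface described above, here is a minimal sketch that minimises a simple sphere objective with the pre-built L-BFGS optimizer and attaches a callback. The sphere objective and starting point are illustrative choices, not taken from the paper; the Evaluate()/Gradient() signatures, ens::L_BFGS, and ens::PrintLoss() follow ensmallen's documented interface.

  #include <ensmallen.hpp>  // pulls in Armadillo as well

  // Illustrative objective: the sphere function f(x) = x^T x,
  // whose minimum is the zero vector.
  class SphereFunction
  {
   public:
    // Return the objective value at the given coordinates.
    double Evaluate(const arma::mat& x) { return arma::dot(x, x); }

    // Write the gradient at x into g.
    void Gradient(const arma::mat& x, arma::mat& g) { g = 2 * x; }
  };

  int main()
  {
    SphereFunction f;
    arma::mat coordinates = arma::randu<arma::mat>(10, 1);  // starting point

    ens::L_BFGS optimizer;  // one of the pre-built Quasi-Newton optimizers

    // Callbacks are passed as trailing arguments; PrintLoss() reports the
    // objective value as optimization proceeds.
    optimizer.Optimize(f, coordinates, ens::PrintLoss());

    coordinates.print("minimizer (should be close to zero):");
    return 0;
  }

Building such a program typically requires only Armadillo and a modern C++ compiler, e.g. g++ example.cpp -O2 -larmadillo.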
Related papers
- CompilerDream: Learning a Compiler World Model for General Code Optimization [58.87557583347996]
We introduce CompilerDream, a model-based reinforcement learning approach to general code optimization.
It comprises a compiler world model that accurately simulates the intrinsic properties of optimization passes and an agent trained on this model to produce effective optimization strategies.
It excels across diverse datasets, surpassing LLVM's built-in optimizations and other state-of-the-art methods in both the value prediction and end-to-end code optimization settings.
arXiv Detail & Related papers (2024-04-24T09:20:33Z)
- A General Framework for User-Guided Bayesian Optimization [51.96352579696041]
We propose ColaBO, the first Bayesian-principled framework for prior beliefs beyond the typical kernel structure.
We empirically demonstrate ColaBO's ability to substantially accelerate optimization when the prior information is accurate, and to retain approximately default performance when it is misleading.
arXiv Detail & Related papers (2023-11-24T18:27:26Z)
- NUBO: A Transparent Python Package for Bayesian Optimization [0.0]
NUBO is a framework for optimizing black-box functions, such as physical experiments and computer simulators.
It focuses on transparency and user experience to make Bayesian optimization accessible to researchers from all disciplines.
NUBO is written in Python but does not require expert knowledge of Python to optimize simulators and experiments.
arXiv Detail & Related papers (2023-05-11T10:34:27Z)
- Theseus: A Library for Differentiable Nonlinear Optimization [21.993680737841476]
Theseus is an efficient application-agnostic library for differentiable nonlinear least squares (DNLS) optimization built on PyTorch.
Theseus provides a common framework for end-to-end structured learning in robotics and vision.
arXiv Detail & Related papers (2022-07-19T17:57:40Z)
- On the development of a Bayesian optimisation framework for complex unknown systems [11.066706766632578]
This paper studies and compares common Bayesian optimisation algorithms empirically on a range of synthetic test functions.
It investigates the choice of acquisition function and the number of training samples, as well as exact calculation of acquisition functions versus Monte Carlo based approaches.
arXiv Detail & Related papers (2022-07-19T09:50:34Z)
- A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem.
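To make the stated reduction more tangible, the following is a schematic formulation under stated assumptions, not necessarily the paper's exact objective: label each observation by whether it improves on a threshold \( \tau \), weight the positive class by a utility-derived weight \( w(y) \), and train a classifier \( S \) by weighted cross-entropy whose output serves as the acquisition:

  \[
    S^{\star} \in \arg\max_{S} \; \mathbb{E}_{(x,y)}\big[\, w(y)\, z \log S(x)
      + (1 - z) \log\!\big(1 - S(x)\big) \,\big],
    \qquad z = \mathbf{1}[\, y \ge \tau \,],
  \]

where an expected-improvement-style utility would correspond to \( w(y) = y - \tau \), and the trained classifier output \( S^{\star}(x) \) is then maximised in place of a surrogate-based acquisition function.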
arXiv Detail & Related papers (2022-06-27T03:55:27Z)
- Differentiable Spline Approximations [48.10988598845873]
Differentiable programming has significantly enhanced the scope of machine learning.
Standard differentiable programming methods (such as autodiff) typically require that the machine learning models be differentiable.
The paper instead derives a (weak) Jacobian for spline approximations, which cover a large family of piecewise polynomial models.
We show that leveraging this redesigned Jacobian in the form of a differentiable "layer" in predictive models leads to improved performance in diverse applications.
arXiv Detail & Related papers (2021-10-04T16:04:46Z)
- Are we Forgetting about Compositional Optimisers in Bayesian Optimisation? [66.39551991177542]
This paper presents a sample-efficient methodology for global optimisation.
Within this, a crucial performance-determining subroutine is maximisation of the acquisition function.
We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments.
arXiv Detail & Related papers (2020-12-15T12:18:38Z)
- Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In large search spaces, the algorithm passes through several low-function-value regions before reaching the optimum.
One approach to shortening this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent the prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space, expanding it around regions where the optimum is likely to lie and shrinking it around regions where it is unlikely to lie.
arXiv Detail & Related papers (2020-03-27T06:18:49Z)
- Flexible numerical optimization with ensmallen [15.78308411537254]
This report provides an introduction to the ensmallen numerical optimization library.
The library provides a fast and flexible C++ framework for mathematical optimization of arbitrary user-supplied functions.
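To complement the sketch after the abstract above, here is a similarly hedged sketch of the separable-function interface consumed by ensmallen's SGD-family optimizers. The linear least-squares objective and random data are illustrative assumptions; the batch-wise Evaluate()/Gradient(), NumFunctions(), and Shuffle() signatures follow the library's documented requirements.

  #include <ensmallen.hpp>  // pulls in Armadillo as well

  // Illustrative separable objective: least-squares error of a linear
  // model, with one individual objective per data point.
  class SquaredErrorFunction
  {
   public:
    SquaredErrorFunction(const arma::mat& data, const arma::rowvec& responses)
      : data(data), responses(responses) { }

    // Objective summed over the batch of points [begin, begin + batchSize).
    double Evaluate(const arma::mat& x, const size_t begin,
                    const size_t batchSize)
    {
      const arma::rowvec r = x.t() * data.cols(begin, begin + batchSize - 1)
          - responses.subvec(begin, begin + batchSize - 1);
      return arma::dot(r, r);
    }

    // Gradient over the same batch, written into g.
    void Gradient(const arma::mat& x, const size_t begin, arma::mat& g,
                  const size_t batchSize)
    {
      const arma::mat batch = data.cols(begin, begin + batchSize - 1);
      const arma::rowvec r =
          x.t() * batch - responses.subvec(begin, begin + batchSize - 1);
      g = 2 * batch * r.t();
    }

    // Number of individual objectives (one per data point).
    size_t NumFunctions() const { return data.n_cols; }

    // Reorder the points between epochs, keeping data and responses aligned.
    void Shuffle()
    {
      const arma::uvec order =
          arma::shuffle(arma::regspace<arma::uvec>(0, data.n_cols - 1));
      const arma::mat shuffledData = data.cols(order);
      const arma::rowvec shuffledResponses = responses.cols(order);
      data = shuffledData;
      responses = shuffledResponses;
    }

   private:
    arma::mat data;          // one column per data point
    arma::rowvec responses;  // one response per data point
  };

  int main()
  {
    // Synthetic problem: 1000 points in 5 dimensions.
    arma::mat data = arma::randu<arma::mat>(5, 1000);
    arma::rowvec responses = arma::randu<arma::rowvec>(1000);

    SquaredErrorFunction f(data, responses);
    arma::mat params = arma::zeros<arma::mat>(5, 1);

    ens::SGD<> optimizer(0.01 /* step size */, 32 /* batch size */);
    optimizer.Optimize(f, params);

    params.print("fitted parameters:");
    return 0;
  }

The same objective class works unchanged with other SGD variants (e.g. ens::Adam), since they consume the same batch-wise interface.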
arXiv Detail & Related papers (2020-03-09T12:57:42Z)
- Practical Bayesian Optimization of Objectives with Conditioning Variables [1.0497128347190048]
We consider the more general case where a user is faced with multiple problems that each need to be optimized conditional on a state variable.
Similarity across objectives boosts optimization of each objective in two ways.
We propose a framework for conditional optimization: ConBO.
arXiv Detail & Related papers (2020-02-23T22:06:26Z)