A survey and taxonomy of loss functions in machine learning
- URL: http://arxiv.org/abs/2301.05579v1
- Date: Fri, 13 Jan 2023 14:38:24 GMT
- Title: A survey and taxonomy of loss functions in machine learning
- Authors: Lorenzo Ciampiconi, Adam Elwood, Marco Leonardi, Ashraf Mohamed,
Alessandro Rozza
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most state-of-the-art machine learning techniques revolve around the
optimisation of loss functions. Defining appropriate loss functions is
therefore critical to successfully solving problems in this field. We present a
survey of the most commonly used loss functions for a wide range of different
applications, divided into classification, regression, ranking, sample
generation and energy based modelling. Overall, we introduce 33 different loss
functions and we organise them into an intuitive taxonomy. Each loss function
is given a theoretical backing and we describe where it is best used. This
survey aims to provide a reference of the most essential loss functions for
both beginner and advanced machine learning practitioners.
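To illustrate the categories named in the abstract, here is a minimal sketch (textbook definitions, not taken from the survey itself) of one representative loss each for regression, classification, and ranking, using NumPy:

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: the standard regression loss.
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    # Binary cross-entropy: the standard binary classification loss.
    # Predicted probabilities are clipped to avoid log(0).
    p = np.clip(p_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def pairwise_hinge(score_pos, score_neg, margin=1.0):
    # Pairwise hinge: a simple ranking loss that penalises a positive
    # item scored less than `margin` above a negative item.
    return np.mean(np.maximum(0.0, margin - (score_pos - score_neg)))
```

These are generic formulations; the survey's remaining categories, sample generation and energy-based modelling, have no single canonical one-liner and are not sketched here.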
Related papers
- On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss allows for considering more practical techniques, such as time-varying learning rates and feature normalization.
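The unhinged loss is commonly defined as a linear function of the classification margin, 1 - y·f(x) for labels y in {-1, +1}; a minimal sketch under that assumption (the summary above does not state the formula itself):

```python
import numpy as np

def unhinged_loss(y, score):
    # Unhinged loss: linear in the margin y * score, so it is unbounded
    # below and its gradient never saturates -- hence "unhinged".
    return np.mean(1.0 - y * score)
```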
arXiv Detail & Related papers (2023-12-13T02:11:07Z)
- Loss Functions and Metrics in Deep Learning [0.0]
We provide a comprehensive overview of the most common loss functions and metrics used across many different types of deep learning tasks.
We introduce the formula for each loss and metric, discuss their strengths and limitations, and describe how these methods can be applied to various problems within deep learning.
arXiv Detail & Related papers (2023-07-05T23:53:55Z)
- Online Loss Function Learning [13.744076477599707]
Loss function learning aims to automate the task of designing a loss function for a machine learning model.
We propose a new loss function learning technique for adaptively updating the loss function online after each update to the base model parameters.
arXiv Detail & Related papers (2023-01-30T19:22:46Z)
- Xtreme Margin: A Tunable Loss Function for Binary Classification Problems [0.0]
We provide an overview of a novel loss function, the Xtreme Margin loss function.
Unlike the binary cross-entropy and hinge loss functions, this loss function provides researchers and practitioners with flexibility in their training process.
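The Xtreme Margin formula is not given in this summary, but one of the standard baselines it is contrasted with can be sketched; for labels y in {-1, +1}, a minimal hinge-loss implementation:

```python
import numpy as np

def hinge_loss(y, score):
    # Hinge loss for labels y in {-1, +1}: zero once the margin
    # y * score reaches 1, so confidently-correct examples stop
    # contributing to the loss (and to its gradient).
    return np.mean(np.maximum(0.0, 1.0 - y * score))
```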
arXiv Detail & Related papers (2022-10-31T22:39:32Z)
- Evaluating the Impact of Loss Function Variation in Deep Learning for Classification [0.0]
The loss function is arguably among the most important hyperparameters of a neural network.
We consider deep neural networks in a supervised classification setting and analyze the impact the choice of loss function has on the training result.
While certain loss functions perform suboptimally, our work empirically shows that under-represented losses can significantly outperform the state-of-the-art choices.
arXiv Detail & Related papers (2022-10-28T09:10:10Z)
- A Survey of Learning Criteria Going Beyond the Usual Risk [7.335712499936906]
"Good performance" is typically stated in terms of a sufficiently small average loss, taken over the random draw of test data.
While optimizing for performance on average is intuitive, convenient to analyze in theory, and easy to implement in practice, such a choice brings about trade-offs.
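The "usual risk" in question is the sample average of per-example losses, approximating the expected loss under the data distribution; a minimal sketch with the loss function left as a parameter:

```python
import numpy as np

def empirical_risk(loss_fn, y_true, y_pred):
    # The "usual" learning criterion: the mean of per-example losses
    # over a sample, taken over the random draw of the data.
    return float(np.mean([loss_fn(t, p) for t, p in zip(y_true, y_pred)]))
```

Criteria beyond this average (e.g. worst-case or quantile-based risks) reweight or replace the mean, which is the trade-off space this paper explores.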
arXiv Detail & Related papers (2021-10-11T04:35:33Z)
- Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning [50.59295648948287]
In few-shot learning scenarios, the challenge is to generalize and perform well on new unseen examples.
We introduce a new meta-learning framework with a loss function that adapts to each task.
Our proposed framework, named Meta-Learning with Task-Adaptive Loss Function (MeTAL), demonstrates the effectiveness and the flexibility across various domains.
arXiv Detail & Related papers (2021-10-08T06:07:21Z)
- AutoLoss-Zero: Searching Loss Functions from Scratch for Generic Tasks [78.27036391638802]
AutoLoss-Zero is the first framework for searching loss functions from scratch for generic tasks.
A loss-rejection protocol and a gradient-equivalence-check strategy are developed to improve search efficiency.
Experiments on various computer vision tasks demonstrate that our searched loss functions are on par with or superior to existing loss functions.
arXiv Detail & Related papers (2021-03-25T17:59:09Z) - Loss Function Discovery for Object Detection via Convergence-Simulation
Driven Search [101.73248560009124]
We propose an effective convergence-simulation driven evolutionary search algorithm, CSE-Autoloss, for speeding up the search progress.
We conduct extensive evaluations of loss function search on popular detectors and validate the good generalization capability of searched losses.
Our experiments show that the best-discovered loss function combinations outperform default combinations by 1.1% and 0.8% in terms of mAP for two-stage and one-stage detectors.
arXiv Detail & Related papers (2021-02-09T08:34:52Z) - Auto Seg-Loss: Searching Metric Surrogates for Semantic Segmentation [56.343646789922545]
We propose to automate the design of metric-specific loss functions by searching differentiable surrogate losses for each metric.
Experiments on PASCAL VOC and Cityscapes demonstrate that the searched surrogate losses outperform the manually designed loss functions consistently.
arXiv Detail & Related papers (2020-10-15T17:59:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.