A survey and taxonomy of loss functions in machine learning
- URL: http://arxiv.org/abs/2301.05579v2
- Date: Mon, 18 Nov 2024 12:01:29 GMT
- Title: A survey and taxonomy of loss functions in machine learning
- Authors: Lorenzo Ciampiconi, Adam Elwood, Marco Leonardi, Ashraf Mohamed, Alessandro Rozza
- Abstract summary: We present a comprehensive overview of the most widely used loss functions across key applications, including regression, classification, generative modeling, ranking, and energy-based modeling.
We introduce 43 distinct loss functions, structured within an intuitive taxonomy that clarifies their theoretical foundations, properties, and optimal application contexts.
- Abstract: Most state-of-the-art machine learning techniques revolve around the optimisation of loss functions. Defining appropriate loss functions is therefore critical to successfully solving problems in this field. In this survey, we present a comprehensive overview of the most widely used loss functions across key applications, including regression, classification, generative modeling, ranking, and energy-based modeling. We introduce 43 distinct loss functions, structured within an intuitive taxonomy that clarifies their theoretical foundations, properties, and optimal application contexts. This survey is intended as a resource for undergraduate, graduate, and Ph.D. students, as well as researchers seeking a deeper understanding of loss functions.
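To make the taxonomy concrete, here is a minimal NumPy sketch of two of the most widely used losses the survey covers: mean squared error for regression and categorical cross-entropy for classification. These are the standard definitions, not code from the paper.

```python
import numpy as np

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean squared error: the canonical regression loss."""
    return float(np.mean((y_true - y_pred) ** 2))

def cross_entropy(y_onehot: np.ndarray, probs: np.ndarray, eps: float = 1e-12) -> float:
    """Categorical cross-entropy: the canonical classification loss.

    y_onehot: one-hot labels, shape (n, k); probs: predicted class
    probabilities, shape (n, k), rows summing to 1.
    """
    probs = np.clip(probs, eps, 1.0)  # guard against log(0)
    return float(-np.mean(np.sum(y_onehot * np.log(probs), axis=1)))

# Usage:
print(mse(np.array([1.0, 2.0]), np.array([0.9, 2.2])))        # 0.025
print(cross_entropy(np.array([[1.0, 0.0], [0.0, 1.0]]),
                    np.array([[0.9, 0.1], [0.2, 0.8]])))      # ~0.164
```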
Related papers
- Fast and Efficient Local Search for Genetic Programming Based Loss Function Learning
We propose a new meta-learning framework for task and model-agnostic loss function learning via a hybrid search approach.
Results show that the learned loss functions bring improved convergence, sample efficiency, and inference performance on tabular, computer vision, and natural language processing problems.
arXiv Detail & Related papers (2024-03-01T02:20:04Z)
- On the Dynamics Under the Unhinged Loss and Beyond
We introduce the unhinged loss, a concise loss function that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss allows for considering more practical techniques, such as time-varying learning rates and feature normalization.
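For reference, the unhinged loss of a score $f(x)$ with label $y \in \{-1, +1\}$ is usually written as $L(y, f(x)) = 1 - y f(x)$. The sketch below illustrates that standard definition; it is not the paper's implementation.

```python
import numpy as np

# The unhinged loss is linear in the margin y * f(x), so its gradient with
# respect to the score is constant -- the property that makes closed-form
# training dynamics tractable.

def unhinged_loss(y: np.ndarray, scores: np.ndarray) -> float:
    """Mean unhinged loss for labels y in {-1, +1} and real-valued scores."""
    return float(np.mean(1.0 - y * scores))

y = np.array([1.0, -1.0, 1.0])
scores = np.array([0.8, -0.3, -0.1])
print(unhinged_loss(y, scores))  # (0.2 + 0.7 + 1.1) / 3 ~= 0.667
```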
arXiv Detail & Related papers (2023-12-13T02:11:07Z)
- Loss Functions in the Era of Semantic Segmentation: A Survey and Outlook
Loss functions are crucial for shaping the development of deep learning-based segmentation algorithms.
We provide a novel taxonomy and review of how these loss functions are customized and leveraged in image segmentation.
We conclude this review by identifying current challenges and highlighting future research opportunities.
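As one concrete example of the segmentation-specific losses such surveys catalogue, a soft Dice loss can be written as $1 - \frac{2\sum_i p_i t_i}{\sum_i p_i + \sum_i t_i}$. The sketch below shows that standard formulation; it is an illustration, not code from the paper.

```python
import numpy as np

def soft_dice_loss(probs: np.ndarray, targets: np.ndarray, eps: float = 1e-7) -> float:
    """Soft Dice loss over per-pixel foreground probabilities and binary masks.

    probs and targets share a shape, e.g. (H, W); eps avoids division by
    zero on empty masks.
    """
    intersection = np.sum(probs * targets)
    denom = np.sum(probs) + np.sum(targets)
    return float(1.0 - (2.0 * intersection + eps) / (denom + eps))
```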
arXiv Detail & Related papers (2023-12-08T22:06:05Z)
- RoBoSS: A Robust, Bounded, Sparse, and Smooth Loss Function for Supervised Learning
We propose a novel robust, bounded, sparse, and smooth (RoBoSS) loss function for supervised learning.
We introduce a new robust algorithm, named $\mathcal{L}_{\text{rbss}}$-SVM, designed to generalize well to unseen data.
We evaluate the proposed $\mathcal{L}_{\text{rbss}}$-SVM on $88$ real-world UCI and KEEL datasets from diverse domains.
arXiv Detail & Related papers (2023-09-05T13:59:50Z)
- Loss Functions and Metrics in Deep Learning
We provide a comprehensive overview of the most common loss functions and metrics used across many different types of deep learning tasks.
We introduce the formula for each loss and metric, discuss their strengths and limitations, and describe how these methods can be applied to various problems within deep learning.
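The distinction the paper draws between losses and metrics can be made concrete: a metric such as accuracy is computed from hard decisions and is not differentiable, so it guides evaluation rather than gradient-based training. A minimal sketch of the standard definition, not code from the paper:

```python
import numpy as np

def accuracy(y_true: np.ndarray, probs: np.ndarray, threshold: float = 0.5) -> float:
    """Evaluation metric: fraction of correct hard decisions.

    Thresholding makes it piecewise constant, hence useless for gradients.
    """
    return float(np.mean((probs >= threshold) == y_true.astype(bool)))

y = np.array([1, 0, 1, 1])
p = np.array([0.9, 0.4, 0.35, 0.7])
print(accuracy(y, p))  # 0.75
```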
arXiv Detail & Related papers (2023-07-05T23:53:55Z)
- Xtreme Margin: A Tunable Loss Function for Binary Classification Problems
We provide an overview of the Xtreme Margin loss, a novel loss function for binary classification.
Unlike the binary cross-entropy and hinge loss functions, this loss function gives researchers and practitioners flexibility in their training process.
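For context, the hinge baseline named above penalizes margins below 1 (cross-entropy was sketched earlier; binary cross-entropy is its two-class special case). A standard-definition sketch, not the Xtreme Margin loss itself:

```python
import numpy as np

def hinge_loss(y: np.ndarray, scores: np.ndarray) -> float:
    """Mean hinge loss for labels y in {-1, +1}: max(0, 1 - y * f(x)).

    Zero once the margin y * f(x) reaches 1, so confident correct
    predictions stop contributing to the gradient.
    """
    return float(np.mean(np.maximum(0.0, 1.0 - y * scores)))

y = np.array([1.0, -1.0, 1.0])
scores = np.array([2.0, -0.5, -0.2])
print(hinge_loss(y, scores))  # (0 + 0.5 + 1.2) / 3 ~= 0.567
```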
arXiv Detail & Related papers (2022-10-31T22:39:32Z)
- AutoLoss-Zero: Searching Loss Functions from Scratch for Generic Tasks
AutoLoss-Zero is the first framework for searching loss functions from scratch for generic tasks.
A loss-rejection protocol and a gradient-equivalence-check strategy are developed to improve search efficiency.
Experiments on various computer vision tasks demonstrate that our searched loss functions are on par with or superior to existing loss functions.
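The abstract does not spell out the rejection criterion; as a purely hypothetical illustration of the general idea behind a loss-rejection protocol, a candidate loss might be screened on a tiny probe before any expensive training and discarded if it fails basic sanity checks. Every name and criterion in this sketch is an assumption, not the paper's protocol.

```python
import numpy as np

# Hypothetical sketch of a loss-rejection filter (NOT the paper's protocol):
# cheaply probe a candidate loss and reject it if it is numerically unstable
# or fails to discriminate between correct and incorrect predictions.

def reject_candidate(loss_fn, n: int = 256, seed: int = 0) -> bool:
    rng = np.random.default_rng(seed)
    y = rng.choice([-1.0, 1.0], size=n)
    good = loss_fn(y, y * 2.0)        # confidently correct scores
    bad = loss_fn(y, -y * 2.0)        # confidently wrong scores
    if not (np.isfinite(good) and np.isfinite(bad)):
        return True                   # reject: numerically unstable
    return bool(good >= bad)          # reject: no preference for correctness

# Usage with the hinge loss sketched above would return False (keep).
```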
arXiv Detail & Related papers (2021-03-25T17:59:09Z)
- Loss Function Discovery for Object Detection via Convergence-Simulation Driven Search
We propose an effective convergence-simulation driven evolutionary search algorithm, CSE-Autoloss, to speed up the search process.
We conduct extensive evaluations of loss function search on popular detectors and validate the good generalization capability of searched losses.
Our experiments show that the best-discovered loss function combinations outperform the default combinations by 1.1% and 0.8% mAP for two-stage and one-stage detectors, respectively.
arXiv Detail & Related papers (2021-02-09T08:34:52Z)
- Auto Seg-Loss: Searching Metric Surrogates for Semantic Segmentation
We propose to automate the design of metric-specific loss functions by searching differentiable surrogate losses for each metric.
Experiments on PASCAL VOC and Cityscapes demonstrate that the searched surrogate losses consistently outperform manually designed loss functions.
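A classic example of the kind of surrogate this line of work searches over: intersection-over-union is not differentiable on hard predictions, but a soft IoU over probabilities is. The sketch below shows that standard surrogate, not the losses found by Auto Seg-Loss.

```python
import numpy as np

def soft_iou_loss(probs: np.ndarray, targets: np.ndarray, eps: float = 1e-7) -> float:
    """Differentiable surrogate for the IoU metric.

    Replaces hard set intersection/union with products and sums over
    per-pixel foreground probabilities, so gradients can flow.
    """
    intersection = np.sum(probs * targets)
    union = np.sum(probs) + np.sum(targets) - intersection
    return float(1.0 - (intersection + eps) / (union + eps))
```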
arXiv Detail & Related papers (2020-10-15T17:59:08Z)
- Loss Function Search for Face Recognition
We develop a reward-guided search method to automatically obtain the best candidate.
Experimental results on a variety of face recognition benchmarks have demonstrated the effectiveness of our method.
arXiv Detail & Related papers (2020-07-10T03:40:10Z)