AutoLoss-Zero: Searching Loss Functions from Scratch for Generic Tasks
- URL: http://arxiv.org/abs/2103.14026v1
- Date: Thu, 25 Mar 2021 17:59:09 GMT
- Title: AutoLoss-Zero: Searching Loss Functions from Scratch for Generic Tasks
- Authors: Hao Li, Tianwen Fu, Jifeng Dai, Hongsheng Li, Gao Huang, Xizhou Zhu
- Abstract summary: AutoLoss-Zero is the first framework for searching loss functions from scratch for generic tasks.
A loss-rejection protocol and a gradient-equivalence-check strategy are developed to improve search efficiency.
Experiments on various computer vision tasks demonstrate that our searched loss functions are on par with or superior to existing loss functions.
- Score: 78.27036391638802
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Significant progress has been achieved in automating the design of various
components in deep networks. However, the automatic design of loss functions
for generic tasks with various evaluation metrics remains under-investigated.
Previous works on handcrafting loss functions heavily rely on human expertise,
which limits their extendibility. Meanwhile, existing efforts on searching loss
functions mainly focus on specific tasks and particular metrics, with
task-specific heuristics. Whether such works can be extended to generic tasks
remains unverified and questionable. In this paper, we propose AutoLoss-Zero, the
first general framework for searching loss functions from scratch for generic
tasks. Specifically, we design an elementary search space composed only of
primitive mathematical operators to accommodate the heterogeneous tasks and
evaluation metrics. A variant of the evolutionary algorithm is employed to
discover loss functions in the elementary search space. A loss-rejection
protocol and a gradient-equivalence-check strategy, both applicable to generic
tasks, are developed to improve the search efficiency. Extensive experiments on
various computer vision tasks demonstrate that our searched loss functions are
on par with or superior to existing loss functions and generalize well to
different datasets and networks. Code shall be released.
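
The search recipe above lends itself to a minimal, hypothetical sketch in Python: candidate losses are expression trees over primitive mathematical operators, evolved by random mutation, and cheaply screened by a loss-rejection check before any real training. The operator set, tree encoding, mutation rule, and rejection heuristic below are illustrative assumptions, not the paper's exact definitions.

    import math
    import random

    # Primitive operators standing in for the paper's elementary search space.
    UNARY = {
        "neg": lambda a: -a,
        "log": lambda a: math.log(abs(a) + 1e-12),
        "exp": lambda a: math.exp(min(a, 20.0)),  # clamped to avoid overflow
    }
    BINARY = {
        "add": lambda a, b: a + b,
        "sub": lambda a, b: a - b,
        "mul": lambda a, b: a * b,
    }
    LEAVES = ["pred", "target", "one"]

    def random_tree(depth=3):
        """Sample a random loss-expression tree as nested tuples."""
        if depth == 0 or random.random() < 0.3:
            return random.choice(LEAVES)
        if random.random() < 0.5:
            return (random.choice(list(UNARY)), random_tree(depth - 1))
        return (random.choice(list(BINARY)), random_tree(depth - 1),
                random_tree(depth - 1))

    def evaluate(tree, pred, target):
        """Evaluate a tree at one (prediction, target) pair."""
        if tree == "pred":
            return pred
        if tree == "target":
            return target
        if tree == "one":
            return 1.0
        op, *args = tree
        fn = UNARY.get(op) or BINARY[op]
        return fn(*(evaluate(a, pred, target) for a in args))

    def mutate(tree, depth=3):
        """Replace a random subtree -- a simple mutation operator."""
        if isinstance(tree, str) or random.random() < 0.3:
            return random_tree(depth)
        op, *args = tree
        i = random.randrange(len(args))
        args[i] = mutate(args[i], depth - 1)
        return (op, *args)

    def rejected(tree, samples):
        """Illustrative loss-rejection check: discard candidates whose value
        does not decrease as predictions approach the targets on a tiny proxy
        set (a cheap necessary condition for a usable loss)."""
        try:
            near = sum(evaluate(tree, t + 0.01, t) for _, t in samples)
            far = sum(evaluate(tree, p, t) for p, t in samples)
        except (ValueError, OverflowError, ZeroDivisionError):
            return True  # numerically unstable candidates are rejected
        return not near < far

    # Toy evolutionary loop: keep mutating a survivor that passes the check.
    proxy = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(32)]
    best = random_tree()
    for _ in range(200):
        child = mutate(best)
        if not rejected(child, proxy):
            best = child
    print("surviving candidate loss:", best)

This sketch omits the paper's gradient-equivalence check, which further prunes redundant candidates during search, as well as the proxy-training step that scores survivors against the task's evaluation metric.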
Related papers
- Fast and Efficient Local Search for Genetic Programming Based Loss Function Learning [12.581217671500887]
We propose a new meta-learning framework for task- and model-agnostic loss function learning via a hybrid search approach.
Results show that the learned loss functions bring improved convergence, sample efficiency, and inference performance on tabular, computer vision, and natural language processing problems.
arXiv Detail & Related papers (2024-03-01T02:20:04Z)
- A survey and taxonomy of loss functions in machine learning [51.35995529962554]
We present a comprehensive overview of the most widely used loss functions across key applications, including regression, classification, generative modeling, ranking, and energy-based modeling.
We introduce 43 distinct loss functions, structured within an intuitive taxonomy that clarifies their theoretical foundations, properties, and optimal application contexts.
arXiv Detail & Related papers (2023-01-13T14:38:24Z)
- Reinforcement Learning with Automated Auxiliary Loss Search [34.83123677004838]
We propose a principled and universal method for learning better representations with auxiliary loss functions.
Specifically, we define a general auxiliary loss space of size $7.5 \times 10^{20}$ and explore the space with an efficient evolutionary search strategy.
Empirical results show that the discovered auxiliary loss significantly improves the performance on both high-dimensional (image) and low-dimensional (vector) unseen tasks.
arXiv Detail & Related papers (2022-10-12T09:24:53Z)
- AutoGPart: Intermediate Supervision Search for Generalizable 3D Part Segmentation [58.78094823473567]
AutoGPart builds a supervision space with geometric prior knowledge encoded, and lets the machine automatically search for the optimal supervision for a specific segmentation task.
We demonstrate that the performance of segmentation networks using simple backbones can be significantly improved when trained with supervisions searched by our method.
arXiv Detail & Related papers (2022-03-13T03:45:58Z)
- Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning [50.59295648948287]
In few-shot learning scenarios, the challenge is to generalize and perform well on new, unseen examples.
We introduce a new meta-learning framework with a loss function that adapts to each task.
Our proposed framework, named Meta-Learning with Task-Adaptive Loss Function (MeTAL), demonstrates effectiveness and flexibility across various domains.
arXiv Detail & Related papers (2021-10-08T06:07:21Z)
- Loss Function Discovery for Object Detection via Convergence-Simulation Driven Search [101.73248560009124]
We propose an effective convergence-simulation driven evolutionary search algorithm, CSE-Autoloss, to speed up the search process.
We conduct extensive evaluations of loss function search on popular detectors and validate the good generalization capability of the searched losses.
Our experiments show that the best-discovered loss function combinations outperform the default combinations by 1.1% and 0.8% mAP for two-stage and one-stage detectors, respectively.
arXiv Detail & Related papers (2021-02-09T08:34:52Z)
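
The convergence-simulation idea in the entry above can be sketched as a cheap filter: before any full training run, a candidate loss is optimized for a few steps on a tiny proxy problem and rejected if it shows no sign of converging. The proxy task, step budget, and threshold below are illustrative assumptions, not CSE-Autoloss's actual procedure.

    import torch

    def simulates_convergence(loss_fn, steps=50, threshold=0.9):
        """Return True if loss_fn drives a toy logistic model's loss down."""
        torch.manual_seed(0)
        x = torch.randn(256, 8)
        y = (x.sum(dim=1) > 0).float()  # linearly separable toy labels
        w = torch.zeros(8, requires_grad=True)
        opt = torch.optim.SGD([w], lr=0.5)
        first = last = None
        for _ in range(steps):
            pred = torch.sigmoid(x @ w)
            loss = loss_fn(pred, y)
            if not torch.isfinite(loss):
                return False  # diverging candidates are rejected outright
            opt.zero_grad()
            loss.backward()
            opt.step()
            first = loss.item() if first is None else first
            last = loss.item()
        return last < threshold * first  # require a clear downward trend

    # Binary cross-entropy should pass; a constant "loss" should fail.
    bce = lambda p, t: -(t * (p + 1e-7).log()
                         + (1 - t) * (1 - p + 1e-7).log()).mean()
    flat = lambda p, t: (p * 0).sum() + 1.0
    print(simulates_convergence(bce), simulates_convergence(flat))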
- Auto Seg-Loss: Searching Metric Surrogates for Semantic Segmentation [56.343646789922545]
We propose to automate the design of metric-specific loss functions by searching differentiable surrogate losses for each metric.
Experiments on PASCAL VOC and Cityscapes demonstrate that the searched surrogate losses outperform the manually designed loss functions consistently.
arXiv Detail & Related papers (2020-10-15T17:59:08Z)
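
As a concrete, simplified illustration of a metric surrogate in the spirit of the entry above: the hard intersection and union counts of the IoU metric can be relaxed with class probabilities, yielding a differentiable soft-IoU loss. This is a standard relaxation shown here for intuition; it is not necessarily the surrogate form Auto Seg-Loss discovers.

    import torch
    import torch.nn.functional as F

    def soft_iou_loss(logits, target, num_classes, eps=1e-6):
        """logits: (N, C, H, W) raw scores; target: (N, H, W) class indices."""
        probs = F.softmax(logits, dim=1)  # soft predictions instead of argmax
        onehot = F.one_hot(target, num_classes).permute(0, 3, 1, 2).float()
        inter = (probs * onehot).sum(dim=(0, 2, 3))  # soft per-class overlap
        union = (probs + onehot - probs * onehot).sum(dim=(0, 2, 3))
        return 1.0 - ((inter + eps) / (union + eps)).mean()  # 1 - mean soft IoU

    # Usage on random data: the loss is differentiable end to end.
    logits = torch.randn(2, 3, 8, 8, requires_grad=True)
    target = torch.randint(0, 3, (2, 8, 8))
    loss = soft_iou_loss(logits, target, num_classes=3)
    loss.backward()
    print(float(loss))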
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.