CurricularFace: Adaptive Curriculum Learning Loss for Deep Face
Recognition
- URL: http://arxiv.org/abs/2004.00288v1
- Date: Wed, 1 Apr 2020 08:43:10 GMT
- Title: CurricularFace: Adaptive Curriculum Learning Loss for Deep Face
Recognition
- Authors: Yuge Huang, Yuhan Wang, Ying Tai, Xiaoming Liu, Pengcheng Shen,
Shaoxin Li, Jilin Li, Feiyue Huang
- Abstract summary: We propose a novel Adaptive Curriculum Learning loss (CurricularFace) that embeds the idea of curriculum learning into the loss function.
Our CurricularFace adaptively adjusts the relative importance of easy and hard samples during different training stages.
- Score: 79.92240030758575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As an emerging topic in face recognition, margin-based loss
functions are designed to increase the feature margin between different
classes for enhanced discriminability. More recently, mining-based strategies
have been adopted to emphasize misclassified samples, achieving promising results.
However, throughout training, prior methods either do not explicitly weight
samples by their importance, leaving hard samples under-exploited, or
emphasize the effects of semi-hard/hard samples even in the early training
stage, which may cause convergence issues. In this work, we propose a novel Adaptive Curriculum
Learning loss (CurricularFace) that embeds the idea of curriculum learning into
the loss function to achieve a novel training strategy for deep face
recognition, which mainly addresses easy samples in the early training stage
and hard ones in the later stage. Specifically, our CurricularFace adaptively
adjusts the relative importance of easy and hard samples during different
training stages. In each stage, samples are assigned different importance
according to their difficulty. Extensive experimental results on popular
benchmarks demonstrate the superiority of our CurricularFace over
state-of-the-art competitors.
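The abstract describes the mechanism only at a high level. Below is a minimal PyTorch sketch of a CurricularFace-style loss under common assumptions: an ArcFace-style additive angular margin m, a logit scale s, and a statistic t maintained as an exponential moving average of the batch's mean positive cosine. Class names, defaults, and the exact update rule are illustrative, not the authors' reference implementation.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveCurriculumLoss(nn.Module):
    """Sketch of a CurricularFace-style loss: easy negatives are left
    untouched, while hard negatives (those whose cosine exceeds the
    margined positive logit) are re-weighted by a factor that grows
    with the EMA statistic t, so hard samples gain importance as
    training progresses."""

    def __init__(self, in_features, num_classes, m=0.5, s=64.0, alpha=0.99):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, in_features))
        nn.init.xavier_uniform_(self.weight)
        self.m, self.s, self.alpha = m, s, alpha
        self.register_buffer("t", torch.zeros(1))  # EMA of positive cosines

    def forward(self, embeddings, labels):
        # Cosine similarity between normalized embeddings and class weights.
        cos = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        cos_y = cos.gather(1, labels.view(-1, 1))           # positive cosine
        sin_y = torch.sqrt((1.0 - cos_y.pow(2)).clamp(min=1e-12))
        # cos(theta_y + m): additive angular margin on the positive logit.
        cos_y_m = cos_y * math.cos(self.m) - sin_y * math.sin(self.m)

        # Update t with an exponential moving average of the batch's mean
        # positive cosine; t rises as the model improves.
        with torch.no_grad():
            self.t = self.alpha * self.t + (1 - self.alpha) * cos_y.mean()

        # Negatives harder than the margined positive get modulated.
        hard = cos > cos_y_m
        cos = torch.where(hard, cos * (self.t + cos), cos)
        logits = cos.scatter(1, labels.view(-1, 1), cos_y_m)
        return F.cross_entropy(self.s * logits, labels)
```

Early in training t is near zero, so a hard negative's modulated logit cos(theta) * (t + cos(theta)) is smaller than cos(theta) and the sample is de-emphasized; as t grows, the factor exceeds one and hard negatives increasingly dominate the loss, matching the easy-to-hard schedule the abstract describes.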
Related papers
- Preview-based Category Contrastive Learning for Knowledge Distillation [53.551002781828146]
We propose a novel preview-based category contrastive learning method for knowledge distillation (PCKD).
It first distills the structural knowledge of both instance-level feature correspondence and the relation between instance features and category centers.
It can explicitly optimize the category representation and explore the distinct correlation between representations of instances and categories.
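As a concrete illustration of the instance-to-category term, the snippet below sketches a generic category-contrastive loss, an InfoNCE-style softmax over feature-to-center similarities. It is a simplification under assumed inputs, not the full PCKD objective.

```python
import torch
import torch.nn.functional as F

def category_contrastive_loss(features, centers, labels, tau=0.1):
    """Generic category-contrastive term: each instance feature is pulled
    toward its own category center and pushed away from the others via a
    softmax over feature-center similarities."""
    feats = F.normalize(features, dim=1)   # (B, D) instance features
    cents = F.normalize(centers, dim=1)    # (C, D) category centers
    logits = feats @ cents.t() / tau       # (B, C) scaled similarities
    return F.cross_entropy(logits, labels)
```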
arXiv Detail & Related papers (2024-10-18T03:31:00Z)
- Fair Few-shot Learning with Auxiliary Sets [53.30014767684218]
In many machine learning (ML) tasks, only very few labeled data samples can be collected, which can lead to inferior fairness performance.
In this paper, we define the fairness-aware learning task with limited training samples as the fair few-shot learning problem.
We devise a novel framework that accumulates fairness-aware knowledge across different meta-training tasks and then generalizes the learned knowledge to meta-test tasks.
arXiv Detail & Related papers (2023-08-28T06:31:37Z)
- DiscrimLoss: A Universal Loss for Hard Samples and Incorrect Samples Discrimination [28.599571524763785]
Given data with label noise (i.e., incorrectly labeled data), deep neural networks gradually memorize the label noise, impairing model performance.
To mitigate this issue, curriculum learning improves model performance and generalization by ordering training samples in a meaningful sequence.
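As a minimal illustration of that ordering idea (not DiscrimLoss itself), the sketch below ranks samples by a difficulty proxy, here per-sample loss from a warm-up pass, and releases them to the trainer in easy-to-hard stages; the staging scheme and proxy are assumptions.

```python
import numpy as np

def easy_to_hard_curriculum(per_sample_losses, num_stages=5):
    """Generic curriculum schedule: sort sample indices by a difficulty
    proxy (ascending loss = easy first) and grow the training pool
    stage by stage."""
    order = np.argsort(per_sample_losses)          # easy samples first
    buckets = np.array_split(order, num_stages)
    curriculum = []
    for k in range(num_stages):
        # Stage k trains on the union of the k+1 easiest buckets.
        curriculum.append(np.concatenate(buckets[: k + 1]))
    return curriculum
```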
arXiv Detail & Related papers (2022-08-21T13:38:55Z)
- A Framework using Contrastive Learning for Classification with Noisy Labels [1.2891210250935146]
We propose a framework using contrastive learning as a pre-training task to perform image classification in the presence of noisy labels.
Recent strategies such as pseudo-labeling, sample selection with Gaussian Mixture models, and weighted supervised contrastive learning are combined in a fine-tuning phase that follows the pre-training.
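The Gaussian Mixture selection step is a common pattern in this line of work; the sketch below shows one typical form, fitting a two-component mixture to per-sample losses and keeping the low-loss ("clean") component. The threshold and component choice are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_clean_samples(per_sample_losses, threshold=0.5):
    """Fit a two-component GMM to the loss distribution and keep samples
    whose posterior probability of belonging to the low-loss (clean)
    component exceeds the threshold."""
    losses = np.asarray(per_sample_losses).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    clean_component = np.argmin(gmm.means_.ravel())   # low-loss mode
    p_clean = gmm.predict_proba(losses)[:, clean_component]
    return p_clean > threshold
```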
arXiv Detail & Related papers (2021-04-19T18:51:22Z)
- Cross-Domain Similarity Learning for Face Recognition in Unseen Domains [90.35908506994365]
We introduce a novel cross-domain metric learning loss, which we dub Cross-Domain Triplet (CDT) loss, to improve face recognition in unseen domains.
The CDT loss encourages learning semantically meaningful features by enforcing compact feature clusters of identities from one domain.
Our method does not require careful hard-pair sample mining and filtering strategy during training.
arXiv Detail & Related papers (2021-03-12T19:48:01Z)
- Dynamic Sampling for Deep Metric Learning [7.010669841466896]
Deep metric learning maps visually similar images to nearby locations and pushes visually dissimilar images apart in an embedding manifold.
A dynamic sampling strategy is proposed to organize the training pairs in an easy-to-hard order to feed into the network.
It allows the network to learn general boundaries between categories from the easy training pairs in its early stages and to refine the model's details using the hard training pairs in the later stages.
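A minimal sketch of such easy-to-hard pair ordering, assuming precomputed embeddings and candidate pairs (the distance-based scoring rule is a generic choice, not necessarily the paper's):

```python
import torch

def rank_pairs_easy_to_hard(embeddings, pair_idx, pair_labels):
    """Score candidate pairs by embedding distance and return them easiest
    first: positive pairs are easy when close, negative pairs are easy
    when far apart."""
    a, b = pair_idx[:, 0], pair_idx[:, 1]
    dist = (embeddings[a] - embeddings[b]).norm(dim=1)
    # Difficulty: distance for positives, negated distance for negatives.
    difficulty = torch.where(pair_labels.bool(), dist, -dist)
    return pair_idx[torch.argsort(difficulty)]
```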
arXiv Detail & Related papers (2020-04-24T09:47:23Z)
- Towards Universal Representation Learning for Deep Face Recognition [106.21744671876704]
We propose a universal representation learning framework that can deal with large variations unseen in the given training data without leveraging target domain knowledge.
Experiments show that our method achieves top performance on general face recognition datasets such as LFW and MegaFace.
arXiv Detail & Related papers (2020-02-26T23:29:57Z)
- Improving Face Recognition from Hard Samples via Distribution Distillation Loss [131.61036519863856]
Large facial variations are the main challenge in face recognition.
We propose a novel Distribution Distillation Loss to narrow the performance gap between easy and hard samples.
We have conducted extensive experiments on both generic large-scale face benchmarks and benchmarks with diverse variations in race, resolution, and pose.
arXiv Detail & Related papers (2020-02-10T11:25:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.