Loss Function Search for Face Recognition
- URL: http://arxiv.org/abs/2007.06542v1
- Date: Fri, 10 Jul 2020 03:40:10 GMT
- Title: Loss Function Search for Face Recognition
- Authors: Xiaobo Wang, Shuo Wang, Cheng Chi, Shifeng Zhang, Tao Mei
- Abstract summary: We develop a reward-guided search method to automatically obtain the best candidate.
Experimental results on a variety of face recognition benchmarks have demonstrated the effectiveness of our method.
- Score: 75.79325080027908
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In face recognition, designing margin-based (e.g., angular, additive,
additive angular margins) softmax loss functions plays an important role in
learning discriminative features. However, these hand-crafted heuristic methods
are sub-optimal because they require much effort to explore the large design
space. Recently, an AutoML for loss function search method AM-LFS has been
derived, which leverages reinforcement learning to search loss functions during
the training process. However, its search space is complex and unstable, which
hinders its effectiveness. In this paper, we first identify that the key to
enhancing feature discrimination is actually \textbf{how to reduce the
softmax probability}. We then design a unified formulation for the current
margin-based softmax losses. Accordingly, we define a novel search space and
develop a reward-guided search method to automatically obtain the best
candidate. Experimental results on a variety of face recognition benchmarks
have demonstrated the effectiveness of our method over the state-of-the-art
alternatives.
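To make the abstract's central observation concrete, the sketch below shows how an additive angular margin (ArcFace-style, one of the hand-crafted margin-based losses the paper generalizes) reduces the target-class softmax probability relative to the plain softmax. The scale `s=64.0` and margin `m=0.5` are illustrative values commonly used in the literature, not the paper's searched parameters; this is a minimal assumption-laden sketch, not the paper's searched loss.

```python
import numpy as np

def softmax_prob(logits, y):
    """Softmax probability assigned to the target class y."""
    e = np.exp(logits - logits.max())  # subtract max for numerical stability
    return e[y] / e.sum()

def margin_softmax_prob(cos_sims, y, s=64.0, m=0.5):
    """ArcFace-style additive angular margin: replace cos(theta_y) with
    cos(theta_y + m) for the target class, which shrinks its logit and
    therefore its softmax probability. `cos_sims` holds cosine similarities
    between the L2-normalized feature and each class weight; s and m are
    illustrative hyperparameter choices, not the paper's searched ones."""
    theta_y = np.arccos(np.clip(cos_sims[y], -1.0, 1.0))
    logits = s * cos_sims.copy()
    logits[y] = s * np.cos(theta_y + m)  # margin applied to target class only
    return softmax_prob(logits, y)

# Toy example: class 0 is the ground truth.
cos = np.array([0.8, 0.3, 0.1])
p_plain = softmax_prob(64.0 * cos, 0)
p_margin = margin_softmax_prob(cos, 0)
assert p_margin < p_plain  # the margin strictly reduces the target probability
```

Because the margin-reduced probability is smaller, the cross-entropy loss stays larger for already-correct samples, forcing the network to keep pushing features toward their class center; the paper's search space can be read as choosing the function that performs this probability reduction.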
Related papers
- SubFace: Learning with Softmax Approximation for Face Recognition [3.262192371833866]
SubFace is a softmax approximation method that employs the subspace feature to promote the performance of face recognition.
Comprehensive experiments conducted on benchmark datasets demonstrate that our method can significantly improve the performance of vanilla CNN baseline.
arXiv Detail & Related papers (2022-08-24T12:31:08Z)
- Learning Towards the Largest Margins [83.7763875464011]
A loss function should promote the largest possible margins for both classes and samples.
Not only does this principled framework offer new perspectives to understand and interpret existing margin-based losses, but it can guide the design of new tools.
arXiv Detail & Related papers (2022-06-23T10:03:03Z)
- MURAL: Meta-Learning Uncertainty-Aware Rewards for Outcome-Driven Reinforcement Learning [65.52675802289775]
We show that an uncertainty-aware classifier can solve challenging reinforcement learning problems.
We propose a novel method for computing the normalized maximum likelihood (NML) distribution.
We show that the resulting algorithm has a number of intriguing connections to both count-based exploration methods and prior algorithms for learning reward functions.
arXiv Detail & Related papers (2021-07-15T08:19:57Z)
- Hierarchical Deep CNN Feature Set-Based Representation Learning for Robust Cross-Resolution Face Recognition [59.29808528182607]
Cross-resolution face recognition (CRFR) is important in intelligent surveillance and biometric forensics.
Existing shallow learning-based and deep learning-based methods focus on mapping the HR-LR face pairs into a joint feature space.
In this study, we aim to fully exploit the multi-level deep convolutional neural network (CNN) feature set for robust CRFR.
arXiv Detail & Related papers (2021-03-25T14:03:42Z)
- Frequency-aware Discriminative Feature Learning Supervised by Single-Center Loss for Face Forgery Detection [89.43987367139724]
Face forgery detection is raising ever-increasing interest in computer vision.
Recent works have achieved sound results, but notable problems remain.
A novel frequency-aware discriminative feature learning framework is proposed in this paper.
arXiv Detail & Related papers (2021-03-16T14:17:17Z)
- Loss Function Discovery for Object Detection via Convergence-Simulation Driven Search [101.73248560009124]
We propose an effective convergence-simulation driven evolutionary search algorithm, CSE-Autoloss, for speeding up the search progress.
We conduct extensive evaluations of loss function search on popular detectors and validate the good generalization capability of searched losses.
Our experiments show that the best-discovered loss function combinations outperform default combinations by 1.1% and 0.8% in terms of mAP for two-stage and one-stage detectors.
arXiv Detail & Related papers (2021-02-09T08:34:52Z)
- More Information Supervised Probabilistic Deep Face Embedding Learning [10.52667214402514]
We analyze margin-based softmax loss from a probability perspective.
An auto-encoder architecture called Linear-Auto-TS-Encoder(LATSE) is proposed to corroborate this finding.
arXiv Detail & Related papers (2020-06-08T12:33:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.