Latent Distribution Adjusting for Face Anti-Spoofing
- URL: http://arxiv.org/abs/2305.09285v1
- Date: Tue, 16 May 2023 08:43:14 GMT
- Title: Latent Distribution Adjusting for Face Anti-Spoofing
- Authors: Qinghong Sun, Zhenfei Yin, Yichao Wu, Yuanhan Zhang, Jing Shao
- Abstract summary: We propose a unified framework called Latent Distribution Adjusting (LDA) to improve the robustness of the face anti-spoofing (FAS) model.
To enhance the intra-class compactness and inter-class discrepancy, we propose a margin-based loss that provides distribution constraints for prototype learning.
Our framework can 1) make the final representation space both intra-class compact and inter-class separable, 2) outperform the state-of-the-art methods on multiple standard FAS benchmarks.
- Score: 29.204168516602568
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the development of deep learning, the field of face anti-spoofing (FAS)
has witnessed great progress. FAS is usually considered a classification
problem, where each class is assumed to contain a single cluster optimized by
softmax loss. In practical deployment, one class can contain several local
clusters, and a single center is insufficient to capture the inherent structure
of the FAS data. However, few approaches consider large distribution
discrepancies in the field of FAS. In this work, we propose a unified framework
called Latent Distribution Adjusting (LDA), with the properties of being latent,
discriminative, adaptive, and generic, to improve the robustness of the FAS model
by adjusting complex data distributions with multiple prototypes. 1) Latent. LDA
attempts to model the data of each class as a Gaussian mixture distribution,
and acquire a flexible number of centers for each class in the last fully
connected layer implicitly. 2) Discriminative. To enhance the intra-class
compactness and inter-class discrepancy, we propose a margin-based loss for
providing distribution constraints for prototype learning. 3) Adaptive. To make
LDA more efficient and decrease redundant parameters, we propose Adaptive
Prototype Selection (APS) by selecting the appropriate number of centers
adaptively according to different distributions. 4) Generic. Furthermore, LDA
can adapt to unseen distributions by utilizing very few training samples without
re-training. Extensive experiments demonstrate that our framework can 1) make
the final representation space both intra-class compact and inter-class
separable, 2) outperform the state-of-the-art methods on multiple standard FAS
benchmarks.
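The core idea in the abstract, representing each class by several local prototypes in the final layer and training with a margin-based softmax, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the cosine-similarity scoring, the max-over-prototypes aggregation, and the `margin`/`scale` values are all choices made for the sketch.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def class_logits(feature, prototypes, true_class=None, margin=0.2, scale=16.0):
    """Score each class by its best-matching local prototype (max over that
    class's centers). Subtracting a margin from the ground-truth class makes
    training harder, encouraging intra-class compactness and wider
    inter-class gaps."""
    logits = []
    for c, protos in enumerate(prototypes):
        sim = max(cosine(feature, p) for p in protos)
        if c == true_class:
            sim -= margin  # margin penalty, applied only during training
        logits.append(scale * sim)
    return logits

def margin_softmax_loss(feature, prototypes, true_class, margin=0.2, scale=16.0):
    """Cross-entropy over the margin-adjusted multi-prototype logits."""
    logits = class_logits(feature, prototypes, true_class, margin, scale)
    peak = max(logits)  # subtract the max to stabilize the softmax
    exps = [math.exp(z - peak) for z in logits]
    return -math.log(exps[true_class] / sum(exps))

# Toy setup: two classes (live, spoof), each modeled by two local centers,
# mirroring the observation that one class can contain several clusters.
prototypes = [
    [[1.0, 0.0], [0.9, 0.1]],    # class 0 (live): two local clusters
    [[0.0, 1.0], [-0.1, 0.9]],   # class 1 (spoof): two local clusters
]
feature = [1.0, 0.05]  # embedding lying near the live clusters
loss_live = margin_softmax_loss(feature, prototypes, true_class=0)
loss_spoof = margin_softmax_loss(feature, prototypes, true_class=1)
```

A feature near a live cluster yields a small loss when labeled live and a large one when labeled spoof, so gradients pull each sample toward its nearest same-class prototype rather than toward a single class center.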
Related papers
- Adaptive Margin Global Classifier for Exemplar-Free Class-Incremental Learning [3.4069627091757178]
Existing methods mainly focus on handling biased learning.
We introduce a Distribution-Based Global Classifier (DBGC) to avoid bias factors in existing methods, such as data imbalance and sampling.
More importantly, the compromised distributions of old classes are simulated via a simple operation, variance (VE).
This loss is proven equivalent to an Adaptive Margin Softmax Cross Entropy (AMarX).
arXiv Detail & Related papers (2024-09-20T07:07:23Z)
- Prototype Fission: Closing Set for Robust Open-set Semi-supervised Learning [6.645479471664253]
Semi-supervised Learning (SSL) has been proven vulnerable to out-of-distribution (OOD) samples in realistic large-scale unsupervised datasets.
We propose Prototype Fission (PF) to divide class-wise latent spaces into compact sub-spaces by automatic fine-grained latent space mining.
arXiv Detail & Related papers (2023-08-29T19:04:42Z)
- Balanced Classification: A Unified Framework for Long-Tailed Object Detection [74.94216414011326]
Conventional detectors suffer from performance degradation when dealing with long-tailed data due to a classification bias towards the majority head categories.
We introduce a unified framework called BAlanced CLassification (BACL), which enables adaptive rectification of inequalities caused by disparities in category distribution.
BACL consistently achieves performance improvements across various datasets with different backbones and architectures.
arXiv Detail & Related papers (2023-08-04T09:11:07Z)
- Learnable Distribution Calibration for Few-Shot Class-Incremental Learning [122.2241120474278]
Few-shot class-incremental learning (FSCIL) faces challenges of memorizing old class distributions and estimating new class distributions given few training samples.
We propose a learnable distribution calibration (LDC) approach, with the aim to systematically solve these two challenges using a unified framework.
arXiv Detail & Related papers (2022-10-01T09:40:26Z)
- Powering Finetuning in Few-shot Learning: Domain-Agnostic Feature Adaptation with Rectified Class Prototypes [32.622613524622075]
Finetuning is designed to focus on reducing biases in novel-class feature distributions.
By powering finetuning with DCM and SS, we achieve state-of-the-art results on Meta-Dataset.
arXiv Detail & Related papers (2022-04-07T21:29:12Z)
- BMD: A General Class-balanced Multicentric Dynamic Prototype Strategy for Source-free Domain Adaptation [74.93176783541332]
Source-free Domain Adaptation (SFDA) aims to adapt a pre-trained source model to the unlabeled target domain without accessing the well-labeled source data.
To make up for the absence of source data, most existing methods introduce feature-prototype-based pseudo-labeling strategies.
We propose a general class-Balanced Multicentric Dynamic prototype strategy for the SFDA task.
arXiv Detail & Related papers (2022-04-06T13:23:02Z)
- Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of SSL and DA.
arXiv Detail & Related papers (2021-12-12T06:11:16Z)
- Self-Weighted Robust LDA for Multiclass Classification with Edge Classes [111.5515086563592]
A novel self-weighted robust LDA with an l21-norm-based between-class distance criterion, called SWRLDA, is proposed for multi-class classification.
The proposed SWRLDA is easy to implement and converges fast in practice.
arXiv Detail & Related papers (2020-09-24T12:32:55Z)
- Generalized Zero-Shot Learning Via Over-Complete Distribution [79.5140590952889]
We propose to generate an Over-Complete Distribution (OCD) using a Conditional Variational Autoencoder (CVAE) for both seen and unseen classes.
The effectiveness of the framework is evaluated using both Zero-Shot Learning and Generalized Zero-Shot Learning protocols.
arXiv Detail & Related papers (2020-04-01T19:05:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.