Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
- URL: http://arxiv.org/abs/2409.02049v1
- Date: Tue, 3 Sep 2024 16:53:34 GMT
- Title: Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
- Authors: Ruixin Shi, Weijia Guo, Shiming Ge
- Abstract summary: Low-resolution face recognition is a challenging task due to the loss of informative details.
Recent approaches have proven that high-resolution clues can well guide low-resolution face recognition via proper knowledge transfer.
We propose an adaptable instance-relation distillation approach to facilitate low-resolution face recognition.
- Score: 18.709870458307574
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low-resolution face recognition is a challenging task due to the loss of informative details. Recent approaches based on knowledge distillation have proven that high-resolution clues can effectively guide low-resolution face recognition via proper knowledge transfer. However, due to the distribution difference between training and testing faces, the learned models often suffer from poor adaptability. To address this, we split the knowledge transfer process into distillation and adaptation steps, and propose an adaptable instance-relation distillation approach to facilitate low-resolution face recognition. In this approach, the student distills knowledge from a high-resolution teacher at both the instance level and the relation level, providing sufficient cross-resolution knowledge transfer. The learned student then adapts to recognizing low-resolution faces via adaptive batch normalization at inference time. In this manner, the capability to recover the missing details of familiar low-resolution faces is effectively enhanced, leading to better knowledge transfer. Extensive experiments on low-resolution face recognition clearly demonstrate the effectiveness and adaptability of our approach.
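The abstract does not give implementation details, so the following is only a minimal PyTorch sketch, under stated assumptions, of the two ingredients it names: a distillation loss combining instance-level feature alignment with relation-level (pairwise similarity) alignment, and adaptive batch normalization that re-estimates BN statistics on target low-resolution faces before inference. The function names, the cosine/MSE loss forms, and the weight `lambda_rel` are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def instance_relation_distillation_loss(student_feats, teacher_feats, lambda_rel=1.0):
    """Hypothetical combined loss: instance-level feature alignment plus
    relation-level similarity-matrix alignment (assumed forms, not the paper's exact losses)."""
    s = F.normalize(student_feats, dim=1)  # student embeddings, shape (B, D)
    t = F.normalize(teacher_feats, dim=1)  # teacher embeddings, shape (B, D)

    # Instance level: pull each student embedding toward its teacher counterpart.
    loss_instance = (1.0 - (s * t).sum(dim=1)).mean()  # mean cosine distance

    # Relation level: match the pairwise similarity structure within the batch.
    rel_s = s @ s.t()  # (B, B) student relation matrix
    rel_t = t @ t.t()  # (B, B) teacher relation matrix
    loss_relation = F.mse_loss(rel_s, rel_t)

    return loss_instance + lambda_rel * loss_relation


@torch.no_grad()
def adapt_batchnorm(model, target_loader, device="cpu"):
    """Adaptive BN: re-estimate BatchNorm running statistics on unlabeled
    low-resolution target data while all learned weights stay frozen."""
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d)):
            m.reset_running_stats()
            m.momentum = None  # None -> cumulative moving average over all batches
    model.train()  # BN layers only update running stats in train mode
    for batch in target_loader:
        images = batch[0] if isinstance(batch, (list, tuple)) else batch
        model(images.to(device))
    model.eval()
```

In such a setup, the student would be trained with this loss against a frozen high-resolution teacher, and `adapt_batchnorm` would be run once on the target low-resolution data before evaluation.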
Related papers
- Efficient Low-Resolution Face Recognition via Bridge Distillation [40.823152928253776]
We propose a bridge distillation approach to turn a complex face model pretrained on private high-resolution faces into a light-weight one for low-resolution face recognition.
Experimental results show that the student model performs impressively in recognizing low-resolution faces with only 0.21M parameters and 0.057MB memory.
arXiv Detail & Related papers (2024-09-18T08:10:35Z) - Distilling Generative-Discriminative Representations for Very Low-Resolution Face Recognition [19.634712802639356]
Very low-resolution face recognition is challenging due to the loss of informative facial details caused by resolution degradation.
We propose a generative-discriminative representation distillation approach that combines generative representation with cross-resolution aligned knowledge distillation.
Our approach improves the recovery of the missing details in very low-resolution faces and achieves better knowledge transfer.
arXiv Detail & Related papers (2024-09-10T09:53:06Z) - Look One and More: Distilling Hybrid Order Relational Knowledge for Cross-Resolution Image Recognition [30.568519905346253]
We propose a teacher-student learning approach to facilitate low-resolution image recognition via hybrid order relational knowledge distillation.
The approach comprises three streams: the teacher stream is pretrained to recognize high-resolution images with high accuracy, the student stream is learned to identify low-resolution images by mimicking the teacher's behaviors, and an extra assistant stream is introduced as a bridge to help transfer knowledge from the teacher to the student.
arXiv Detail & Related papers (2024-09-09T07:32:18Z) - Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation [22.26932361388872]
We propose a cross-resolution relational contrastive distillation approach to facilitate low-resolution object recognition.
Our approach enables the student model to mimic the behavior of a well-trained teacher model.
In this manner, the capability of recovering missing details of familiar low-resolution objects can be effectively enhanced.
arXiv Detail & Related papers (2024-09-04T09:21:13Z) - Collaborative Knowledge Infusion for Low-resource Stance Detection [83.88515573352795]
Target-related knowledge is often needed to assist stance detection models.
We propose a collaborative knowledge infusion approach for low-resource stance detection tasks.
arXiv Detail & Related papers (2024-03-28T08:32:14Z) - One-stage Low-resolution Text Recognition with High-resolution Knowledge Transfer [53.02254290682613]
Current solutions for low-resolution text recognition typically rely on a two-stage pipeline.
We propose an efficient and effective knowledge distillation framework to achieve multi-level knowledge transfer.
Experiments show that the proposed one-stage pipeline significantly outperforms super-resolution based two-stage frameworks.
arXiv Detail & Related papers (2023-08-05T02:33:45Z) - Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution [82.89021683451432]
We propose a model-agnostic meta knowledge distillation method under the teacher-student architecture for the single image super-resolution task.
Experiments conducted on various single image super-resolution datasets demonstrate that our proposed method outperforms existing distillation methods that rely on predefined knowledge representations.
arXiv Detail & Related papers (2022-07-18T02:41:04Z) - Parameter-Efficient and Student-Friendly Knowledge Distillation [83.56365548607863]
We present a parameter-efficient and student-friendly knowledge distillation method, namely PESF-KD, to achieve efficient and sufficient knowledge transfer.
Experiments on a variety of benchmarks show that PESF-KD can significantly reduce the training cost while obtaining competitive results compared to advanced online distillation methods.
arXiv Detail & Related papers (2022-05-28T16:11:49Z) - Multi-Scale Aligned Distillation for Low-Resolution Detection [68.96325141432078]
This paper focuses on boosting the performance of low-resolution models by distilling knowledge from a high- or multi-resolution model.
On several instance-level detection tasks and datasets, the low-resolution models trained via our approach perform competitively with high-resolution models trained via conventional multi-scale training.
arXiv Detail & Related papers (2021-09-14T12:53:35Z) - Transfer Heterogeneous Knowledge Among Peer-to-Peer Teammates: A Model Distillation Approach [55.83558520598304]
We propose a brand new solution to reuse experiences and transfer value functions among multiple students via model distillation.
We also describe how to design an efficient communication protocol to exploit heterogeneous knowledge.
Our proposed framework, namely Learning and Teaching Categorical Reinforcement, shows promising performance in stabilizing and accelerating the learning progress.
arXiv Detail & Related papers (2020-02-06T11:31:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.