Distilling Image Classifiers in Object Detectors
- URL: http://arxiv.org/abs/2106.05209v1
- Date: Wed, 9 Jun 2021 16:50:10 GMT
- Title: Distilling Image Classifiers in Object Detectors
- Authors: Shuxuan Guo and Jose M. Alvarez and Mathieu Salzmann
- Abstract summary: We study the case of object detection and, instead of following the standard detector-to-detector distillation approach, introduce a classifier-to-detector knowledge transfer framework.
In particular, we propose strategies to exploit the classification teacher to improve both the detector's recognition accuracy and localization performance.
- Score: 81.63849985128527
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge distillation constitutes a simple yet effective way to improve the
performance of a compact student network by exploiting the knowledge of a more
powerful teacher. Nevertheless, the knowledge distillation literature remains
limited to the scenario where the student and the teacher tackle the same task.
Here, we investigate the problem of transferring knowledge not only across
architectures but also across tasks. To this end, we study the case of object
detection and, instead of following the standard detector-to-detector
distillation approach, introduce a classifier-to-detector knowledge transfer
framework. In particular, we propose strategies to exploit the classification
teacher to improve both the detector's recognition accuracy and localization
performance. Our experiments on several detectors with different backbones
demonstrate the effectiveness of our approach, allowing us to outperform the
state-of-the-art detector-to-detector distillation methods.
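As a concrete illustration of the idea, below is a minimal PyTorch sketch of one plausible classifier-to-detector transfer: ground-truth regions are cropped and fed to a frozen classification teacher, and the detector's per-region logits and features are aligned with the teacher's outputs. This is a sketch under assumptions, not the paper's exact losses; the teacher's (logits, features) interface, the `proj` layer, and the shared label space are all hypothetical here.
```python
# Hypothetical sketch only: crop ground-truth regions, query a frozen
# classification teacher on the crops, and align the detector's
# per-region predictions with the teacher's outputs.
import torch
import torch.nn.functional as F
from torchvision.ops import roi_align


def classifier_to_detector_kd(images, gt_boxes, student_logits,
                              student_feats, teacher, proj, T=4.0):
    """Distillation loss from a classification teacher into a detector.

    images:         (B, 3, H, W) input batch
    gt_boxes:       list of (N_i, 4) ground-truth boxes per image, in
                    (x1, y1, x2, y2) image coordinates
    student_logits: (sum N_i, C) detector class logits for those boxes
                    (assumes the teacher shares the same C classes)
    student_feats:  (sum N_i, D_s) detector region features
    teacher:        frozen classifier; assumed to return (logits, features)
    proj:           learned linear layer mapping D_s -> D_t
    """
    # Crop each ground-truth region and resize it to the classifier's
    # expected input resolution.
    crops = roi_align(images, gt_boxes, output_size=(224, 224))
    with torch.no_grad():
        t_logits, t_feats = teacher(crops)

    # Recognition transfer: soft-target KD on the per-region class logits.
    kd_cls = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                      F.softmax(t_logits / T, dim=1),
                      reduction="batchmean") * T * T

    # Feature transfer: match projected student region features to the
    # teacher's embedding of the same crops.
    kd_feat = F.mse_loss(proj(student_feats), t_feats)
    return kd_cls + kd_feat
```
Softening the logits with temperature T exposes the teacher's inter-class similarities, while the feature term gives the student a denser training signal than class labels alone.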
Related papers
- Task Integration Distillation for Object Detectors [2.974025533366946]
We propose a knowledge distillation method that addresses both the classification and regression tasks.
We evaluate the importance of features based on the output of the detector's two sub-tasks.
This effectively avoids biased estimates of what the detector has actually learned.
arXiv Detail & Related papers (2024-04-02T07:08:15Z)
- Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation [29.821082433621868]
We propose Attention-based Feature Distillation (AFD) for object detection.
We introduce a multi-instance attention mechanism that effectively distinguishes between background and foreground elements.
AFD matches the performance of other state-of-the-art models while remaining efficient.
arXiv Detail & Related papers (2023-10-28T11:15:37Z)
- Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation [56.053397775016755]
We propose a sequential approach to knowledge distillation that progressively transfers the knowledge of a set of teacher detectors to a given lightweight student.
To the best of our knowledge, we are the first to successfully distill knowledge from Transformer-based teacher detectors to convolution-based students.
arXiv Detail & Related papers (2023-08-17T17:17:08Z)
- Exploring Inconsistent Knowledge Distillation for Object Detection with Data Augmentation [66.25738680429463]
Knowledge Distillation (KD) for object detection aims to train a compact detector by transferring knowledge from a teacher model.
We propose inconsistent knowledge distillation (IKD) which aims to distill knowledge inherent in the teacher model's counter-intuitive perceptions.
Our method outperforms state-of-the-art KD baselines on one-stage, two-stage and anchor-free object detectors.
arXiv Detail & Related papers (2022-09-20T16:36:28Z)
- Open-Vocabulary One-Stage Detection with Hierarchical Visual-Language Knowledge Distillation [36.79599282372021]
We propose a hierarchical visual-language knowledge distillation method, i.e., HierKD, for open-vocabulary one-stage detection.
Our method significantly surpasses the previous best one-stage detector, with gains of 11.9% and 6.7% in $AP_50$.
arXiv Detail & Related papers (2022-03-20T16:31:49Z)
- Response-based Distillation for Incremental Object Detection [2.337183337110597]
Traditional object detectors are ill-equipped for incremental learning.
Fine-tuning a well-trained detection model directly on new data alone leads to catastrophic forgetting.
We propose a fully response-based incremental distillation method focusing on learning responses from detection bounding boxes and classification predictions.
arXiv Detail & Related papers (2021-10-26T08:07:55Z)
- Label Assignment Distillation for Object Detection [0.0]
We propose a simple but effective knowledge distillation approach focused on label assignment in object detection.
Our method shows encouraging results on the MSCOCO 2017 benchmark.
arXiv Detail & Related papers (2021-09-16T10:11:58Z)
- Distilling Knowledge via Knowledge Review [69.15050871776552]
We study connection paths across levels between teacher and student networks and reveal their great importance.
For the first time in knowledge distillation, cross-stage connection paths are proposed.
The resulting nested and compact framework requires negligible overhead and outperforms other methods on a variety of tasks.
arXiv Detail & Related papers (2021-04-19T04:36:24Z)
- Robust and Accurate Object Detection via Adversarial Learning [111.36192453882195]
This work augments the fine-tuning stage for object detectors by exploring adversarial examples.
Our approach boosts the performance of state-of-the-art EfficientDets by +1.1 mAP on the object detection benchmark.
arXiv Detail & Related papers (2021-03-23T19:45:26Z)
- Knowledge Distillation Meets Self-Supervision [109.6400639148393]
Knowledge distillation involves extracting "dark knowledge" from a teacher network to guide the learning of a student network; a minimal sketch of this standard soft-target loss follows the list.
We show that the seemingly different self-supervision task can serve as a simple yet powerful solution.
By exploiting the similarity between those self-supervision signals as an auxiliary task, one can effectively transfer the hidden information from the teacher to the student.
arXiv Detail & Related papers (2020-06-12T12:18:52Z)
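The "dark knowledge" referred to in the last entry is conventionally transferred with the soft-target loss of Hinton et al. (2015). Below is a minimal, self-contained sketch assuming generic classifier logits, not any specific paper's variant:
```python
# Minimal sketch of standard soft-target knowledge distillation
# (Hinton et al., 2015). Generic classifier logits are assumed.
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened
    # distributions, scaled by T^2 so gradients keep a comparable scale.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    # Hard targets: ordinary cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```
With T typically between 2 and 5, the soft term carries the inter-class similarity structure that hard labels discard, which is the baseline most of the detector-specific methods above extend.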