Anomaly Detection via Reverse Distillation from One-Class Embedding
- URL: http://arxiv.org/abs/2201.10703v1
- Date: Wed, 26 Jan 2022 01:48:37 GMT
- Title: Anomaly Detection via Reverse Distillation from One-Class Embedding
- Authors: Hanqiu Deng, Xingyu Li
- Abstract summary: We propose a novel T-S model consisting of a teacher encoder and a student decoder.
Instead of receiving raw images directly, the student network takes the teacher model's one-class embedding as input.
In addition, we introduce a trainable one-class bottleneck embedding module in our T-S model.
- Score: 2.715884199292287
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge distillation (KD) achieves promising results on the challenging
problem of unsupervised anomaly detection (AD). The representation discrepancy
of anomalies in the teacher-student (T-S) model provides essential evidence for
AD. However, using similar or identical architectures to build the teacher and
student models in previous studies hinders the diversity of anomalous
representations. To tackle this problem, we propose a novel T-S model
consisting of a teacher encoder and a student decoder and introduce a simple
yet effective "reverse distillation" paradigm accordingly. Instead of receiving
raw images directly, the student network takes the teacher model's one-class
embedding as input and aims to restore the teacher's multiscale
representations. Inherently, knowledge distillation in this study proceeds from
abstract, high-level representations to low-level features. In addition, we
introduce a trainable one-class bottleneck embedding (OCBE) module in our T-S
model. The obtained compact embedding effectively preserves essential
information on normal patterns, but abandons anomaly perturbations. Extensive
experimentation on AD and one-class novelty detection benchmarks shows that our
method surpasses SOTA performance, demonstrating our proposed approach's
effectiveness and generalizability.
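
The sketch below illustrates the reverse-distillation idea described in the abstract: a frozen, pretrained teacher encoder extracts multiscale features, a compact bottleneck module (standing in for the OCBE) condenses them into a one-class embedding, and a trainable student decoder tries to restore the teacher's features from that embedding, with the teacher-student discrepancy serving as the anomaly signal. The backbone choice, channel sizes, and cosine-distance loss here are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of reverse distillation (assumptions: ResNet-18 teacher,
# simple conv bottleneck, transposed-conv student, cosine-distance loss).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18

class TeacherEncoder(nn.Module):
    """Frozen, ImageNet-pretrained backbone returning multiscale features."""
    def __init__(self):
        super().__init__()
        net = resnet18(weights="IMAGENET1K_V1")
        self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool)
        self.layer1, self.layer2, self.layer3 = net.layer1, net.layer2, net.layer3
        for p in self.parameters():
            p.requires_grad_(False)

    def forward(self, x):
        x = self.stem(x)
        f1 = self.layer1(x)   # 64 channels
        f2 = self.layer2(f1)  # 128 channels
        f3 = self.layer3(f2)  # 256 channels
        return [f1, f2, f3]

class Bottleneck(nn.Module):
    """Stand-in for the one-class bottleneck embedding (OCBE) module."""
    def __init__(self, in_ch=256, emb_ch=256):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Conv2d(in_ch, emb_ch, 3, stride=2, padding=1),
            nn.BatchNorm2d(emb_ch), nn.ReLU(inplace=True))

    def forward(self, feats):
        return self.proj(feats[-1])  # compact one-class embedding

class StudentDecoder(nn.Module):
    """Restores the teacher's multiscale features from the embedding."""
    def __init__(self, emb_ch=256):
        super().__init__()
        def up(cin, cout):
            return nn.Sequential(
                nn.ConvTranspose2d(cin, cout, 2, stride=2),
                nn.BatchNorm2d(cout), nn.ReLU(inplace=True))
        self.up3 = up(emb_ch, 256)  # -> matches teacher f3
        self.up2 = up(256, 128)     # -> matches teacher f2
        self.up1 = up(128, 64)      # -> matches teacher f1

    def forward(self, z):
        d3 = self.up3(z)
        d2 = self.up2(d3)
        d1 = self.up1(d2)
        return [d1, d2, d3]

def distillation_loss(t_feats, s_feats):
    """1 - cosine similarity, averaged over locations and scales."""
    loss = 0.0
    for t, s in zip(t_feats, s_feats):
        loss += (1 - F.cosine_similarity(t, s, dim=1)).mean()
    return loss / len(t_feats)

def anomaly_map(t_feats, s_feats, size=(256, 256)):
    """Per-pixel teacher-student discrepancy; larger values => more anomalous."""
    maps = []
    for t, s in zip(t_feats, s_feats):
        m = 1 - F.cosine_similarity(t, s, dim=1, eps=1e-6)       # (B, H, W)
        maps.append(F.interpolate(m.unsqueeze(1), size=size,
                                  mode="bilinear", align_corners=False))
    return torch.stack(maps).mean(0)
```

In this reading, training on normal data only optimizes the bottleneck and student to minimize distillation_loss, so the student learns to reproduce the teacher's features for normal patterns; at test time, regions where the restoration fails (high values in anomaly_map) are flagged as anomalous.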