New Perspective on Progressive GANs Distillation for One-class Novelty
Detection
- URL: http://arxiv.org/abs/2109.07295v3
- Date: Fri, 30 Jun 2023 15:32:05 GMT
- Title: New Perspective on Progressive GANs Distillation for One-class Novelty
Detection
- Authors: Zhiwei Zhang, Yu Dong, Hanyu Peng, Shifeng Chen
- Abstract summary: The Generative Adversarial Network based on the Encoder-Decoder-Encoder scheme (EDE-GAN) achieves state-of-the-art performance.
A new technique, Progressive Knowledge Distillation with GANs (P-KDGAN), connects two standard GANs through a designed distillation loss.
Two-step progressive learning continuously improves the performance of the student GAN, with better results than the single-step approach.
- Score: 21.90786581579228
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One-class novelty detection is conducted to identify anomalous instances,
with different distributions from the expected normal instances. In this paper,
the Generative Adversarial Network based on the Encoder-Decoder-Encoder scheme
(EDE-GAN) achieves state-of-the-art performance. Two factors serve this
purpose: 1) the EDE-GAN computes the distance between two latent vectors as
the anomaly score, unlike previous methods that rely on the reconstruction
error between images; 2) the model obtains its best results when the batch
size is set to 1. To illustrate their superiority, we design a new GAN
architecture and compare performance across different batch sizes. Moreover,
our experiments provide evidence of how beneficial a constraint on the latent
space is during model training. To learn compact and fast models, we present
a new technique, Progressive Knowledge Distillation with GANs (P-KDGAN),
which connects two standard GANs through a designed distillation loss.
Two-step progressive learning continuously improves the performance of the
student GAN over the single-step approach. Our experimental
results on CIFAR-10, MNIST, and FMNIST datasets illustrate that P-KDGAN
improves the performance of the student GAN by 2.44%, 1.77%, and 1.73% when
compressing the computation at ratios of 24.45:1, 311.11:1, and 700:1,
respectively.
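The two ingredients highlighted in the abstract, a latent-distance anomaly score and a distillation loss linking teacher and student GANs, can be sketched as follows. This is a minimal illustration over hypothetical latent codes and feature maps, not the paper's actual implementation:

```python
import numpy as np

def anomaly_score(z_input, z_reconstructed):
    """EDE-GAN-style score: Euclidean distance between the latent code of
    the input (first encoder) and of its reconstruction (second encoder).
    A larger distance indicates a more anomalous instance."""
    z1 = np.asarray(z_input, dtype=float)
    z2 = np.asarray(z_reconstructed, dtype=float)
    return float(np.linalg.norm(z1 - z2))

def distillation_loss(student_feats, teacher_feats):
    """Feature-matching distillation (schematic form): mean squared error
    between corresponding student and teacher activations. The actual
    P-KDGAN loss is defined over its two GANs across two progressive steps."""
    return float(np.mean([np.mean((np.asarray(s) - np.asarray(t)) ** 2)
                          for s, t in zip(student_feats, teacher_feats)]))

# Toy usage: a faithfully re-encoded (normal) sample scores lower than an
# anomalous one whose latent code drifts after re-encoding.
z_in = np.zeros(8)
print(anomaly_score(z_in, np.full(8, 0.1))
      < anomaly_score(z_in, np.full(8, 2.0)))  # True
```

The design point the abstract stresses is that the score is computed in latent space rather than pixel space, which is why the two-encoder structure matters.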
Related papers
- Optimizing OOD Detection in Molecular Graphs: A Novel Approach with Diffusion Models [71.39421638547164]
We propose to detect OOD molecules by adopting an auxiliary diffusion model-based framework, which compares similarities between input molecules and reconstructed graphs.
Due to the generative bias towards reconstructing ID training samples, the similarity scores of OOD molecules will be much lower, which facilitates detection.
Our research pioneers an approach of Prototypical Graph Reconstruction for Molecular OOD Detection, dubbed as PGR-MOOD and hinges on three innovations.
arXiv Detail & Related papers (2024-04-24T03:25:53Z) - Attention based Dual-Branch Complex Feature Fusion Network for Hyperspectral Image Classification [1.3249509346606658]
The proposed model is evaluated on the Pavia University and Salinas datasets.
Results show that the proposed model outperforms state-of-the-art methods in terms of overall accuracy, average accuracy, and Kappa.
arXiv Detail & Related papers (2023-11-02T22:31:24Z) - One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation [69.65734716679925]
Knowledge distillation has proven to be a highly effective approach for enhancing model performance through a teacher-student training scheme.
Most existing distillation methods are designed under the assumption that the teacher and student models belong to the same model family.
We propose a simple yet effective one-for-all KD framework called OFA-KD, which significantly improves the distillation performance between heterogeneous architectures.
arXiv Detail & Related papers (2023-10-30T11:13:02Z) - DuDGAN: Improving Class-Conditional GANs via Dual-Diffusion [2.458437232470188]
Class-conditional image generation using generative adversarial networks (GANs) has been investigated through various techniques.
We propose a novel approach for class-conditional image generation using GANs called DuDGAN, which incorporates a dual diffusion-based noise injection process.
Our method outperforms state-of-the-art conditional GAN models for image generation in terms of performance.
arXiv Detail & Related papers (2023-05-24T07:59:44Z) - Revisiting the Evaluation of Image Synthesis with GANs [55.72247435112475]
This study presents an empirical investigation into the evaluation of synthesis performance, with generative adversarial networks (GANs) as a representative of generative models.
In particular, we make in-depth analyses of various factors, including how to represent a data point in the representation space, how to calculate a fair distance using selected samples, and how many instances to use from each set.
arXiv Detail & Related papers (2023-04-04T17:54:32Z) - FakeCLR: Exploring Contrastive Learning for Solving Latent Discontinuity in Data-Efficient GANs [24.18718734850797]
Data-Efficient GANs (DE-GANs) aim to learn generative models with a limited amount of training data.
Contrastive learning has shown the great potential of increasing the synthesis quality of DE-GANs.
We propose FakeCLR, which only applies contrastive learning on fake samples.
arXiv Detail & Related papers (2022-07-18T14:23:38Z) - An Empirical Study on GANs with Margin Cosine Loss and Relativistic Discriminator [4.899818550820575]
We introduce a new loss function, namely Relativistic Margin Cosine Loss (RMCosGAN)
We compare RMCosGAN's performance with existing loss functions on two metrics: Fréchet inception distance and inception score.
The experimental results show that RMCosGAN outperforms the existing loss functions and significantly improves the quality of the generated images.
arXiv Detail & Related papers (2021-10-21T17:25:47Z) - ZARTS: On Zero-order Optimization for Neural Architecture Search [94.41017048659664]
Differentiable architecture search (DARTS) has been a popular one-shot paradigm for NAS due to its high efficiency.
This work turns to zero-order optimization and proposes a novel NAS scheme, called ZARTS, to search without enforcing the above approximation.
In particular, results on 12 benchmarks verify the outstanding robustness of ZARTS, where the performance of DARTS collapses due to its known instability issue.
arXiv Detail & Related papers (2021-10-10T09:35:15Z) - Disentangle Your Dense Object Detector [82.22771433419727]
Deep learning-based dense object detectors have achieved great success in the past few years and have been applied to numerous multimedia applications such as video understanding.
However, the current training pipeline for dense detectors relies on many conjunctions (coupled design choices) that may not hold.
We propose Disentangled Dense Object Detector (DDOD), in which simple and effective disentanglement mechanisms are designed and integrated into the current state-of-the-art detectors.
arXiv Detail & Related papers (2021-07-07T00:52:16Z) - P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection [24.46562699161406]
One-class novelty detection is to identify anomalous instances that do not conform to the expected normal instances.
Deep neural networks are too over-parameterized to deploy on resource-limited devices.
Progressive Knowledge Distillation with GANs (P-KDGAN) is proposed to learn compact and fast novelty detection networks.
arXiv Detail & Related papers (2020-07-14T10:44:57Z) - DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weight as random variables, modeled by Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based methods.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
arXiv Detail & Related papers (2020-06-18T08:23:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.