Tail of Distribution GAN (TailGAN): Generative-Adversarial-Network-Based Boundary Formation
- URL: http://arxiv.org/abs/2107.11658v1
- Date: Sat, 24 Jul 2021 17:29:21 GMT
- Title: Tail of Distribution GAN (TailGAN): Generative-Adversarial-Network-Based Boundary Formation
- Authors: Nikolaos Dionelis
- Abstract summary: We create a GAN-based tail formation model for anomaly detection, the Tail of distribution GAN (TailGAN).
Using TailGAN, we leverage GANs for anomaly detection and use maximum entropy regularization.
We evaluate TailGAN on identifying Out-of-Distribution (OoD) data; its performance on MNIST, CIFAR-10, Baggage X-Ray, and OoD data is competitive with methods from the literature.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative Adversarial Networks (GANs) are a powerful methodology that can be
used for unsupervised anomaly detection, where current techniques have
limitations such as accurately detecting anomalies near the tail of a
distribution. GANs generally do not guarantee the existence of a probability
density and are susceptible to mode collapse, and only a few GANs use
likelihood to reduce mode collapse. In this paper, we create a GAN-based tail
formation model for anomaly detection, the Tail of distribution GAN (TailGAN),
to generate samples on the tail of the data distribution and detect anomalies
near the support boundary. With TailGAN, we leverage GANs for anomaly
detection and use maximum entropy regularization. Using GANs that learn the
probability of the underlying distribution improves the anomaly detection
methodology by allowing us to devise a generator for boundary samples and to
use this model to characterize anomalies. TailGAN addresses supports with
disjoint components and achieves competitive performance on images. We
evaluate TailGAN on identifying Out-of-Distribution (OoD) data; its
performance on MNIST, CIFAR-10, Baggage X-Ray, and OoD data is competitive
with methods from the literature.
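To make the boundary/support-based scoring idea concrete, here is a minimal, hypothetical sketch (PyTorch-style) of one simple way to score queries against a generator trained on nominal data: the distance to the nearest generated sample acts as a crude anomaly score. The names `G`, `latent_dim`, and the nearest-sample rule are illustrative assumptions, not TailGAN's actual scoring procedure.

```python
# Hypothetical sketch, not the paper's algorithm: score queries by their
# distance to the nearest sample produced by a generator G trained on
# nominal data (a crude proxy for proximity to the support boundary).
import torch

@torch.no_grad()
def anomaly_score(x, G, n_samples=2048, latent_dim=128, device="cpu"):
    """Return one score per query; larger means farther from the learned support."""
    z = torch.randn(n_samples, latent_dim, device=device)
    ref = G(z).flatten(start_dim=1)   # (n_samples, d) generated reference set
    q = x.flatten(start_dim=1)        # (batch, d) flattened queries
    d = torch.cdist(q, ref)           # (batch, n_samples) Euclidean distances
    return d.min(dim=1).values        # distance to the nearest generated sample

# Usage: fit a threshold on held-out nominal data, then flag
# is_anomalous = anomaly_score(test_batch, G) > threshold
```

In practice, GAN-based detectors typically combine such distances with discriminator or feature-space statistics, and the threshold is fit on held-out nominal data.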
Related papers
- GLAD: Towards Better Reconstruction with Global and Local Adaptive Diffusion Models for Unsupervised Anomaly Detection [60.78684630040313]
Diffusion models tend to reconstruct normal counterparts of test images after certain noise is added.
From the global perspective, the difficulty of reconstructing images with different anomalies is uneven.
We propose a global and local adaptive diffusion model (abbreviated to GLAD) for unsupervised anomaly detection.
arXiv Detail & Related papers (2024-06-11T17:27:23Z)
- Spot The Odd One Out: Regularized Complete Cycle Consistent Anomaly Detector GAN [4.5123329001179275]
This study presents an adversarial method for anomaly detection in real-world applications, leveraging the power of generative adversarial neural networks (GANs).
Previous methods suffer from high variance in class-wise accuracy, which prevents their application to all types of anomalies.
The proposed method named RCALAD tries to solve this problem by introducing a novel discriminator to the structure, which results in a more efficient training process.
arXiv Detail & Related papers (2023-04-16T13:05:39Z)
- Window-Based Distribution Shift Detection for Deep Neural Networks [21.73028341299301]
We study the case of monitoring the healthy operation of a deep neural network (DNN) receiving a stream of data.
Using selective prediction principles, we propose a distribution deviation detection method for DNNs.
Our novel detection method performs on par with or better than the state-of-the-art, while consuming substantially less time.
arXiv Detail & Related papers (2022-10-19T21:27:25Z)
- Robust Estimation for Nonparametric Families via Generative Adversarial Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends these to robust mean estimation, second moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z)
- OMASGAN: Out-of-Distribution Minimum Anomaly Score GAN for Sample Generation on the Boundary [0.0]
Generative models assign high likelihood and low reconstruction loss to Out-of-Distribution (OoD) samples.
OMASGAN generates, in a negative data augmentation manner, anomalous samples on the estimated distribution boundary.
OMASGAN performs retraining by including the abnormal minimum-anomaly-score OoD samples generated on the distribution boundary.
arXiv Detail & Related papers (2021-10-28T16:35:30Z)
- Boundary of Distribution Support Generator (BDSG): Sample Generation on the Boundary [0.0]
We use the recently developed Invertible Residual Network (IResNet) and Residual Flow (ResFlow) for density estimation.
These models have not yet been used for anomaly detection.
arXiv Detail & Related papers (2021-07-21T09:00:32Z)
- GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learn a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity; a minimal sketch of this entropy-regularization idea appears after this list.
arXiv Detail & Related papers (2020-09-24T19:34:37Z)
- Improving Generative Adversarial Networks with Local Coordinate Coding [150.24880482480455]
Generative adversarial networks (GANs) have shown remarkable success in generating realistic data from some predefined prior distribution.
In practice, semantic information might be represented by some latent distribution learned from data.
We propose an LCCGAN model with local coordinate coding (LCC) to improve the performance of generating data.
arXiv Detail & Related papers (2020-07-28T09:17:50Z)
- GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)
- Distribution Approximation and Statistical Estimation Guarantees of Generative Adversarial Networks [82.61546580149427]
Generative Adversarial Networks (GANs) have achieved great success in unsupervised learning.
This paper provides approximation and statistical guarantees of GANs for the estimation of data distributions with densities in a Hölder space.
arXiv Detail & Related papers (2020-02-10T16:47:57Z)
- Regularized Cycle Consistent Generative Adversarial Network for Anomaly Detection [5.457279006229213]
We propose a new Regularized Cycle Consistent Generative Adversarial Network (RCGAN) in which deep neural networks are adversarially trained to better recognize anomalous samples.
Experimental results on both real-world and synthetic data show that our model leads to significant and consistent improvements on previous anomaly detection benchmarks.
arXiv Detail & Related papers (2020-01-18T03:35:05Z)
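Both TailGAN's maximum entropy regularization and the variational entropy regularizer summarized above share the goal of increasing the entropy (diversity) of generated samples to mitigate mode collapse. The sketch below is a hypothetical illustration of that general idea only: it uses a mean pairwise-distance bonus as a crude stand-in for an entropy estimate, whereas the papers use proper maximum-entropy and variational-bound formulations; all names are illustrative.

```python
# Hypothetical sketch: non-saturating generator loss plus a diversity bonus
# used as a crude entropy proxy (not the estimators used in the papers above).
import torch
import torch.nn.functional as F

def generator_loss(d_fake_logits, fake_samples, lam=0.1):
    """Adversarial loss minus a diversity term that pushes generated samples apart."""
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))
    flat = fake_samples.flatten(start_dim=1)          # (batch, d)
    n = flat.shape[0]
    pairwise = torch.cdist(flat, flat)                 # (batch, batch), zero diagonal
    diversity = pairwise.sum() / max(n * (n - 1), 1)   # mean off-diagonal distance
    return adv - lam * diversity                       # higher diversity lowers the loss
```

A larger `lam` trades adversarial fidelity for diversity; proper entropy estimators (such as variational lower bounds) avoid the scale sensitivity of raw pairwise distances.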