Anomaly Detection with Density Estimation
- URL: http://arxiv.org/abs/2001.04990v2
- Date: Sun, 10 May 2020 04:48:08 GMT
- Title: Anomaly Detection with Density Estimation
- Authors: Benjamin Nachman and David Shih
- Abstract summary: We propose a new unsupervised anomaly detection technique (ANODE).
By estimating the probability density of the data in a signal region and in sidebands, a likelihood ratio of data vs. background can be constructed.
ANODE is robust against systematic differences between signal region and sidebands, giving it broader applicability than other methods.
- Score: 2.0813318162800707
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We leverage recent breakthroughs in neural density estimation to propose a
new unsupervised anomaly detection technique (ANODE). By estimating the
probability density of the data in a signal region and in sidebands, and
interpolating the latter into the signal region, a likelihood ratio of data vs.
background can be constructed. This likelihood ratio is broadly sensitive to
overdensities in the data that could be due to localized anomalies. In
addition, a unique potential benefit of the ANODE method is that the background
can be directly estimated using the learned densities. Finally, ANODE is robust
against systematic differences between signal region and sidebands, giving it
broader applicability than other methods. We demonstrate the power of this new
approach using the LHC Olympics 2020 R&D Dataset. We show how ANODE can
enhance the significance of a dijet bump hunt by up to a factor of 7 with a
10% accuracy on the background prediction. While the LHC is used as the
recurring example, the methods developed here have a much broader applicability
to anomaly detection in physics and beyond.
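As a rough illustration of the density-ratio idea in the abstract, the sketch below fits a stand-in kernel density estimator to sideband-only events as the background model and another to the events in the signal region, then ranks signal-region events by the log likelihood ratio log R(x) = log p_data(x) - log p_background(x). This is not the authors' implementation: ANODE uses conditional neural density estimation and interpolates the sideband density in the resonance mass, whereas the toy Gaussian data, the KDE choice, and the omission of the mass-conditioning step below are all simplifying assumptions.
```python
# Minimal density-ratio anomaly score in the spirit of ANODE (toy sketch).
# KernelDensity stands in for the paper's neural density estimators, and the
# interpolation of p(x|m) from the sidebands into the signal region is omitted.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)

# Toy two-dimensional feature vectors x (e.g., jet substructure observables).
sideband = rng.normal(0.0, 1.0, size=(5000, 2))            # background-only sidebands
signal_region = np.vstack([
    rng.normal(0.0, 1.0, size=(4800, 2)),                  # signal-region background
    rng.normal(2.5, 0.3, size=(200, 2)),                   # small localized overdensity
])

# p_background(x): density learned from the sidebands.
p_bg = KernelDensity(bandwidth=0.3).fit(sideband)
# p_data(x): density of all events in the signal region.
p_data = KernelDensity(bandwidth=0.3).fit(signal_region)

# Anomaly score: log R(x) = log p_data(x) - log p_background(x).
# Localized overdensities in the data give R(x) >> 1.
log_R = p_data.score_samples(signal_region) - p_bg.score_samples(signal_region)
cut = np.quantile(log_R, 0.99)                             # keep the most anomalous 1%
print(f"{int(np.sum(log_R > cut))} events pass the cut at log R > {cut:.2f}")
```
Cutting on R(x) enhances a localized signal relative to the smooth background, which is what drives the bump-hunt significance improvement quoted in the abstract; as the abstract also notes, the background surviving the cut can be estimated directly from the learned background density.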
Related papers
- Generative Edge Detection with Stable Diffusion [52.870631376660924]
Edge detection is typically viewed as a pixel-level classification problem mainly addressed by discriminative methods.
We propose a novel approach, named Generative Edge Detector (GED), by fully utilizing the potential of the pre-trained stable diffusion model.
We conduct extensive experiments on multiple datasets and achieve competitive performance.
arXiv Detail & Related papers (2024-10-04T01:52:23Z)
- Towards the Uncharted: Density-Descending Feature Perturbation for Semi-supervised Semantic Segmentation [51.66997548477913]
We propose a novel feature-level consistency learning framework named Density-Descending Feature Perturbation (DDFP).
Inspired by the low-density separation assumption in semi-supervised learning, our key insight is that feature density can shed light on the most promising direction for the segmentation classifier to explore.
The proposed DDFP outperforms other feature-level perturbation designs and achieves state-of-the-art performance on both the Pascal VOC and Cityscapes datasets.
arXiv Detail & Related papers (2024-03-11T06:59:05Z)
- Residual ANODE [0.0]
We present R-ANODE, a new method for data-driven, model-agnostic resonant anomaly detection.
The key to R-ANODE is to enhance the inductive bias of the anomaly detection task by fitting a normalizing flow directly to the small and unknown signal component.
We show that the method works equally well whether the unknown signal fraction is learned or fixed, and is even robust to signal fraction misspecification.
arXiv Detail & Related papers (2023-12-18T19:00:03Z)
- Are we certain it's anomalous? [57.729669157989235]
Anomaly detection in time series is a complex task: anomalies are rare, and temporal correlations are highly non-linear.
Here we propose the novel use of Hyperbolic uncertainty for Anomaly Detection (HypAD).
HypAD learns to reconstruct the input signal in a self-supervised manner.
arXiv Detail & Related papers (2022-11-16T21:31:39Z)
- Window-Based Distribution Shift Detection for Deep Neural Networks [21.73028341299301]
We study the case of monitoring the healthy operation of a deep neural network (DNN) receiving a stream of data.
Using selective prediction principles, we propose a distribution deviation detection method for DNNs.
Our novel detection method performs on par with or better than the state of the art, while requiring substantially less computation time.
arXiv Detail & Related papers (2022-10-19T21:27:25Z)
- Meta Learning Low Rank Covariance Factors for Energy-Based Deterministic Uncertainty [58.144520501201995]
Bi-Lipschitz regularization of neural network layers preserves relative distances between data instances in the feature spaces of each layer.
With the use of an attentive set encoder, we propose to meta learn either diagonal or diagonal plus low-rank factors to efficiently construct task specific covariance matrices.
We also propose an inference procedure which utilizes scaled energy to achieve a final predictive distribution.
arXiv Detail & Related papers (2021-10-12T22:04:19Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- DAAIN: Detection of Anomalous and Adversarial Input using Normalizing Flows [52.31831255787147]
We introduce a novel technique, DAAIN, to detect out-of-distribution (OOD) inputs and adversarial attacks (AA).
Our approach monitors the inner workings of a neural network and learns a density estimator of the activation distribution.
Our model can be trained on a single GPU, making it compute-efficient and deployable without requiring specialized accelerators.
arXiv Detail & Related papers (2021-05-30T22:07:13Z)
- Exploring the Intrinsic Probability Distribution for Hyperspectral Anomaly Detection [9.653976364051564]
We propose a novel probability distribution representation detector (PDRD) that explores the intrinsic distribution of both the background and the anomalies in original data for hyperspectral anomaly detection.
We conduct the experiments on four real data sets to evaluate the performance of our proposed method.
arXiv Detail & Related papers (2021-05-14T11:42:09Z)
- Deep Data Density Estimation through Donsker-Varadhan Representation [5.276937617129594]
We present a simple yet effective method for estimating the data density using a deep neural network and the Donsker-Varadhan variational lower bound on the KL divergence.
We show that the optimal critic function associated with the Donsker-Varadhan representation of the divergence between the data and the uniform distribution can estimate the data density (the bound itself is written out after this list).
arXiv Detail & Related papers (2021-04-14T03:38:32Z)
- Regularized Cycle Consistent Generative Adversarial Network for Anomaly Detection [5.457279006229213]
We propose a new Regularized Cycle Consistent Generative Adversarial Network (RCGAN) in which deep neural networks are adversarially trained to better recognize anomalous samples.
Experimental results on both real-world and synthetic data show that our model leads to significant and consistent improvements on previous anomaly detection benchmarks.
arXiv Detail & Related papers (2020-01-18T03:35:05Z)
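For reference, the Donsker-Varadhan bound mentioned in the "Deep Data Density Estimation through Donsker-Varadhan Representation" entry above is the standard variational representation of the KL divergence. The notation below (critic T, reference distribution Q) is chosen here for illustration and is not taken from that paper.
```latex
% Donsker-Varadhan representation of the KL divergence (standard result)
\begin{align*}
  \mathrm{KL}(P \,\|\, Q)
    &= \sup_{T}\Big\{ \mathbb{E}_{x\sim P}\big[T(x)\big]
       - \log \mathbb{E}_{x\sim Q}\big[e^{T(x)}\big] \Big\},\\
  % the supremum is attained at the optimal critic
  T^{*}(x) &= \log \frac{dP}{dQ}(x) + \text{const}.
\end{align*}
% With Q the uniform distribution on a bounded support of volume V,
% dP/dQ(x) = V p(x), so a critic trained against a uniform reference
% recovers the data density up to normalization: p(x) \propto \exp(T^{*}(x)).
```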