Improved AutoEncoder with LSTM module and KL divergence
- URL: http://arxiv.org/abs/2404.19247v2
- Date: Sun, 17 Nov 2024 01:41:12 GMT
- Title: Improved AutoEncoder with LSTM module and KL divergence
- Authors: Wei Huang, Bingyang Zhang, Kaituo Zhang, Hua Gao, Rongchun Wan
- Abstract summary: We propose the Improved AutoEncoder with LSTM module and Kullback-Leibler divergence (IAE-LSTM-KL) model in this paper.
The efficacy of the IAE-LSTM-KL model is validated through experiments on both synthetic and real-world datasets.
- Score: 3.1168862003127797
- License:
- Abstract: The task of anomaly detection is to separate anomalous data from normal data in a dataset. Models such as the deep convolutional autoencoder (CAE) network and the deep support vector data description (SVDD) model have been widely employed and have demonstrated significant success in detecting anomalies. However, the over-reconstruction ability of the CAE network for anomalous data can easily lead to a high false negative rate. On the other hand, the deep SVDD model suffers from feature collapse, which reduces detection accuracy for anomalies. To address these problems, we propose the Improved AutoEncoder with LSTM module and Kullback-Leibler divergence (IAE-LSTM-KL) model in this paper. An LSTM network is added after the encoder to memorize feature representations of normal data. Meanwhile, feature collapse is mitigated by penalizing the features fed into the SVDD module via KL divergence. The efficacy of the IAE-LSTM-KL model is validated through experiments on both synthetic and real-world datasets. Experimental results show that the IAE-LSTM-KL model yields higher detection accuracy for anomalies. In addition, the IAE-LSTM-KL model demonstrates enhanced robustness to contaminated outliers in the dataset. All code may be found at https://github.com/crazyn2/IAE-LSTM-KL_codes
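To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of an IAE-LSTM-KL-style model: a convolutional encoder, an LSTM module inserted after the encoder, a decoder, an SVDD-style distance to a hypersphere center, and a KL-divergence penalty on the features fed into the SVDD module. This is a sketch under assumptions, not the authors' implementation; the input size, layer widths, exact KL formulation, and loss weights are guesses, and the official code is at the repository linked above.

```python
# Minimal sketch (not the authors' code) of an IAE-LSTM-KL-style model:
# CAE encoder -> LSTM memory module -> decoder, plus an SVDD-style distance
# term and a KL-divergence penalty on the features fed to the SVDD module.
# Input size (3x32x32), layer widths, the KL formulation over batch feature
# statistics, and the loss weights are all assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IAELSTMKLSketch(nn.Module):
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(                       # convolutional encoder
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim),
        )
        # LSTM module after the encoder, intended to memorize normal-data features.
        self.lstm = nn.LSTM(latent_dim, latent_dim, batch_first=True)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
        # SVDD hypersphere center in feature space (fixed after initialization in Deep SVDD).
        self.register_buffer("center", torch.zeros(latent_dim))

    def forward(self, x):
        z = self.encoder(x)                                  # features fed to the SVDD module
        h, _ = self.lstm(z.unsqueeze(1))                     # sequence length 1 per sample
        x_hat = self.decoder(h.squeeze(1))
        return x_hat, z

    def loss(self, x, lambda_svdd: float = 1.0, lambda_kl: float = 0.1):
        x_hat, z = self(x)
        recon = F.mse_loss(x_hat, x)                         # reconstruction term
        svdd = ((z - self.center) ** 2).sum(dim=1).mean()    # distance to SVDD center
        # KL divergence between a diagonal Gaussian fitted to the batch features
        # and a standard normal; one way to discourage feature collapse.
        mu, var = z.mean(dim=0), z.var(dim=0, unbiased=False) + 1e-8
        kl = 0.5 * (var + mu.pow(2) - 1.0 - torch.log(var)).sum()
        return recon + lambda_svdd * svdd + lambda_kl * kl
```

At test time, an anomaly score would typically combine the reconstruction error with the distance to the SVDD center, and samples whose score exceeds a chosen threshold are flagged as anomalous.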
Related papers
- Anomaly Detection of Tabular Data Using LLMs [54.470648484612866]
We show that pre-trained large language models (LLMs) are zero-shot batch-level anomaly detectors.
We propose an end-to-end fine-tuning strategy to bring out the potential of LLMs in detecting real anomalies.
arXiv Detail & Related papers (2024-06-24T04:17:03Z)
- Self-supervised Feature Adaptation for 3D Industrial Anomaly Detection [59.41026558455904]
We focus on multi-modal anomaly detection. Specifically, we investigate early multi-modal approaches that attempted to utilize models pre-trained on large-scale visual datasets.
We propose a Local-to-global Self-supervised Feature Adaptation (LSFA) method to finetune the adaptors and learn task-oriented representation toward anomaly detection.
arXiv Detail & Related papers (2024-01-06T07:30:41Z)
- A Bi-LSTM Autoencoder Framework for Anomaly Detection -- A Case Study of a Wind Power Dataset [2.094022863940315]
Anomalies refer to data points or events that deviate from normal and homogeneous events.
This study presents a novel framework for time series anomaly detection using a combination of Bi-LSTM architecture and Autoencoder.
The Bi-LSTM Autoencoder model achieved a classification accuracy of 96.79% and outperformed more commonly used LSTM Autoencoder models.
arXiv Detail & Related papers (2023-03-17T00:24:28Z)
- Adaptive Memory Networks with Self-supervised Learning for Unsupervised Anomaly Detection [54.76993389109327]
Unsupervised anomaly detection aims to build models to detect unseen anomalies by only training on the normal data.
We propose a novel approach called Adaptive Memory Network with Self-supervised Learning (AMSL) to address these challenges.
AMSL incorporates a self-supervised learning module to learn general normal patterns and an adaptive memory fusion module to learn rich feature representations.
arXiv Detail & Related papers (2022-01-03T03:40:21Z)
- DAE : Discriminatory Auto-Encoder for multivariate time-series anomaly detection in air transportation [68.8204255655161]
We propose a novel anomaly detection model called Discriminatory Auto-Encoder (DAE).
It uses the baseline of a regular LSTM-based auto-encoder but with several decoders, each getting data of a specific flight phase.
Results show that the DAE achieves better results in both accuracy and speed of detection.
arXiv Detail & Related papers (2021-09-08T14:07:55Z)
- Anomaly Detection Based on Multiple-Hypothesis Autoencoder [0.0]
A model trained with normal data generates a larger restoration error for abnormal data.
The restoration area for the input data of AE is limited in the latent space.
We propose the Multiple-Hypothesis Autoencoder (MH-AE) model, composed of several decoders (see the generic multi-decoder sketch after this list).
arXiv Detail & Related papers (2021-07-07T05:09:03Z)
- Detecting Faults during Automatic Screwdriving: A Dataset and Use Case of Anomaly Detection for Automatic Screwdriving [80.6725125503521]
Data-driven approaches using Machine Learning (ML) for detecting faults have recently gained increasing interest.
We present a use case of using ML models for detecting faults during automated screwdriving operations.
arXiv Detail & Related papers (2021-07-05T11:46:00Z)
- Unsupervised Online Anomaly Detection On Irregularly Sampled Or Missing Valued Time-Series Data Using LSTM Networks [0.0]
We study anomaly detection and introduce an algorithm that processes variable length, irregularly sampled sequences or sequences with missing values.
Our algorithm is fully unsupervised; however, it can be readily extended to supervised or semi-supervised cases.
arXiv Detail & Related papers (2020-05-25T09:41:04Z)
- Contextual-Bandit Anomaly Detection for IoT Data in Distributed Hierarchical Edge Computing [65.78881372074983]
IoT devices can hardly afford complex deep neural network (DNN) models, and offloading anomaly detection tasks to the cloud incurs a long delay.
We propose and build a demo for an adaptive anomaly detection approach for distributed hierarchical edge computing (HEC) systems.
We show that our proposed approach significantly reduces detection delay without sacrificing accuracy, as compared to offloading detection tasks to the cloud.
arXiv Detail & Related papers (2020-04-15T06:13:33Z)
- Anomaly Detection with SDAE [2.9447568514391067]
A Simple Autoencoder, a Deep Autoencoder, and a Supervised Deep Autoencoder were trained and compared for anomaly detection on the ASHRAE building energy dataset.
The Deep Autoencoder performs the best; however, the Supervised Deep Autoencoder outperforms the other models in total anomalies detected.
arXiv Detail & Related papers (2020-04-09T07:22:08Z)
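Several of the related papers above, in particular the DAE and MH-AE entries, rely on the same underlying mechanism: an autoencoder trained only on normal data reconstructs anomalies poorly, and routing the latent code through several decoders yields multiple reconstruction hypotheses. The sketch below (referenced from the MH-AE entry) is a generic PyTorch illustration of that multi-decoder, reconstruction-error scoring idea; the layer sizes, the number of decoders, and the min-over-decoders score are assumptions, not any single paper's implementation.

```python
# Generic sketch of multi-decoder reconstruction-based anomaly scoring,
# illustrating the mechanism shared by the DAE and MH-AE entries above.
# Shapes, the number of decoders, and the min-over-hypotheses score are assumptions.
import torch
import torch.nn as nn


class MultiDecoderAE(nn.Module):
    def __init__(self, in_dim: int = 64, latent_dim: int = 16, num_decoders: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, latent_dim))
        # One decoder per "hypothesis" (or per flight phase in the DAE setting).
        self.decoders = nn.ModuleList(
            nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, in_dim))
            for _ in range(num_decoders)
        )

    def reconstruction_errors(self, x):
        z = self.encoder(x)
        # Per-sample mean squared error for each decoder: shape (num_decoders, batch).
        return torch.stack([((d(z) - x) ** 2).mean(dim=1) for d in self.decoders])

    def anomaly_score(self, x):
        # Score each sample by its best (smallest) reconstruction error: normal data
        # should be reconstructed well by at least one decoder, anomalies by none.
        return self.reconstruction_errors(x).min(dim=0).values


model = MultiDecoderAE()
x = torch.randn(8, 64)                                # a batch of feature vectors
scores = model.anomaly_score(x)                       # higher score -> more anomalous
flags = scores > scores.mean() + 3 * scores.std()     # toy threshold, for illustration only
```

In the DAE setting described above, the decoder would instead be selected by flight phase rather than by taking the minimum, but the reconstruction-error scoring principle is the same.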
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.