An LSTM-Based Predictive Monitoring Method for Data with Time-varying
Variability
- URL: http://arxiv.org/abs/2309.01978v1
- Date: Tue, 5 Sep 2023 06:13:09 GMT
- Authors: Jiaqi Qiu, Yu Lin, Inez Zwetsloot
- Abstract summary: This paper explores the ability of the recurrent neural network structure to monitor processes.
It proposes a control chart based on long short-term memory (LSTM) prediction intervals for data with time-varying variability.
The proposed method is also applied to time series sensor data, confirming that it is an effective technique for detecting abnormalities.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The recurrent neural network and its variants have shown great
success in processing sequences in recent years. However, these deep neural
networks have not received much attention in anomaly detection through
predictive process monitoring. Furthermore, traditional statistical models
rely on assumptions and hypothesis tests, while neural network (NN) models
require far fewer assumptions. This flexibility enables NN models to work
efficiently on data with time-varying variability, a common inherent aspect
of data in practice.
This paper explores the ability of the recurrent neural network structure to
monitor processes and proposes a control chart based on long short-term memory
(LSTM) prediction intervals for data with time-varying variability. The
simulation studies provide empirical evidence that the proposed model
outperforms other NN-based predictive monitoring methods for mean shift
detection. The proposed method is also applied to time series sensor data,
confirming that it is an effective technique for detecting abnormalities.
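The charting idea can be sketched independently of the network details: a one-step-ahead predictor yields a prediction interval whose width tracks the local (time-varying) variability, and observations falling outside the interval raise an alarm. Below is a minimal numpy sketch in which a rolling mean/std stands in for the paper's LSTM predictor; the function and parameter names are illustrative, not from the paper:

```python
import numpy as np

def prediction_interval_chart(x, window=50, z=3.0):
    """Flag observations falling outside a rolling one-step-ahead
    prediction interval whose width adapts to the local variability.

    The rolling mean/std is a stand-in for the paper's LSTM predictor;
    only the control-charting logic is sketched here.
    """
    x = np.asarray(x, dtype=float)
    alarms = []
    for t in range(window, len(x)):
        hist = x[t - window:t]
        center = hist.mean()            # point prediction for x[t]
        half = z * hist.std(ddof=1)     # half-width tracks local variability
        if abs(x[t] - center) > half:
            alarms.append(t)
    return alarms

# In-control Gaussian noise with a sustained mean shift at t = 300
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 600)
x[300:] += 4.0
alarms = prediction_interval_chart(x)
print(alarms[:3])  # alarms should appear shortly after t = 300
```

Because the interval half-width is re-estimated from the recent window, the chart adapts when the underlying variability drifts, which is the property the LSTM prediction intervals provide in the paper.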
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are a deep learning model that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Brain-Inspired Spiking Neural Network for Online Unsupervised Time Series Prediction [13.521272923545409]
We present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN).
CLURSNN makes online predictions by reconstructing the underlying dynamical system using Random Delay Embedding.
We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
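The reconstruction step can be illustrated with a plain Takens-style delay embedding; the function name and parameters below are illustrative, and the paper's method additionally randomizes the delays:

```python
import numpy as np

def delay_embed(x, dim=3, delay=5):
    """Reconstruct a state space from a scalar series via delay
    embedding: row t is [x[t], x[t+delay], ..., x[t+(dim-1)*delay]].
    (Illustrative sketch; CLURSNN uses randomized delays.)"""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])

x = np.sin(np.linspace(0, 20, 500))
emb = delay_embed(x, dim=3, delay=5)
print(emb.shape)  # (490, 3): 490 reconstructed state vectors
```

Each row of the embedding is a candidate state vector of the underlying dynamical system, which the predictor can then extrapolate.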
arXiv Detail & Related papers (2023-04-10T16:18:37Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- DeepBayes -- an estimator for parameter estimation in stochastic nonlinear dynamical models [11.917949887615567]
We propose DeepBayes estimators that leverage the power of deep recurrent neural networks in learning an estimator.
The deep recurrent neural network architectures can be trained offline and ensure significant time savings during inference.
We demonstrate the applicability of our proposed method on different example models and perform detailed comparisons with state-of-the-art approaches.
arXiv Detail & Related papers (2022-05-04T18:12:17Z)
- MAD: Self-Supervised Masked Anomaly Detection Task for Multivariate Time Series [14.236092062538653]
Masked Anomaly Detection (MAD) is a general self-supervised learning task for multivariate time series anomaly detection.
MAD randomly masks a portion of the inputs and trains a model to estimate them, improving on the traditional left-to-right next-step prediction (NSP) task.
Our experimental results demonstrate that MAD can achieve better anomaly detection rates over traditional NSP approaches.
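The masking step described above can be sketched as follows; `random_mask` and all parameter names are illustrative, not code from the MAD paper:

```python
import numpy as np

def random_mask(batch, mask_ratio=0.15, mask_value=0.0, rng=None):
    """Randomly mask a fraction of the entries in a multivariate
    time-series batch, as in masked-prediction training.

    Returns the masked batch and a boolean mask marking which entries
    the model is trained to reconstruct. (Illustrative helper.)"""
    rng = rng or np.random.default_rng()
    mask = rng.random(batch.shape) < mask_ratio
    masked = np.where(mask, mask_value, batch)
    return masked, mask

# A batch of 4 series, 20 time steps, 3 channels
rng = np.random.default_rng(1)
batch = rng.normal(size=(4, 20, 3))
masked, mask = random_mask(batch, mask_ratio=0.15, rng=rng)
# Training loss would be computed only on the masked entries, e.g.:
# loss = ((model(masked) - batch)[mask] ** 2).mean()
```

At inference time, entries with large reconstruction error relative to the rest of the series can be scored as anomalous.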
arXiv Detail & Related papers (2022-05-04T14:55:42Z)
- Hybridization of Capsule and LSTM Networks for unsupervised anomaly detection on multivariate data [0.0]
This paper introduces a novel NN architecture which hybridises the Long-Short-Term-Memory (LSTM) and Capsule Networks into a single network.
The proposed method uses an unsupervised learning technique to overcome the issues with finding large volumes of labelled training data.
arXiv Detail & Related papers (2022-02-11T10:33:53Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAIN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
- Dynamic Time Warping as a New Evaluation for Dst Forecast with Machine Learning [0.0]
We train a neural network to make a forecast of the disturbance storm time index at origin time $t$ with a forecasting horizon of 1 up to 6 hours.
Inspection of the model's results with the correlation coefficient and RMSE indicated a performance comparable to the latest publications.
A new method is proposed to measure whether two time series are shifted in time with respect to each other.
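The shift-aware evaluation idea can be illustrated with the classic dynamic-programming form of DTW, which matches shapes even when one series lags the other. This is a minimal textbook sketch, not the paper's evaluation pipeline:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(n*m) dynamic-programming DTW distance between
    two 1-D series (textbook version, absolute-difference cost)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t)
y = np.sin(t - 0.5)  # time-shifted copy of x
print(dtw_distance(x, y) < np.abs(x - y).sum())  # True: DTW absorbs the shift
```

A pointwise error metric such as RMSE penalizes the shifted forecast heavily, while the DTW alignment cost stays small, which is what makes it useful for diagnosing forecasts that are merely delayed.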
arXiv Detail & Related papers (2020-06-08T15:14:13Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.