AD-NEv++ : The multi-architecture neuroevolution-based multivariate anomaly detection framework
- URL: http://arxiv.org/abs/2404.07968v1
- Date: Mon, 25 Mar 2024 08:40:58 GMT
- Title: AD-NEv++ : The multi-architecture neuroevolution-based multivariate anomaly detection framework
- Authors: Marcin Pietroń, Dominik Żurek, Kamil Faber, Roberto Corizzo
- Abstract summary: Anomaly detection tools and methods enable key analytical capabilities in modern cyberphysical and sensor-based systems.
We propose AD-NEv++, a three-stage neuroevolution-based method that synergically combines subspace evolution, model evolution, and fine-tuning.
We show that AD-NEv++ can improve upon and outperform the state-of-the-art GNN (Graph Neural Network) model architecture in all anomaly detection benchmarks.
- Score: 0.794682109939797
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Anomaly detection tools and methods enable key analytical capabilities in modern cyberphysical and sensor-based systems. Despite the fast-paced development in deep learning architectures for anomaly detection, model optimization for a given dataset is a cumbersome and time-consuming process. Neuroevolution could be an effective and efficient solution to this problem, as a fully automated search method for learning optimal neural networks, supporting both gradient and non-gradient fine-tuning. However, existing frameworks incorporating neuroevolution lack support for new layers and architectures and are typically limited to convolutional and LSTM layers. In this paper we propose AD-NEv++, a three-stage neuroevolution-based method that synergically combines subspace evolution, model evolution, and fine-tuning. Our method overcomes the limitations of existing approaches by optimizing the mutation operator in the neuroevolution process, while supporting a wide spectrum of neural layers, including attention, dense, and graph convolutional layers. Our extensive experimental evaluation was conducted with widely adopted multivariate anomaly detection benchmark datasets, and showed that the models generated by AD-NEv++ outperform well-known deep learning architectures and neuroevolution-based approaches for anomaly detection. Moreover, results show that AD-NEv++ can improve upon and outperform the state-of-the-art GNN (Graph Neural Network) model architecture in all anomaly detection benchmarks.
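For intuition, the following is a minimal, hypothetical sketch of the model-evolution stage only: genomes encode layer sequences drawn from a vocabulary that includes dense, attention, and graph-convolutional layers, and a mutation operator perturbs layer types, widths, and depth. The layer vocabulary, mutation probabilities, and placeholder fitness function are illustrative assumptions rather than the authors' implementation; in AD-NEv++, fitness would come from training and scoring each candidate on its assigned sensor subspace, with gradient and non-gradient fine-tuning applied to the best models afterwards.

```python
# Minimal, illustrative neuroevolution loop (not the authors' implementation).
# A genome is a list of layer specs; mutation changes layer types and widths
# and can add or drop layers, mirroring a mutation operator that spans dense,
# attention, and graph-convolutional layers. The fitness function is a stub.
import random

LAYER_TYPES = ["dense", "attention", "graph_conv"]   # assumed layer vocabulary
WIDTHS = [16, 32, 64, 128]

def random_layer():
    return {"type": random.choice(LAYER_TYPES), "width": random.choice(WIDTHS)}

def random_genome(max_layers=4):
    return [random_layer() for _ in range(random.randint(1, max_layers))]

def mutate(genome, p_type=0.3, p_width=0.3, p_add=0.2, p_del=0.2):
    genome = [dict(layer) for layer in genome]
    for layer in genome:
        if random.random() < p_type:
            layer["type"] = random.choice(LAYER_TYPES)
        if random.random() < p_width:
            layer["width"] = random.choice(WIDTHS)
    if random.random() < p_add:
        genome.insert(random.randrange(len(genome) + 1), random_layer())
    if len(genome) > 1 and random.random() < p_del:
        genome.pop(random.randrange(len(genome)))
    return genome

def fitness(genome):
    # Stub: in practice, decode the genome into a model, train it on a sensor
    # subspace, and return a validation anomaly-detection score (e.g., F1).
    return random.random()

def evolve(pop_size=8, generations=5, elite=2):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:elite]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - elite)]
    return max(population, key=fitness)

if __name__ == "__main__":
    print("best genome:", evolve())
```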
Related papers
- Back to Bayesics: Uncovering Human Mobility Distributions and Anomalies with an Integrated Statistical and Neural Framework [14.899157568336731]
DeepBayesic is a novel framework that integrates Bayesian principles with deep neural networks to model the underlying distributions.
We evaluate our approach on several mobility datasets, demonstrating significant improvements over state-of-the-art anomaly detection methods.
arXiv Detail & Related papers (2024-10-01T19:02:06Z) - Coevolution of Neural Architectures and Features for Stock Market Forecasting: A Multi-objective Decision Perspective [0.0]
This paper proposes a new approach to identify a set of nondominated neural network models for further selection by the decision maker.
A new coevolution approach is proposed to simultaneously select the features and topology of neural networks.
The results on the NASDAQ index in pre- and peri-COVID time windows convincingly demonstrate that the proposed coevolution approach can evolve a set of nondominated neural forecasting models with better generalization capabilities.
arXiv Detail & Related papers (2023-11-23T15:12:30Z) - AD-NEv: A Scalable Multi-level Neuroevolution Framework for Multivariate Anomaly Detection [1.0323063834827415]
Anomaly detection tools and methods present a key capability in modern cyberphysical and failure prediction systems.
Model optimization for a given dataset is a cumbersome and time-consuming process.
We propose Anomaly Detection Neuroevolution (AD-NEv) - a scalable multi-level optimized neuroevolution framework.
arXiv Detail & Related papers (2023-05-25T21:52:38Z) - Learning Large-scale Neural Fields via Context Pruned Meta-Learning [60.93679437452872]
We introduce an efficient optimization-based meta-learning technique for large-scale neural field training.
We show how gradient re-scaling at meta-test time allows the learning of extremely high-quality neural fields.
Our framework is model-agnostic, intuitive, straightforward to implement, and shows significant reconstruction improvements for a wide range of signals.
arXiv Detail & Related papers (2023-02-01T17:32:16Z) - Fast and scalable neuroevolution deep learning architecture search for multivariate anomaly detection [0.0]
The work concentrates on improvements to a multi-level neuroevolution approach for anomaly detection.
The presented framework can be used as an efficient network architecture learning method for other unsupervised tasks.
arXiv Detail & Related papers (2021-12-10T16:14:43Z) - Adaptive Anomaly Detection for Internet of Things in Hierarchical Edge Computing: A Contextual-Bandit Approach [81.5261621619557]
We propose an adaptive anomaly detection scheme with hierarchical edge computing (HEC).
We first construct multiple anomaly detection DNN models with increasing complexity, and associate each of them with a corresponding HEC layer.
Then, we design an adaptive model selection scheme that is formulated as a contextual-bandit problem and solved by using a reinforcement learning policy network (a rough, bandit-style sketch of this selection step appears after this list).
arXiv Detail & Related papers (2021-08-09T08:45:47Z) - Ensemble neuroevolution based approach for multivariate time series anomaly detection [0.0]
In this work, a framework is shown which incorporates neuroevolution methods to boost the anomaly-detection scores of new and already known models.
The proposed framework shows that it is possible to boost most anomaly detection deep learning models in a reasonable time and in a fully automated mode.
arXiv Detail & Related papers (2021-08-08T07:55:07Z) - Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z) - Non-Gradient Manifold Neural Network [79.44066256794187]
A deep neural network (DNN) generally takes thousands of iterations to optimize via gradient descent.
We propose a novel manifold neural network based on non-gradient optimization.
arXiv Detail & Related papers (2021-06-15T06:39:13Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem (a toy numerical sketch of such a flow on O(d) appears after this list).
arXiv Detail & Related papers (2020-06-19T22:05:19Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
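As noted in the hierarchical edge computing entry above, the sketch below is a rough, assumption-laden illustration of bandit-style selection among detectors of increasing complexity. An epsilon-greedy policy with per-arm linear reward models stands in for the paper's reinforcement-learning policy network, and the context features and reward (detection quality minus a cost that grows with model complexity) are invented purely for illustration.

```python
# Illustrative contextual bandit choosing among detectors of increasing
# complexity (one per HEC layer). Epsilon-greedy with per-arm linear reward
# models; the context and reward definitions below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_arms = 3          # e.g., small / medium / large detector
context_dim = 4     # assumed summary features of the incoming data window
weights = np.zeros((n_arms, context_dim))   # per-arm linear reward model

def reward(arm, context):
    # Stand-in: detection quality improves with model size on "hard" windows,
    # while larger models pay a fixed complexity cost.
    difficulty = context[0]
    return float(arm >= difficulty * (n_arms - 1)) - 0.1 * arm + rng.normal(0, 0.05)

def select(context, eps=0.1):
    if rng.random() < eps:
        return int(rng.integers(n_arms))
    return int(np.argmax(weights @ context))

for _ in range(2000):
    ctx = rng.random(context_dim)
    arm = select(ctx)
    r = reward(arm, ctx)
    pred = weights[arm] @ ctx
    weights[arm] += 0.05 * (r - pred) * ctx   # online SGD on squared error

print("learned per-arm weights:\n", weights)
```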
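Similarly, as noted in the ODEtoODE entry above, the toy sketch below integrates a nested flow in which the main dynamics x' = W(t)x are driven by a matrix W(t) evolving on the orthogonal group O(d). The dimensions, the fixed skew-symmetric generator, and the Cayley integrator are assumptions chosen only to keep W exactly orthogonal; this is not the paper's construction.

```python
# Toy "flow on a flow": x' = W(t) x, with W(t) moving on O(d) via W' = W S for
# a skew-symmetric S. A Cayley step keeps W exactly orthogonal at every step.
import numpy as np

rng = np.random.default_rng(0)
d, steps, dt = 4, 200, 0.01

A = rng.normal(size=(d, d))
S = A - A.T                       # skew-symmetric generator of the O(d) flow
W = np.eye(d)                     # start at the identity element of O(d)
x = rng.normal(size=d)            # state of the main flow

I = np.eye(d)
cayley = np.linalg.solve(I - 0.5 * dt * S, I + 0.5 * dt * S)  # orthogonal step

for _ in range(steps):
    x = x + dt * (W @ x)          # Euler step of the main flow x' = W(t) x
    W = W @ cayley                # advance W along the orthogonal-group flow

print("orthogonality drift ||W^T W - I|| =", np.linalg.norm(W.T @ W - I))
print("final state:", x)
```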