T4PdM: a Deep Neural Network based on the Transformer Architecture for
Fault Diagnosis of Rotating Machinery
- URL: http://arxiv.org/abs/2204.03725v1
- Date: Thu, 7 Apr 2022 20:31:45 GMT
- Title: T4PdM: a Deep Neural Network based on the Transformer Architecture for
Fault Diagnosis of Rotating Machinery
- Authors: Erick Giovani Sperandio Nascimento, Julian Santana Liang, Ilan Sousa
Figueiredo, Lilian Lefol Nani Guarieiro
- Abstract summary: This paper develops an automatic fault classifier model for predictive maintenance based on a modified version of the Transformer architecture, namely T4PdM.
T4PdM achieved overall accuracies of 99.98% and 98% on the MaFaulDa and CWRU datasets, respectively.
The results demonstrate the superiority of the model in detecting and classifying faults in rotating industrial machinery.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Deep learning and big data algorithms have become widely used in industrial
applications to optimize several tasks in many complex systems. In particular,
deep learning models for diagnosing and prognosticating machinery health have
made predictive maintenance (PdM) more accurate and reliable in decision
making, thereby avoiding unnecessary interventions, machinery accidents, and
environmental catastrophes. Recently, Transformer neural networks have gained
prominence and have increasingly become the preferred choice for Natural
Language Processing (NLP) tasks. Given their recent major achievements in NLP,
this paper proposes an automatic fault classifier model for predictive
maintenance, based on a modified version of the Transformer architecture and
named T4PdM, to identify multiple types of faults in rotating machinery.
Experimental results are presented for the MaFaulDa and CWRU databases, on
which T4PdM achieved overall accuracies of 99.98% and 98%, respectively. In
addition, the performance of the proposed model is compared with previously
published works, demonstrating its superiority in detecting and classifying
faults in rotating industrial machinery. The proposed Transformer-based model
can therefore improve machinery fault analysis and diagnostic processes and
help move companies toward the era of Industry 4.0. Moreover, the methodology
can be adapted to other time series classification tasks.
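The abstract does not give implementation details, but the core idea, a Transformer encoder applied to windowed vibration signals followed by a classification head, can be sketched as below. This is a minimal illustrative sketch in PyTorch, not the authors' T4PdM code: the patching scheme, layer sizes, window length, channel count, and class count are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class VibrationTransformerClassifier(nn.Module):
    """Minimal Transformer-encoder classifier for windowed vibration
    signals, in the spirit of T4PdM. All hyperparameters here are
    illustrative assumptions, not values taken from the paper."""

    def __init__(self, n_channels=8, patch_len=32, d_model=128,
                 n_heads=4, n_layers=4, n_classes=10, max_patches=256):
        super().__init__()
        # Project each patch of raw samples into the model dimension.
        self.patch_embed = nn.Linear(n_channels * patch_len, d_model)
        # Learned positional embeddings preserve the temporal order.
        self.pos_embed = nn.Parameter(torch.zeros(1, max_patches, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)
        self.patch_len = patch_len

    def forward(self, x):
        # x: (batch, time, channels) raw vibration window.
        b, t, c = x.shape
        n_patches = t // self.patch_len
        # Split the signal into non-overlapping patches and flatten each.
        x = x[:, :n_patches * self.patch_len, :]
        x = x.reshape(b, n_patches, self.patch_len * c)
        x = self.patch_embed(x) + self.pos_embed[:, :n_patches, :]
        x = self.encoder(x)
        # Mean-pool over patches, then classify the fault type.
        return self.head(x.mean(dim=1))

# Example: classify 4096-sample windows from 8 sensor channels
# (the window length and channel count are assumptions for this sketch).
model = VibrationTransformerClassifier()
logits = model(torch.randn(4, 4096, 8))   # shape: (4, n_classes)
```

Mean-pooling over patch embeddings is one common way to obtain a fixed-size representation for classification; a learned classification token would be an equally plausible choice.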
Related papers
- RmGPT: Rotating Machinery Generative Pretrained Model [20.52039868199533]
We propose RmGPT, a unified model for diagnosis and prognosis tasks.
RmGPT introduces a novel token-based framework, incorporating Signal Tokens, Prompt Tokens, Time-Frequency Task Tokens and Fault Tokens.
In experiments, RmGPT significantly outperforms state-of-the-art algorithms, achieving near-perfect accuracy in diagnosis tasks and exceptionally low errors in prognosis tasks.
arXiv Detail & Related papers (2024-09-26T07:40:47Z)
- Benchmarking Neural Decoding Backbones towards Enhanced On-edge iBCI Applications [28.482461973598593]
This study seeks to identify an optimal neural decoding backbone that boasts robust performance and swift inference capabilities suitable for edge deployment.
We evaluated four prospective models: the Gated Recurrent Unit (GRU), Transformer, Receptance Weighted Key Value (RWKV), and Selective State Space Model (Mamba).
The findings indicate that although the GRU model delivers sufficient accuracy, the RWKV and Mamba models are preferable due to their superior inference and calibration speeds.
arXiv Detail & Related papers (2024-06-08T02:45:36Z)
- Predictive Maintenance Model Based on Anomaly Detection in Induction Motors: A Machine Learning Approach Using Real-Time IoT Data [0.0]
In this work, we demonstrate a novel anomaly detection system on induction motors used in pumps, compressors, fans, and other industrial machines.
We use a combination of pre-processing techniques and machine learning (ML) models with a low computational cost.
arXiv Detail & Related papers (2023-10-15T18:43:45Z)
- A Transformer-based Framework For Multi-variate Time Series: A Remaining Useful Life Prediction Use Case [4.0466311968093365]
This work proposed a Transformer-encoder-based framework for time series prediction.
To make the model aware of the initial stages of the machine's life and its degradation path, a novel expanding window method was proposed.
The effectiveness of the proposed framework was validated on all four sets of the C-MAPSS benchmark dataset.
arXiv Detail & Related papers (2023-08-19T02:30:35Z)
- Differential Evolution Algorithm based Hyper-Parameters Selection of Transformer Neural Network Model for Load Forecasting [0.0]
Transformer models have the potential to improve load forecasting because of their ability to learn long-range dependencies through their attention mechanism.
Our work compares the proposed Transformer-based neural network model integrated with different metaheuristic algorithms by their performance in load forecasting, based on numerical metrics such as Mean Squared Error (MSE) and Mean Absolute Percentage Error (MAPE).
arXiv Detail & Related papers (2023-07-28T04:29:53Z)
- A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks [60.38369406877899]
The Transformer is a deep neural network that employs a self-attention mechanism to capture contextual relationships within sequential data.
Transformer models excel at handling long-range dependencies between input sequence elements and enable parallel processing (a minimal sketch of the attention mechanism is given after this list).
Our survey encompasses the identification of the top five application domains for transformer-based models.
arXiv Detail & Related papers (2023-06-11T23:13:51Z)
- End-to-End Meta-Bayesian Optimisation with Transformer Neural Processes [52.818579746354665]
This paper proposes the first end-to-end differentiable meta-BO framework that generalises neural processes to learn acquisition functions via transformer architectures.
We enable this end-to-end framework with reinforcement learning (RL) to tackle the lack of labelled acquisition data.
arXiv Detail & Related papers (2023-05-25T10:58:46Z)
- Transformer-based approaches to Sentiment Detection [55.41644538483948]
We examined the performance of four different types of state-of-the-art transformer models for text classification.
The RoBERTa transformer model performs best on the test dataset with a score of 82.6% and is highly recommended for quality predictions.
arXiv Detail & Related papers (2023-03-13T17:12:03Z)
- Efficient pre-training objectives for Transformers [84.64393460397471]
We study several efficient pre-training objectives for Transformer-based models.
We show that eliminating the MASK token and computing the loss over the whole output are essential choices for improving performance.
arXiv Detail & Related papers (2021-04-20T00:09:37Z)
- Anomaly Detection Based on Selection and Weighting in Latent Space [73.01328671569759]
We propose a novel selection-and-weighting-based anomaly detection framework called SWAD.
Experiments on both benchmark and real-world datasets have shown the effectiveness and superiority of SWAD.
arXiv Detail & Related papers (2021-03-08T10:56:38Z)
- TELESTO: A Graph Neural Network Model for Anomaly Classification in Cloud Services [77.454688257702]
Machine learning (ML) and artificial intelligence (AI) are applied to IT system operation and maintenance.
One direction aims at the recognition of re-occurring anomaly types to enable remediation automation.
We propose a method that is invariant to dimensionality changes of the given data.
arXiv Detail & Related papers (2021-02-25T14:24:49Z)
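As a companion to the survey entry above, the following is a minimal sketch of the standard scaled dot-product self-attention mechanism that Transformer models are built on; it is textbook material, not code from any of the listed papers.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """Standard scaled dot-product attention: each query attends to all
    keys, so long-range dependencies are captured in a single step and
    all positions are processed in parallel.
    q, k, v: (batch, seq_len, d_k) tensors."""
    d_k = q.size(-1)
    # Pairwise similarity between queries and keys, scaled so the
    # softmax stays in a well-conditioned range.
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)   # attention distribution
    return weights @ v                    # weighted sum of values

# Example: 2 sequences of 16 tokens with 64-dimensional features.
x = torch.randn(2, 16, 64)
out = scaled_dot_product_attention(x, x, x)   # self-attention: q = k = v
```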