A Transformer-Based Approach for Diagnosing Fault Cases in Optical Fiber Amplifiers
- URL: http://arxiv.org/abs/2505.06245v2
- Date: Mon, 23 Jun 2025 06:06:01 GMT
- Title: A Transformer-Based Approach for Diagnosing Fault Cases in Optical Fiber Amplifiers
- Authors: Dominic Schneider, Lutz Rapp, Christoph Ament
- Abstract summary: A transformer-based deep learning approach is presented that enables the diagnosis of fault cases in optical fiber amplifiers using condition-based monitoring time-series data. The model, the Inverse Triple-Aspect Self-Attention Transformer (ITST), uses an encoder-decoder architecture with three feature extraction paths in the encoder, feature-engineered data as decoder input, and a self-attention mechanism.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: A transformer-based deep learning approach is presented that enables the diagnosis of fault cases in optical fiber amplifiers using condition-based monitoring time-series data. The model, the Inverse Triple-Aspect Self-Attention Transformer (ITST), uses an encoder-decoder architecture with three feature extraction paths in the encoder, feature-engineered data as input to the decoder, and a self-attention mechanism. The results show that ITST outperforms state-of-the-art models in terms of classification accuracy, enabling predictive maintenance for optical fiber amplifiers and reducing network downtimes and maintenance costs.
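As a rough illustration of the described layout, here is a minimal PyTorch sketch of an encoder-decoder with three encoder feature-extraction paths and a feature-engineered decoder input. The path designs, layer sizes, and class count are assumptions for illustration, not the authors' exact ITST architecture.

```python
import torch
import torch.nn as nn

class ITSTSketch(nn.Module):
    def __init__(self, n_sensors=16, d_model=64, n_classes=8):
        super().__init__()
        # Three parallel feature extraction paths over the raw series
        # (per-sensor, convolutional over time, and a global view are guesses).
        self.path_sensor = nn.Linear(n_sensors, d_model)
        self.path_time = nn.Conv1d(n_sensors, d_model, kernel_size=3, padding=1)
        self.path_global = nn.Linear(n_sensors, d_model)
        enc = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=2)
        dec = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec, num_layers=2)
        self.feat_proj = nn.Linear(8, d_model)  # feature-engineered decoder input
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x, eng_feats):
        # x: (batch, time, sensors); eng_feats: (batch, k, 8)
        h = (self.path_sensor(x)
             + self.path_time(x.transpose(1, 2)).transpose(1, 2)
             + self.path_global(x))
        memory = self.encoder(h)
        out = self.decoder(self.feat_proj(eng_feats), memory)
        return self.head(out.mean(dim=1))  # fault-case logits

logits = ITSTSketch()(torch.randn(2, 200, 16), torch.randn(2, 4, 8))
```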
Related papers
- Sparse Low-Ranked Self-Attention Transformer for Remaining Useful Lifetime Prediction of Optical Fiber Amplifiers [0.0]
We propose the Sparse Low-ranked self-Attention Transformer (SLAT) as a novel remaining useful lifetime (RUL) prediction method. SLAT is based on an encoder-decoder architecture in which two parallel encoders extract features for sensors and time steps. Sparsity in the attention matrix and a low-rank parametrization reduce overfitting and increase generalization.
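A loose PyTorch sketch of SLAT's two central ideas, top-k sparsified attention and low-rank weight factorization, wired as two parallel encoder paths over time steps and sensors. The ranks, k, and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankLinear(nn.Module):
    """Projection factored as V(U(x)) to cut parameters via a low rank."""
    def __init__(self, d_in, d_out, rank=8):
        super().__init__()
        self.u = nn.Linear(d_in, rank, bias=False)
        self.v = nn.Linear(rank, d_out)
    def forward(self, x):
        return self.v(self.u(x))

def sparse_attention(q, k, v, top_k=8):
    # Keep only the top_k largest scores per query; mask out the rest.
    scores = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5
    kth = scores.topk(min(top_k, scores.size(-1)), dim=-1).values[..., -1:]
    scores = scores.masked_fill(scores < kth, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Dual paths: one attends across time steps, the other across sensors.
x = torch.randn(4, 100, 16)                  # (batch, time, sensors)
q_t = k_t = v_t = LowRankLinear(16, 32)(x)   # time-step path (shared q/k/v here)
h_time = sparse_attention(q_t, k_t, v_t)
x_s = x.transpose(1, 2)                      # (batch, sensors, time)
q_s = k_s = v_s = LowRankLinear(100, 32)(x_s)
h_sensor = sparse_attention(q_s, k_s, v_s)
```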
arXiv Detail & Related papers (2024-09-22T09:48:45Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Data-driven design of fault diagnosis for three-phase PWM rectifier using random forests technique with transient synthetic features [2.382536770336505]
A three-phase pulse-width modulation (PWM) rectifier can usually maintain operation when open-circuit faults occur in its insulated-gate bipolar transistors (IGBTs).
A data-driven online fault diagnosis method is proposed to locate the open-circuit faults of IGBTs in a timely and effective manner.
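A hedged scikit-learn sketch of the general recipe: a random forest trained on transient features, where the predicted class names the faulty IGBT. The feature matrix and labels below are random stand-ins, not the paper's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))    # transient synthetic features per sample
y = rng.integers(0, 6, size=1000)  # which of 6 IGBTs is open-circuited

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```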
arXiv Detail & Related papers (2022-11-02T05:48:30Z)
- Fault diagnosis for three-phase PWM rectifier based on deep feedforward network with transient synthetic features [0.0]
A fault diagnosis method based on deep feedforward network with transient synthetic features is proposed.
The average fault diagnosis accuracy can reach 97.85% for transient synthetic fault data.
Online fault diagnosis experiments show that the method can accurately locate the faulty IGBTs.
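For comparison with the random-forest sketch above, a minimal scikit-learn stand-in for the deep feedforward classifier; the layer widths and synthetic data are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 12))    # transient synthetic features
y = rng.integers(0, 6, size=1000)  # faulty-IGBT label

mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=1)
mlp.fit(X, y)
print("training accuracy:", mlp.score(X, y))
```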
arXiv Detail & Related papers (2022-11-01T02:32:20Z)
- NAF: Neural Attenuation Fields for Sparse-View CBCT Reconstruction [79.13750275141139]
This paper proposes a novel and fast self-supervised solution for sparse-view CBCT reconstruction.
The desired attenuation coefficients are represented as a continuous function of 3D spatial coordinates, parameterized by a fully-connected deep neural network.
A learning-based encoder with hash coding is adopted to help the network capture high-frequency details.
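A simplified PyTorch sketch of the core idea: attenuation as a continuous MLP over hash-encoded 3D coordinates. A real multiresolution hash grid with interpolation is more involved; this single-level hash embedding is only a stand-in.

```python
import torch
import torch.nn as nn

class HashEncoding(nn.Module):
    def __init__(self, table_size=2**14, dim=16, resolution=64):
        super().__init__()
        self.table = nn.Embedding(table_size, dim)
        self.res, self.size = resolution, table_size
        self.primes = torch.tensor([1, 2654435761, 805459861])

    def forward(self, xyz):  # xyz in [0, 1]^3, shape (N, 3)
        idx = (xyz * self.res).long()
        h = (idx * self.primes).sum(-1) % self.size  # spatial hash of grid cell
        return self.table(h)

class NAFSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = HashEncoding()
        self.mlp = nn.Sequential(nn.Linear(16, 64), nn.ReLU(),
                                 nn.Linear(64, 1))  # attenuation coefficient

    def forward(self, xyz):
        return self.mlp(self.enc(xyz))

mu = NAFSketch()(torch.rand(1024, 3))  # query attenuation at 1024 points
```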
arXiv Detail & Related papers (2022-09-29T04:06:00Z)
- A Robust and Explainable Data-Driven Anomaly Detection Approach For Power Electronics [56.86150790999639]
We present two anomaly detection and classification approaches, namely the Matrix Profile algorithm and the Anomaly Transformer.
The Matrix Profile algorithm is shown to be well suited as a generalizable approach for detecting real-time anomalies in streaming time-series data.
A series of custom filters is created and added to the detector to tune its sensitivity, recall, and detection accuracy.
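A short illustration with the stumpy library (assumed available): the matrix profile flags the injected discord, and a simple z-score threshold stands in for the paper's custom sensitivity filters.

```python
import numpy as np
import stumpy

ts = np.sin(np.linspace(0, 60, 3000))
ts[1500:1520] += 2.0                          # inject an anomaly

m = 50                                        # subsequence window length
mp = stumpy.stump(ts, m)[:, 0].astype(float)  # matrix profile values

z = (mp - mp.mean()) / mp.std()               # tunable sensitivity filter
print("anomalous windows start at:", np.flatnonzero(z > 3.0))
```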
arXiv Detail & Related papers (2022-09-23T06:09:35Z)
- Denoising Diffusion Error Correction Codes [92.10654749898927]
Recently, neural decoders have demonstrated their advantage over classical decoding techniques.
However, state-of-the-art neural decoders suffer from high complexity and lack the iterative scheme characteristic of many legacy decoders.
We propose to employ denoising diffusion models for the soft decoding of linear codes at arbitrary block lengths.
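A very loose PyTorch sketch of the idea: a learned denoiser iteratively refines the received soft word, conditioned on the parity-check syndrome. The network, step size, and code below are placeholder assumptions, not the paper's construction.

```python
import torch
import torch.nn as nn

H = torch.tensor([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix of the
                  [1, 0, 1, 1, 0, 1, 0],   # (7,4) Hamming code
                  [0, 1, 1, 1, 0, 0, 1]], dtype=torch.float)

denoiser = nn.Sequential(nn.Linear(7 + 3, 32), nn.ReLU(), nn.Linear(32, 7))

y = torch.randn(1, 7)                      # received soft channel output
x = y.clone()
for t in range(10):                        # reverse diffusion-style steps
    hard = (x < 0).float()                 # tentative hard decisions
    syndrome = (H @ hard.T).T % 2          # conditioning signal
    x = x - 0.1 * denoiser(torch.cat([x, syndrome], dim=-1))
decoded = (x < 0).long()                   # final hard decoding
```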
arXiv Detail & Related papers (2022-09-16T11:00:50Z)
- Focused Decoding Enables 3D Anatomical Detection by Transformers [64.36530874341666]
We propose a novel Detection Transformer for 3D anatomical structure detection, dubbed Focused Decoder.
Focused Decoder leverages information from an anatomical region atlas to simultaneously deploy query anchors and restrict the cross-attention's field of view.
We evaluate our proposed approach on two publicly available CT datasets and demonstrate that Focused Decoder not only provides strong detection results, and thus alleviates the need for vast amounts of annotated data, but also exhibits exceptional and highly intuitive explainability via attention weights.
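A PyTorch sketch of the restricted cross-attention mechanism: each query anchor may only attend to feature tokens inside an atlas-derived window. The anchors and window bounds here are invented for illustration.

```python
import torch
import torch.nn as nn

n_queries, n_tokens, d = 4, 100, 32
attn = nn.MultiheadAttention(d, num_heads=4, batch_first=True)

queries = torch.randn(1, n_queries, d)  # one query per atlas anchor
memory = torch.randn(1, n_tokens, d)    # flattened CT feature map

# Boolean mask: True = blocked. Restrict query i to its own token window.
mask = torch.ones(n_queries, n_tokens, dtype=torch.bool)
for i, (lo, hi) in enumerate([(0, 30), (20, 55), (50, 80), (70, 100)]):
    mask[i, lo:hi] = False              # atlas-derived field of view

out, weights = attn(queries, memory, memory, attn_mask=mask)
print(weights.shape)  # (1, n_queries, n_tokens): attention for explainability
```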
arXiv Detail & Related papers (2022-07-21T22:17:21Z)
- A novel Time-frequency Transformer and its Application in Fault Diagnosis of Rolling Bearings [0.24214594180459362]
We propose a novel time-frequency Transformer (TFT) model, inspired by the success of the standard Transformer in sequence processing.
A new end-to-end fault diagnosis framework based on TFT is presented in this paper.
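A minimal PyTorch sketch of such a pipeline: an STFT turns the vibration signal into a time-frequency representation, which a standard Transformer encoder then classifies. The window sizes and classifier head are illustrative, not the exact TFT design.

```python
import torch
import torch.nn as nn

signal = torch.randn(8, 2048)                 # batch of bearing vibration signals
spec = torch.stft(signal, n_fft=128, hop_length=64,
                  window=torch.hann_window(128),
                  return_complex=True).abs()  # (batch, 65 freq bins, frames)
tokens = spec.transpose(1, 2)                 # one token per time frame

proj = nn.Linear(65, 64)
enc = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(64, nhead=4, batch_first=True), num_layers=2)
head = nn.Linear(64, 4)                       # e.g. 4 bearing fault types

logits = head(enc(proj(tokens)).mean(dim=1))
print(logits.shape)                           # (8, 4)
```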
arXiv Detail & Related papers (2021-04-19T06:53:31Z)
- On the Sub-Layer Functionalities of Transformer Decoder [74.83087937309266]
We study how Transformer-based decoders leverage information from the source and target languages.
Based on these insights, we demonstrate that the residual feed-forward module in each Transformer decoder layer can be dropped with minimal loss of performance.
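A PyTorch sketch of the reduced layer implied by this finding: a Transformer decoder layer with self- and cross-attention but without the residual feed-forward sub-layer.

```python
import torch
import torch.nn as nn

class FFNFreeDecoderLayer(nn.Module):
    def __init__(self, d_model=64, nhead=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        # Note: no feed-forward module here, per the paper's observation.

    def forward(self, tgt, memory):
        tgt = self.norm1(tgt + self.self_attn(tgt, tgt, tgt)[0])
        return self.norm2(tgt + self.cross_attn(tgt, memory, memory)[0])

layer = FFNFreeDecoderLayer()
out = layer(torch.randn(2, 10, 64), torch.randn(2, 20, 64))
```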
arXiv Detail & Related papers (2020-10-06T11:50:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.