A Predictive Model Based on Transformer with Statistical Feature Embedding in Manufacturing Sensor Dataset
- URL: http://arxiv.org/abs/2407.06682v1
- Date: Tue, 9 Jul 2024 08:59:27 GMT
- Title: A Predictive Model Based on Transformer with Statistical Feature Embedding in Manufacturing Sensor Dataset
- Authors: Gyeong Taek Lee, Oh-Ran Kwon
- Abstract summary: This study proposes a novel predictive model based on the Transformer, utilizing statistical feature embedding and window positional encoding.
The model's performance is evaluated in two problems: fault detection and virtual metrology, showing superior results compared to baseline models.
The results support the model's applicability across various manufacturing industries, demonstrating its potential for enhancing process management and yield.
- Score: 2.07180164747172
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: In the manufacturing process, sensor data collected from equipment is crucial for building predictive models to manage processes and improve productivity. However, in the field, it is challenging to gather sufficient data to build robust models. This study proposes a novel predictive model based on the Transformer, utilizing statistical feature embedding and window positional encoding. Statistical features provide an effective representation of sensor data, and the embedding enables the Transformer to learn both time- and sensor-related information. Window positional encoding captures precise time details from the feature embedding. The model's performance is evaluated in two problems: fault detection and virtual metrology, showing superior results compared to baseline models. This improvement is attributed to the efficient use of parameters, which is particularly beneficial for sensor data that often has limited sample sizes. The results support the model's applicability across various manufacturing industries, demonstrating its potential for enhancing process management and yield.
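A minimal sketch of how the abstract's two ideas could fit together: simple statistics summarize each time window of each sensor into one token, and a window positional encoding ties every token back to its window so the Transformer can learn both time- and sensor-related information. This is an illustrative reconstruction under stated assumptions, not the paper's implementation; the choice of statistics (mean, std, min, max), the window indexing, and all function names are hypothetical.

```python
import numpy as np

def statistical_feature_embedding(x, window_size):
    # x: (time_steps, num_sensors) raw sensor trace.
    # Split the trace into non-overlapping windows and summarize each
    # (window, sensor) pair with simple statistics, yielding one token
    # per pair so attention can mix time- and sensor-related information.
    n_windows = x.shape[0] // window_size
    tokens, positions = [], []
    for w in range(n_windows):
        seg = x[w * window_size:(w + 1) * window_size]
        for s in range(x.shape[1]):
            col = seg[:, s]
            tokens.append([col.mean(), col.std(), col.min(), col.max()])
            positions.append(w)  # all tokens from window w share index w
    return np.array(tokens), np.array(positions)

def window_positional_encoding(positions, d_model):
    # Sinusoidal encoding indexed by window number, so tokens taken from
    # the same time window receive an identical positional code.
    positions = np.asarray(positions, dtype=float)
    pe = np.zeros((len(positions), d_model))
    div = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe[:, 0::2] = np.sin(positions[:, None] * div)
    pe[:, 1::2] = np.cos(positions[:, None] * div)
    return pe
```

In this sketch the token sequence plus its window encoding would be fed to an ordinary Transformer encoder; because each token is only a handful of statistics, the embedding layer stays small, which matches the abstract's point about parameter efficiency on small sensor datasets.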
Related papers
- Learning on Transformers is Provable Low-Rank and Sparse: A One-layer Analysis [63.66763657191476]
We show that efficient numerical training and inference algorithms based on low-rank computation achieve impressive performance for learning Transformer-based adaptation.
We analyze how magnitude-based pruning affects generalization while improving adaptation.
We conclude that proper magnitude-based pruning has only a slight effect on testing performance.
arXiv Detail & Related papers (2024-06-24T23:00:58Z)
- Sparse Attention-driven Quality Prediction for Production Process Optimization in Digital Twins [53.70191138561039]
We propose to deploy a digital twin of the production line by encoding its operational logic in a data-driven approach.
We adopt a quality prediction model for production process based on self-attention-enabled temporal convolutional neural networks.
Our operation experiments on a specific tobacco shredding line demonstrate that the proposed digital twin-based production process optimization method fosters seamless integration between virtual and real production lines.
arXiv Detail & Related papers (2024-05-20T09:28:23Z)
- SubjectDrive: Scaling Generative Data in Autonomous Driving via Subject Control [59.20038082523832]
We present SubjectDrive, the first model proven to scale generative data production in a way that could continuously improve autonomous driving applications.
We develop a novel model equipped with a subject control mechanism, which allows the generative model to leverage diverse external data sources for producing varied and useful data.
arXiv Detail & Related papers (2024-03-28T14:07:13Z)
- DetDiffusion: Synergizing Generative and Perceptive Models for Enhanced Data Generation and Perception [78.26734070960886]
Current perceptive models heavily depend on resource-intensive datasets.
We introduce perception-aware loss (P.A. loss) through segmentation, improving both quality and controllability.
Our method customizes data augmentation by extracting and utilizing perception-aware attribute (P.A. Attr) during generation.
arXiv Detail & Related papers (2024-03-20T04:58:03Z)
- A Cost-Sensitive Transformer Model for Prognostics Under Highly Imbalanced Industrial Data [1.6492989697868894]
This paper introduces a novel cost-sensitive transformer model developed as part of a systematic workflow.
We observed a substantial enhancement in performance compared to state-of-the-art methods.
Our findings highlight the potential of our method in addressing the unique challenges of failure prediction in industrial settings.
arXiv Detail & Related papers (2024-01-16T15:09:53Z)
- Predictive Maintenance Model Based on Anomaly Detection in Induction Motors: A Machine Learning Approach Using Real-Time IoT Data [0.0]
In this work, we demonstrate a novel anomaly detection system on induction motors used in pumps, compressors, fans, and other industrial machines.
We use a combination of pre-processing techniques and machine learning (ML) models with a low computational cost.
arXiv Detail & Related papers (2023-10-15T18:43:45Z)
- A Transformer-based Framework For Multi-variate Time Series: A Remaining Useful Life Prediction Use Case [4.0466311968093365]
This work proposed an encoder-transformer architecture-based framework for time series prediction.
We validated the effectiveness of the proposed framework on all four sets of the C-MAPPS benchmark dataset.
To enable the model awareness of the initial stages of the machine life and its degradation path, a novel expanding window method was proposed.
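The expanding window idea mentioned above can be illustrated as feeding the model progressively longer prefixes of each run-to-failure trace, so training examples span the machine's life from its initial stage through its full degradation path. This is a hedged sketch of the general technique; the function name and the minimum-length parameter are assumptions, not details from the paper.

```python
def expanding_windows(trace, min_len=1):
    # Yield progressively longer prefixes of a run-to-failure sensor trace:
    # trace[:min_len], trace[:min_len + 1], ..., trace[:len(trace)].
    # Early prefixes expose the model to the machine's initial life stage;
    # later ones cover its complete degradation path.
    for end in range(min_len, len(trace) + 1):
        yield trace[:end]
```

For example, a four-cycle trace with `min_len=2` yields three training samples of lengths 2, 3, and 4.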
arXiv Detail & Related papers (2023-08-19T02:30:35Z)
- Soft Sensing Regression Model: from Sensor to Wafer Metrology Forecasting [2.8992789044888436]
This work focuses on the task of soft sensing regression, which uses sensor data to predict impending inspection measurements.
We proposed an LSTM-based regressor and designed two loss functions for model training.
The experimental results demonstrated that the proposed model can achieve accurate and early prediction of various types of inspections in complicated manufacturing processes.
arXiv Detail & Related papers (2023-01-21T16:54:05Z)
- A Generative Approach for Production-Aware Industrial Network Traffic Modeling [70.46446906513677]
We investigate the network traffic data generated from a laser cutting machine deployed in a Trumpf factory in Germany.
We analyze the traffic statistics, capture the dependencies between the internal states of the machine, and model the network traffic as a production state dependent process.
We compare the performance of various generative models, including the variational autoencoder (VAE), conditional variational autoencoder (CVAE), and generative adversarial network (GAN).
arXiv Detail & Related papers (2022-11-11T09:46:58Z)
- Soft Sensing Transformer: Hundreds of Sensors are Worth a Single Word [4.829772176792801]
We demonstrate the challenges and effectiveness of modeling industrial big data by a Soft Sensing Transformer model.
We observe a similarity between sentence structure and sensor readings, and process multi-dimensional sensor readings in a time series much as sentences are processed in natural language.
The results show that the Transformer model outperforms benchmark models in the soft sensing field that are based on auto-encoder and long short-term memory (LSTM) architectures.
arXiv Detail & Related papers (2021-11-10T22:31:32Z)
- Visformer: The Vision-friendly Transformer [105.52122194322592]
We propose a new architecture named Visformer, which is abbreviated from "Vision-friendly Transformer".
With the same computational complexity, Visformer outperforms both the Transformer-based and convolution-based models in terms of ImageNet classification accuracy.
arXiv Detail & Related papers (2021-04-26T13:13:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.