Deep Transformer Networks for Time Series Classification: The NPP Safety Case
- URL: http://arxiv.org/abs/2104.05448v1
- Date: Fri, 9 Apr 2021 14:26:25 GMT
- Title: Deep Transformer Networks for Time Series Classification: The NPP Safety Case
- Authors: Bing Zha, Alessandro Vanni, Yassin Hassan, Tunc Aldemir, Alper Yilmaz
- Abstract summary: An advanced temporal neural network referred to as the Transformer is used in a supervised learning fashion to model the time-dependent NPP simulation data.
The Transformer can learn the characteristics of the sequential data and yield promising performance with approximately 99% classification accuracy on the testing dataset.
- Score: 59.20947681019466
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: A challenging part of dynamic probabilistic risk assessment for nuclear power
plants is the need for large numbers of temporal simulations covering various
initiating events and branching conditions, from which the extraction of
representative features for subsequent applications becomes complicated. Artificial
Intelligence techniques have been shown to be powerful tools in time-dependent
sequential data processing to automatically extract and yield complex features
from large data. An advanced temporal neural network referred to as the
Transformer is used in a supervised learning fashion to model the
time-dependent NPP simulation data and to infer whether a given sequence of
events leads to core damage or not. The training and testing datasets for the
Transformer are obtained by running 10,000 RELAP5-3D NPP blackout simulations
with the list of variables obtained from the RAVEN software. Each simulation is
classified as "OK" or "CORE DAMAGE" based on the consequence. The results show
that the Transformer can learn the characteristics of the sequential data and
yield promising performance with approximately 99% classification accuracy on
the testing dataset.
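For intuition, the pipeline above amounts to standard supervised sequence classification: each simulation is a multivariate time series of plant variables, and the model outputs one of two labels. The sketch below (in PyTorch, which the abstract does not specify) shows a Transformer encoder used this way; the layer sizes, mean pooling, omitted positional encoding, and input shapes are illustrative assumptions rather than the authors' configuration.

```python
# Minimal sketch of a Transformer encoder as a binary time-series classifier.
# Hyperparameters and preprocessing are assumptions, not the paper's setup;
# positional encoding is omitted for brevity.
import torch
import torch.nn as nn


class TimeSeriesTransformerClassifier(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, n_classes: int = 2):
        super().__init__()
        # Project the per-timestep simulation variables into the model dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Linear head over a pooled sequence representation ("OK" vs "CORE DAMAGE").
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features), one row per simulation time step.
        h = self.encoder(self.input_proj(x))
        pooled = h.mean(dim=1)    # average over the time dimension
        return self.head(pooled)  # logits for the two outcome classes


# Example with made-up shapes: 8 simulations, 200 time steps, 20 variables.
model = TimeSeriesTransformerClassifier(n_features=20)
logits = model(torch.randn(8, 200, 20))
print(logits.shape)  # torch.Size([8, 2])
```

Training such a model against the "OK"/"CORE DAMAGE" labels would use an ordinary cross-entropy loss; the roughly 99% test accuracy reported above is a property of the authors' setup, not of this sketch.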
Related papers
- An Investigation on Machine Learning Predictive Accuracy Improvement and Uncertainty Reduction using VAE-based Data Augmentation [2.517043342442487]
Deep generative learning uses certain ML models to learn the underlying distribution of existing data and generate synthetic samples that resemble the real data.
In this study, our objective is to evaluate the effectiveness of data augmentation using variational autoencoder (VAE)-based deep generative models.
We investigated whether the data augmentation leads to improved accuracy in the predictions of a deep neural network (DNN) model trained using the augmented data.
arXiv Detail & Related papers (2024-10-24T18:15:48Z) - Representation Learning of Multivariate Time Series using Attention and
Adversarial Training [2.0577627277681887]
A Transformer-based autoencoder is proposed that is regularized using an adversarial training scheme to generate artificial time series signals.
Our results indicate that the generated signals exhibit higher similarity to an exemplary dataset than those produced by a convolutional network approach.
arXiv Detail & Related papers (2024-01-03T21:32:46Z) - DuETT: Dual Event Time Transformer for Electronic Health Records [14.520791492631114]
We introduce the DuETT architecture, an extension of Transformers designed to attend over both time and event type dimensions.
DuETT uses an aggregated input in which sparse time series are transformed into a regular sequence of fixed length (a toy sketch of this kind of aggregation appears after this list).
Our model outperforms state-of-the-art deep learning models on multiple downstream tasks from the MIMIC-IV and PhysioNet-2012 EHR datasets.
arXiv Detail & Related papers (2023-04-25T17:47:48Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and
Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Transformer-based conditional generative adversarial network for
multivariate time series generation [0.0]
Conditional generation of time-dependent data is a task that has attracted much interest.
Recent work proposed a Transformer-based time-series generative adversarial network (TTS-GAN).
We extend the TTS-GAN by conditioning its generated output on a particular encoded context.
We show that this transformer-based CGAN can generate realistic high-dimensional and long data sequences under different kinds of conditions.
arXiv Detail & Related papers (2022-10-05T08:29:33Z) - An advanced spatio-temporal convolutional recurrent neural network for
storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - Convolutional generative adversarial imputation networks for
spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Deep Cellular Recurrent Network for Efficient Analysis of Time-Series
Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z) - Learning summary features of time series for likelihood free inference [93.08098361687722]
We present a data-driven strategy for automatically learning summary features from time series data.
Our results indicate that learning summary features from data can compete with and even outperform LFI methods based on hand-crafted values.
arXiv Detail & Related papers (2020-12-04T19:21:37Z) - Deep Neural Network based Wide-Area Event Classification in Power
Systems [2.2442786393371725]
A deep neural network (DNN) based classifier is developed, leveraging the availability of data from time-synchronized phasor measurement units (PMUs).
The effectiveness of the proposed event classification is validated on a real-world dataset from the U.S. transmission grids.
arXiv Detail & Related papers (2020-08-24T01:32:57Z)
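As a small aside on the DuETT entry above: aggregating sparse, irregularly timed observations onto a regular, fixed-length grid is a common preprocessing step before applying attention. The toy sketch below is not DuETT's actual pipeline; the function name, bin count, and mean-aggregation rule are invented for illustration only.

```python
# Toy sketch: mean-aggregate sparse (time, event_type, value) observations
# into a regular (n_bins, n_event_types) grid. Not DuETT's preprocessing.
import numpy as np


def aggregate_events(times, event_ids, values, n_event_types, n_bins, t_max):
    grid = np.zeros((n_bins, n_event_types))
    counts = np.zeros((n_bins, n_event_types))
    # Map each observation time onto one of n_bins equally sized bins.
    bins = np.minimum((np.asarray(times) / t_max * n_bins).astype(int), n_bins - 1)
    for b, e, v in zip(bins, event_ids, values):
        grid[b, e] += v
        counts[b, e] += 1
    # Average within each (bin, event type) cell; empty cells stay zero.
    return np.divide(grid, counts, out=np.zeros_like(grid), where=counts > 0)


# Example: 3 event types observed at irregular times over a 48-hour window,
# aggregated into 6 bins of 8 hours each.
regular = aggregate_events(times=[0.5, 3.2, 20.0, 47.9],
                           event_ids=[0, 2, 1, 0],
                           values=[1.0, 4.5, 0.7, 2.0],
                           n_event_types=3, n_bins=6, t_max=48.0)
print(regular.shape)  # (6, 3)
```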
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.