Homological Time Series Analysis of Sensor Signals from Power Plants
- URL: http://arxiv.org/abs/2106.02493v1
- Date: Thu, 3 Jun 2021 10:52:47 GMT
- Title: Homological Time Series Analysis of Sensor Signals from Power Plants
- Authors: Luciano Melodia, Richard Lenz
- Abstract summary: We use topological data analysis techniques to construct a suitable neural network classifier for the task of learning sensor signals of entire power plants.
We derive architectures with deep one-dimensional convolutional layers combined with stacked long short-term memories.
For validation, numerical experiments were performed with sensor data from four power plants of the same construction type.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we use topological data analysis techniques to construct a
suitable neural network classifier for the task of learning sensor signals of
entire power plants according to their reference designation system. We use
representations of persistence diagrams to derive necessary preprocessing steps
and visualize the large amounts of data. We derive architectures with deep
one-dimensional convolutional layers combined with stacked long short-term
memories as residual networks suitable for processing the persistence features.
We combine three separate sub-networks, which take as input the time series
itself and representations of the persistent homology in dimensions zero and
one. We give a mathematical derivation for most of the hyper-parameters used.
For validation, numerical experiments were performed with sensor data from four
power plants of the same construction type.
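The paper does not ship code, so the following Python sketch is only an illustration of the pipeline the abstract outlines: a sliding-window embedding of each sensor signal, persistence diagrams in dimensions zero and one, a fixed-length summary of each diagram, and a classifier built from three sub-networks that combine one-dimensional convolutions (with a residual connection) and stacked LSTMs. The window size, the lifetime-histogram summary standing in for the paper's persistence representations, all layer sizes, and the choice of ripser and Keras are assumptions made for readability, not the authors' settings.

# Hedged sketch of the pipeline described in the abstract. Window size, the
# histogram summary of the diagrams, layer sizes, and the use of ripser and
# Keras are illustrative assumptions, not the settings used in the paper.
import numpy as np
from ripser import ripser                      # persistent homology, dims 0 and 1
import tensorflow as tf
from tensorflow.keras import layers, Model

def persistence_features(signal, window=64, n_bins=32):
    """Embed a 1D sensor signal as a point cloud via a sliding window and
    summarise its H0 and H1 persistence diagrams as lifetime histograms."""
    cloud = np.lib.stride_tricks.sliding_window_view(signal, window)
    dgms = ripser(cloud, maxdim=1)["dgms"]     # [H0 diagram, H1 diagram]
    feats = []
    for dgm in dgms:
        finite = dgm[np.isfinite(dgm[:, 1])]   # drop the infinite H0 bar
        lifetimes = finite[:, 1] - finite[:, 0]
        # Fixed histogram range assumes the signal was normalised beforehand.
        hist, _ = np.histogram(lifetimes, bins=n_bins, range=(0.0, 1.0))
        feats.append(hist.astype("float32"))
    return feats                               # [H0 histogram, H1 histogram]

def conv_lstm_branch(x, filters=32, units=32):
    """One sub-network: 1D convolutions with a residual connection,
    followed by stacked LSTMs."""
    shortcut = layers.Conv1D(filters, 1, padding="same")(x)
    h = layers.Conv1D(filters, 5, padding="same", activation="relu")(x)
    h = layers.Conv1D(filters, 5, padding="same", activation="relu")(h)
    h = layers.Add()([h, shortcut])            # residual connection
    h = layers.LSTM(units, return_sequences=True)(h)
    h = layers.LSTM(units)(h)                  # stacked LSTMs
    return h

def build_classifier(series_len, n_bins, n_classes):
    """Three sub-networks: the raw time series plus the H0 and H1
    persistence summaries, fused into one classifier."""
    ts_in = layers.Input(shape=(series_len, 1), name="time_series")
    h0_in = layers.Input(shape=(n_bins, 1), name="persistence_h0")
    h1_in = layers.Input(shape=(n_bins, 1), name="persistence_h1")
    merged = layers.Concatenate()([
        conv_lstm_branch(ts_in),
        conv_lstm_branch(h0_in),
        conv_lstm_branch(h1_in),
    ])
    out = layers.Dense(n_classes, activation="softmax")(merged)
    return Model([ts_in, h0_in, h1_in], out)

In this sketch, persistence_features would be run per sensor channel before training and its two histograms fed to the persistence branches of build_classifier, with the class labels corresponding to the reference designation system mentioned in the abstract; how the authors actually represent the diagrams and size the layers is derived in the paper itself.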
Related papers
- TCCT-Net: Two-Stream Network Architecture for Fast and Efficient Engagement Estimation via Behavioral Feature Signals [58.865901821451295]
We present a novel two-stream feature fusion "Tensor-Convolution and Convolution-Transformer Network" (TCCT-Net) architecture.
To better learn the meaningful patterns in the temporal-spatial domain, we design a "CT" stream that integrates a hybrid convolutional-transformer.
In parallel, to efficiently extract rich patterns from the temporal-frequency domain, we introduce a "TC" stream that uses Continuous Wavelet Transform (CWT) to represent information in a 2D tensor form.
arXiv Detail & Related papers (2024-04-15T06:01:48Z)
- Importance attribution in neural networks by means of persistence landscapes of time series [0.5156484100374058]
We include a gating layer in the network's architecture that is able to identify the most relevant landscape levels for the classification task.
We reconstruct an approximate shape of the time series that gives insight into the classification decision.
arXiv Detail & Related papers (2023-02-06T21:43:39Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Encoded Prior Sliced Wasserstein AutoEncoder for learning latent manifold representations [0.7614628596146599]
We introduce an Encoded Prior Sliced Wasserstein AutoEncoder.
An additional prior-encoder network learns an embedding of the data manifold.
We show that, unlike conventional autoencoders, the prior encodes the geometry underlying the data.
arXiv Detail & Related papers (2020-10-02T14:58:54Z)
- Pre-Trained Models for Heterogeneous Information Networks [57.78194356302626]
We propose a self-supervised pre-training and fine-tuning framework, PF-HIN, to capture the features of a heterogeneous information network.
PF-HIN consistently and significantly outperforms state-of-the-art alternatives on each of these tasks, on four datasets.
arXiv Detail & Related papers (2020-07-07T03:36:28Z)
- Handling Variable-Dimensional Time Series with Graph Neural Networks [20.788813485815698]
Internet of Things (IoT) technology involves capturing data from multiple sensors resulting in multi-sensor time series.
Existing neural network-based approaches to such multi-sensor time series modeling assume a fixed input dimension or number of sensors.
We consider training neural network models from such multi-sensor time series, where the time series have varying input dimensionality owing to availability or installation of a different subset of sensors at each source of time series.
arXiv Detail & Related papers (2020-07-01T12:11:16Z)
- Sensor selection on graphs via data-driven node sub-sampling in network time series [0.0]
This paper is concerned with the problem of selecting an optimal sampling set of sensors over a network of time series.
We propose and compare various data-driven strategies to turn off a fixed number of sensors or equivalently to select a sampling set of nodes.
To illustrate the performance of our approach, we report numerical experiments on the analysis of real data from bike sharing networks in different cities.
arXiv Detail & Related papers (2020-04-24T15:51:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.