Online learning of windmill time series using Long Short-term Cognitive
Networks
- URL: http://arxiv.org/abs/2107.00425v1
- Date: Thu, 1 Jul 2021 13:13:24 GMT
- Title: Online learning of windmill time series using Long Short-term Cognitive
Networks
- Authors: Alejandro Morales-Hernández, Gonzalo Nápoles, Agnieszka
Jastrzebska, Yamisleydi Salgueiro, Koen Vanhoof
- Abstract summary: The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach reported the lowest forecasting errors with respect to a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit, and a Hidden Markov Model.
- Score: 58.675240242609064
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Forecasting windmill time series is often the basis of other processes such
as anomaly detection, health monitoring, or maintenance scheduling. The amount
of data generated on windmill farms makes online learning the most viable
strategy to follow. Such settings require retraining the model each time a new
batch of data is available. However, updating the model with new information
is often very expensive to perform using traditional Recurrent Neural Networks
(RNNs). In this paper, we use Long Short-term Cognitive Networks (LSTCNs) to
forecast windmill time series in online settings. These recently introduced
neural systems consist of chained Short-term Cognitive Network blocks, each
processing a temporal data chunk. The learning algorithm of these blocks is
based on a very fast, deterministic learning rule that makes LSTCNs suitable
for online learning tasks. The numerical simulations using a case study with
four windmills showed that our approach reported the lowest forecasting errors
with respect to a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit,
and a Hidden Markov Model. What is perhaps more important is that the LSTCN
approach is significantly faster than these state-of-the-art models.
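The key claim above is that each chained block is fitted with a fast, deterministic learning rule rather than gradient descent. A minimal sketch of that idea, assuming a ridge-regression-style closed-form update per data chunk (the class and function names, the sigmoid transfer function, and the way prior knowledge is passed between blocks are illustrative assumptions, not the authors' exact formulation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class STCNBlock:
    """One block in the chain: frozen prior weights w1 carry knowledge
    from the previous block; output weights w2 are learned in closed
    form, so fitting a chunk is a single linear solve, not an
    iterative gradient-based procedure."""

    def __init__(self, w1, lam=1e-2):
        self.w1 = w1    # prior knowledge transferred from the previous block
        self.lam = lam  # ridge penalty for a stable, deterministic solution
        self.w2 = None

    def fit(self, x_chunk, y_chunk):
        h = sigmoid(x_chunk @ self.w1)  # hidden temporal state for this chunk
        a = h.T @ h + self.lam * np.eye(h.shape[1])
        self.w2 = np.linalg.solve(a, h.T @ y_chunk)  # deterministic rule
        return self

    def predict(self, x_chunk):
        return sigmoid(x_chunk @ self.w1) @ self.w2

def online_forecast(chunks, n_features, rng):
    """Online setting: each incoming chunk gets a fresh block whose
    prior weights are the previous block's learned output weights."""
    w_prior = rng.normal(size=(n_features, n_features))
    preds = []
    for x, y in chunks:
        block = STCNBlock(w_prior).fit(x, y)
        preds.append(block.predict(x))
        w_prior = block.w2  # pass learned knowledge to the next block
    return preds
```

Because each update reduces to solving one small linear system, retraining on a new batch costs far less than backpropagating through an RNN, which is the property the abstract attributes to LSTCNs.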
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Properties and Potential Applications of Random Functional-Linked Types
of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structure.
This paper gives some insights into the properties of RFLNNs from the viewpoints of frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z) - Online Evolutionary Neural Architecture Search for Multivariate
Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z) - Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling
and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z) - Learning Fast and Slow for Online Time Series Forecasting [76.50127663309604]
Fast and Slow learning Networks (FSNet) is a holistic framework for online time-series forecasting.
FSNet balances fast adaptation to recent changes and retrieving similar old knowledge.
Our code will be made publicly available.
arXiv Detail & Related papers (2022-02-23T18:23:07Z) - A Comparative Study of Detecting Anomalies in Time Series Data Using
LSTM and TCN Models [2.007262412327553]
This paper compares two prominent deep learning modeling techniques.
The Recurrent Neural Network (RNN)-based Long Short-Term Memory (LSTM) and the Convolutional Neural Network (CNN)-based Temporal Convolutional Network (TCN) are compared.
arXiv Detail & Related papers (2021-12-17T02:46:55Z) - Multimodal Meta-Learning for Time Series Regression [3.135152720206844]
We will explore the idea of using meta-learning for quickly adapting model parameters to new short-history time series.
We show empirically that our proposed meta-learning method learns TSR with few data fast and outperforms the baselines in 9 of 12 experiments.
arXiv Detail & Related papers (2021-08-05T20:50:18Z) - Long Short-term Cognitive Networks [2.2748974006378933]
We present a recurrent neural system named Long Short-term Cognitive Networks (LSTCNs) as a generalisation of the Short-term Cognitive Network (STCN) model.
Our neural system reports small forecasting errors while being up to thousands of times faster than state-of-the-art recurrent models.
arXiv Detail & Related papers (2021-06-30T17:42:09Z) - On the Evaluation of Sequential Machine Learning for Network Intrusion
Detection [3.093890460224435]
We propose a detailed methodology to extract temporal sequences of NetFlows that denote patterns of malicious activities.
We then apply this methodology to compare the efficacy of sequential learning models against traditional static learning models.
arXiv Detail & Related papers (2021-06-15T08:29:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.