Online Evolutionary Neural Architecture Search for Multivariate
Non-Stationary Time Series Forecasting
- URL: http://arxiv.org/abs/2302.10347v1
- Date: Mon, 20 Feb 2023 22:25:47 GMT
- Title: Online Evolutionary Neural Architecture Search for Multivariate
Non-Stationary Time Series Forecasting
- Authors: Zimeng Lyu, Alexander Ororbia, Travis Desell
- Abstract summary: This work presents the Online NeuroEvolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
- Score: 72.89994745876086
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Time series forecasting (TSF) is one of the most important tasks in data
science, given that accurate time series (TS) predictive models play a
major role across a wide variety of domains including finance, transportation,
health care, and power systems. Real-world utilization of machine learning (ML)
typically involves (pre-)training models on collected, historical data and then
applying them to unseen data points. However, in real-world applications, time
series data streams are usually non-stationary, and trained ML models face,
over time, the problem of data or concept drift.
To address this issue, models must be periodically retrained or redesigned,
which takes significant human and computational resources. Additionally,
historical data may not even exist for re-training or re-designing models. As a
result, it is highly desirable that models are designed and trained in an
online fashion. This work presents the Online NeuroEvolution-based Neural
Architecture Search (ONE-NAS) algorithm, which is a novel neural architecture
search method capable of automatically designing and dynamically training
recurrent neural networks (RNNs) for online forecasting tasks. Without any
pre-training, ONE-NAS utilizes populations of RNNs that are continuously
updated with new network structures and weights in response to new multivariate
input data. ONE-NAS is tested on real-world, large-scale multivariate wind
turbine data as well as the univariate Dow Jones Industrial Average (DJIA)
dataset. Results demonstrate that ONE-NAS outperforms traditional statistical
time series forecasting methods, including online linear regression, fixed long
short-term memory (LSTM) and gated recurrent unit (GRU) models trained online,
as well as state-of-the-art, online ARIMA strategies.
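The loop sketched below is a minimal illustration of the online neuroevolution idea described above: a population of small RNNs is evaluated on each newly arriving window of the stream, the best forecasters are kept, and the population is refilled with perturbed copies. It is not the authors' implementation; the TinyRNN class, the weight-perturbation mutation (structural mutations such as adding nodes or recurrent edges are omitted), and truncation selection are simplifying assumptions.
```python
# Illustrative sketch of an online neuroevolution loop in the spirit of ONE-NAS.
# NOT the authors' implementation: TinyRNN, the mutation operator, and the
# selection scheme are simplified placeholders.
import copy
import numpy as np

class TinyRNN:
    """Placeholder Elman-style RNN standing in for an evolvable recurrent network."""
    def __init__(self, n_in, n_hidden, n_out, rng):
        self.n_hidden = n_hidden
        self.Wx = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.Wh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.Wo = rng.normal(0.0, 0.1, (n_out, n_hidden))

    def forecast(self, window):
        # Roll the hidden state over the most recent inputs, then read out a prediction.
        h = np.zeros(self.n_hidden)
        for x in window:
            h = np.tanh(self.Wx @ x + self.Wh @ h)
        return self.Wo @ h

    def mutate(self, rng, scale=0.05):
        # Weight-perturbation mutation only; ONE-NAS also evolves network structure.
        child = copy.deepcopy(self)
        child.Wx = child.Wx + rng.normal(0.0, scale, child.Wx.shape)
        child.Wh = child.Wh + rng.normal(0.0, scale, child.Wh.shape)
        child.Wo = child.Wo + rng.normal(0.0, scale, child.Wo.shape)
        return child

def online_neuroevolution(stream, n_in, n_out, pop_size=20, window=16, seed=0):
    """Evaluate the population on each new window, keep the best forecasters,
    and refill the population with mutated offspring -- no pre-training."""
    rng = np.random.default_rng(seed)
    population = [TinyRNN(n_in, int(rng.integers(4, 16)), n_out, rng)
                  for _ in range(pop_size)]
    history = []
    for x_t, y_t in stream:                     # (input features, target) pairs arriving online
        history.append(x_t)
        if len(history) < window:
            continue
        recent = np.asarray(history[-window:])
        errors = [float(np.mean((ind.forecast(recent) - y_t) ** 2))
                  for ind in population]
        ranked = [ind for _, ind in sorted(zip(errors, population), key=lambda p: p[0])]
        survivors = ranked[: pop_size // 2]     # truncation selection
        offspring = [survivors[int(rng.integers(len(survivors)))].mutate(rng)
                     for _ in range(pop_size - len(survivors))]
        population = survivors + offspring
        yield ranked[0].forecast(recent)        # best individual's current forecast
```
Because every update uses only the most recent window of the stream, the population can keep adapting to drifting dynamics without pre-training or access to the full history.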
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Brain-Inspired Spiking Neural Network for Online Unsupervised Time Series Prediction [13.521272923545409]
We present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN).
CLURSNN makes online predictions by reconstructing the underlying dynamical system using Random Delay Embedding.
We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
arXiv Detail & Related papers (2023-04-10T16:18:37Z)
- A data filling methodology for time series based on CNN and (Bi)LSTM neural networks [0.0]
We develop two Deep Learning models aimed at filling data gaps in time series obtained from monitored apartments in Bolzano, Italy.
Our approach manages to capture the fluctuating nature of the data and shows good accuracy in reconstructing the target time series.
arXiv Detail & Related papers (2022-04-21T09:40:30Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- ONE-NAS: An Online NeuroEvolution based Neural Architecture Search for Time Series Forecasting [3.3758186776249928]
This work presents the Online NeuroEvolution based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is the first neural architecture search algorithm capable of automatically designing and training new recurrent neural networks (RNNs) in an online setting.
It is shown to outperform traditional statistical time series forecasting methods, including naive, moving average, and exponential smoothing approaches.
arXiv Detail & Related papers (2022-02-27T22:58:32Z)
- Online learning of windmill time series using Long Short-term Cognitive Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach achieved the lowest forecasting errors compared with a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z)
- Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
It handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
arXiv Detail & Related papers (2021-04-11T16:37:32Z)
- CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning from Sporadic Temporal Data [1.8352113484137622]
In this paper, a novel deep learning-based model is developed for modeling multiple temporal features in sporadic data.
The proposed model, called CARRNN, uses a generalized discrete-time autoregressive model that is trainable end-to-end using neural networks modulated by time lags.
It is applied to multivariate time-series regression tasks using data provided for Alzheimer's disease progression modeling and intensive care unit (ICU) mortality rate prediction.
arXiv Detail & Related papers (2021-04-08T12:43:44Z)
- Industrial Forecasting with Exponentially Smoothed Recurrent Neural Networks [0.0]
We present a class of exponentially smoothed recurrent neural networks (RNNs) that are well suited to modeling non-stationary dynamical systems arising in industrial applications.
Applications of exponentially smoothed RNNs to forecasting electricity load, weather data, and stock prices highlight the efficacy of exponential smoothing of the hidden state for multi-step time series forecasting (see the sketch after this list).
arXiv Detail & Related papers (2020-04-09T17:53:49Z)
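The last entry above attributes its gains to exponential smoothing of the recurrent hidden state. Below is a minimal sketch of that idea, assuming a plain Elman-style cell rather than the specific architecture from the cited paper.
```python
# Minimal sketch: exponential smoothing of an RNN hidden state.
# The Elman-style cell here is an assumed stand-in, not the cited paper's model.
import numpy as np

def smoothed_rnn_forecast(X, Wx, Wh, Wo, alpha=0.2):
    """Run a recurrent cell over the input window X and exponentially smooth
    its hidden state:  h_smooth_t = alpha * h_t + (1 - alpha) * h_smooth_{t-1}.
    The forecast is read out from the smoothed state."""
    h = np.zeros(Wh.shape[0])
    h_smooth = np.zeros(Wh.shape[0])
    for x in X:
        h = np.tanh(Wx @ x + Wh @ h)                      # ordinary recurrent update
        h_smooth = alpha * h + (1.0 - alpha) * h_smooth   # smoothed hidden state
    return Wo @ h_smooth                                  # forecast from the smoothed state
```
Smoothing the hidden state acts as a low-pass filter on the recurrent dynamics, which is what makes the approach attractive for the non-stationary industrial series mentioned above.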