LAVARNET: Neural Network Modeling of Causal Variable Relationships for
Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2009.00945v1
- Date: Wed, 2 Sep 2020 10:57:28 GMT
- Title: LAVARNET: Neural Network Modeling of Causal Variable Relationships for
Multivariate Time Series Forecasting
- Authors: Christos Koutlis, Symeon Papadopoulos, Manos Schinas, Ioannis
Kompatsiaris
- Abstract summary: A novel neural network-based architecture is proposed, termed LAgged VAriable Representation NETwork (LAVARNET).
It intrinsically estimates the importance of latent lagged variables and combines high dimensional representations of them to predict future values of time series.
Our model is compared with other baseline and state of the art neural network architectures on one simulated data set and four real data sets from the meteorology, music, solar activity, and finance areas.
- Score: 18.89688469820947
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multivariate time series forecasting is of great importance to many
scientific disciplines and industrial sectors. The evolution of a multivariate
time series depends on the dynamics of its variables and the connectivity
network of causal interrelationships among them. Most existing time series
models do not account for the causal effects among the system's variables,
and even those that do rely merely on determining the between-variables
causality network. Knowing the structure of such a complex
network and even more specifically knowing the exact lagged variables that
contribute to the underlying process is crucial for the task of multivariate
time series forecasting. The latter is a rather unexplored source of
information to leverage. In this direction, here a novel neural network-based
architecture is proposed, termed LAgged VAriable Representation NETwork
(LAVARNET), which intrinsically estimates the importance of lagged variables
and combines high dimensional latent representations of them to predict future
values of time series. Our model is compared with other baseline and state of
the art neural network architectures on one simulated data set and four real
data sets from meteorology, music, solar activity, and finance areas. The
proposed architecture outperforms the competitive architectures in most of the
experiments.
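The abstract's core mechanism can be illustrated with a toy sketch: each lagged variable gets a learned importance weight, and a weighted combination of per-lag embeddings feeds a forecast head. This is a minimal, hypothetical illustration of the idea, not the authors' implementation; all parameter names, shapes, and the random initialisation are assumptions, and training is omitted entirely.

```python
import numpy as np

# Hypothetical sketch of the lagged-variable idea (not the authors' code):
# each lagged variable X[t-l, k] receives a learned importance weight, and a
# weighted combination of per-lag embeddings feeds the forecast head.

rng = np.random.default_rng(0)

K, L, D = 3, 5, 8          # variables, max lag, embedding dimension
T = 100                    # series length

X = rng.standard_normal((T, K))                  # toy multivariate series

# Illustrative parameters (randomly initialised; training is omitted).
W_embed = rng.standard_normal((K, L, D)) * 0.1   # per-(variable, lag) embeddings
a = rng.standard_normal((K, L))                  # raw importance scores
W_out = rng.standard_normal((K * D, K)) * 0.1    # linear forecast head

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def forecast_one_step(X, t):
    """Predict X[t] from the L most recent lags X[t-L:t]."""
    lags = X[t - L:t][::-1]                      # lags[l] = X[t-1-l], shape (L, K)
    alpha = softmax(a, axis=1)                   # importance per (variable, lag)
    # Weighted embedding per variable: sum over lags of weight * value * embedding.
    h = np.einsum('kl,lk,kld->kd', alpha, lags, W_embed)   # shape (K, D)
    return h.reshape(-1) @ W_out                 # shape (K,) one-step forecast

y_hat = forecast_one_step(X, T - 1)
print(y_hat.shape)                               # (3,)
```

The softmax over the lag axis makes the importance weights for each variable sum to one, so the model effectively selects which lags of which variables matter, which is the source of information the abstract highlights.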
Related papers
- Multi-Source Knowledge-Based Hybrid Neural Framework for Time Series Representation Learning [2.368662284133926]
The proposed hybrid architecture addresses limitations by combining both domain-specific knowledge and implicit knowledge of the relational structure underlying the MTS data.
The architecture shows promising results on multiple benchmark datasets, outperforming state-of-the-art forecasting methods.
arXiv Detail & Related papers (2024-08-22T13:58:55Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Pay Attention to Evolution: Time Series Forecasting with Deep Graph-Evolution Learning [33.79957892029931]
This work presents a novel neural network architecture for time-series forecasting.
We named our method Recurrent Graph Evolution Neural Network (ReGENN).
An extensive set of experiments was conducted comparing ReGENN with dozens of ensemble methods and classical statistical ones.
arXiv Detail & Related papers (2020-08-28T20:10:07Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- ForecastNet: A Time-Variant Deep Feed-Forward Neural Network Architecture for Multi-Step-Ahead Time-Series Forecasting [6.043572971237165]
We propose ForecastNet, which uses a deep feed-forward architecture to provide a time-variant model.
ForecastNet is demonstrated to outperform statistical and deep learning benchmark models on several datasets.
arXiv Detail & Related papers (2020-02-11T01:03:33Z)
- Temporal Tensor Transformation Network for Multivariate Time Series Prediction [1.2354076490479515]
We present a novel deep learning architecture, known as the Temporal Tensor Transformation Network, which transforms the original time series into a higher-order tensor.
This yields a new representation of the original multivariate time series, which enables the convolution kernel to extract complex and non-linear features as well as variable interactional signals from a relatively large temporal region.
Experimental results show that Temporal Transformation Network outperforms several state-of-the-art methods on window-based predictions across various tasks.
arXiv Detail & Related papers (2020-01-04T07:28:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.