A Deep Structural Model for Analyzing Correlated Multivariate Time
Series
- URL: http://arxiv.org/abs/2001.00559v1
- Date: Thu, 2 Jan 2020 18:48:29 GMT
- Title: A Deep Structural Model for Analyzing Correlated Multivariate Time
Series
- Authors: Changwei Hu, Yifan Hu, Sungyong Seo
- Abstract summary: We present a deep learning structural time series model which can handle correlated multivariate time series input.
The model explicitly learns/extracts the trend, seasonality, and event components.
We compare our model with several state-of-the-art methods through a comprehensive set of experiments on a variety of time series data sets.
- Score: 11.009809732645888
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multivariate time series are routinely encountered in real-world
applications, and in many cases, these time series are strongly correlated. In
this paper, we present a deep learning structural time series model which can
(i) handle correlated multivariate time series input, and (ii) forecast the
targeted temporal sequence by explicitly learning/extracting the trend,
seasonality, and event components. The trend is learned via a 1D and 2D
temporal CNN and LSTM hierarchical neural net. The CNN-LSTM architecture can
(i) seamlessly leverage the dependency among multiple correlated time series in
a natural way, (ii) extract the weighted differencing feature for better trend
learning, and (iii) memorize long-term sequential patterns. The seasonality
component is approximated via a non-linear function of a set of Fourier terms,
and the event components are learned by a simple linear function of regressors
encoding the event dates. We compare our model with several state-of-the-art
methods through a comprehensive set of experiments on a variety of time series
data sets, such as forecasts of Amazon AWS Simple Storage Service (S3) and
Elastic Compute Cloud (EC2) billings, and the closing prices for corporate
stocks in the same category.
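The seasonality component described above, built from a set of Fourier terms, can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the paper passes the Fourier terms through a learned non-linear function, whereas here a plain linear combination is fitted by least squares on a toy series; the function name `fourier_terms` and all parameter values are assumptions for illustration.

```python
import numpy as np

def fourier_terms(t, period, n_harmonics):
    """Design matrix of Fourier features sin(2*pi*k*t/P), cos(2*pi*k*t/P)
    for harmonics k = 1..n_harmonics; shape (len(t), 2*n_harmonics)."""
    t = np.asarray(t, dtype=float)
    cols = []
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(cols)

# Toy weekly-seasonal series: 10 weeks of daily observations, period 7.
t = np.arange(70)
y = 2.0 * np.sin(2 * np.pi * t / 7) + 0.5 * np.cos(2 * np.pi * 2 * t / 7)

# Fit a linear combination of the Fourier features (stand-in for the
# paper's non-linear mapping) and recover the seasonal signal.
X = fourier_terms(t, period=7, n_harmonics=3)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
seasonality = X @ coef

print(float(np.abs(y - seasonality).max()))  # residual is near zero
```

Event components would be handled analogously with a design matrix of indicator (dummy) columns marking event dates, combined linearly as the abstract states.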
Related papers
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - TPRNN: A Top-Down Pyramidal Recurrent Neural Network for Time Series
Forecasting [7.08506873242564]
Time series have multi-scale characteristics, i.e., different temporal patterns at different scales.
We propose TPRNN, a Top-down Pyramidal Recurrent Neural Network for time series forecasting.
TPRNN achieves state-of-the-art performance, with an average improvement of 8.13% in MSE over the best baseline.
arXiv Detail & Related papers (2023-12-11T12:21:45Z) - FocusLearn: Fully-Interpretable, High-Performance Modular Neural Networks for Time Series [0.3277163122167434]
This paper proposes a novel modular neural network model for time series prediction that is interpretable by construction.
A recurrent neural network learns the temporal dependencies in the data while an attention-based feature selection component selects the most relevant features.
A modular deep network is trained from the selected features independently to show the users how features influence outcomes, making the model interpretable.
arXiv Detail & Related papers (2023-11-28T14:51:06Z) - TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series [57.4208255711412]
Building on copula theory, we propose a simplified objective for the recently-introduced transformer-based attentional copulas (TACTiS)
We show that the resulting model has significantly better training dynamics and achieves state-of-the-art performance across diverse real-world forecasting tasks.
arXiv Detail & Related papers (2023-10-02T16:45:19Z) - Features Fusion Framework for Multimodal Irregular Time-series Events [6.497816402045097]
Multimodal irregular time-series events have different sampling frequencies, data compositions, temporal relations, and characteristics.
In this paper, a feature fusion framework for multimodal irregular time-series events is proposed, based on Long Short-Term Memory (LSTM) networks.
Experiments on the MIMIC-III dataset demonstrate that the proposed framework significantly outperforms existing methods in terms of AUC (area under the Receiver Operating Characteristic curve) and AP (Average Precision).
arXiv Detail & Related papers (2022-09-05T02:27:12Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Time Series is a Special Sequence: Forecasting with Sample Convolution
and Interaction [9.449017120452675]
Time series is a special type of sequence data, a set of observations collected at even intervals of time and ordered chronologically.
Existing deep learning techniques use generic sequence models for time series analysis, which ignore some of its unique properties.
We propose a novel neural network architecture and apply it for the time series forecasting problem, wherein we conduct sample convolution and interaction at multiple resolutions for temporal modeling.
arXiv Detail & Related papers (2021-06-17T08:15:04Z) - Pay Attention to Evolution: Time Series Forecasting with Deep
Graph-Evolution Learning [33.79957892029931]
This work presents a novel neural network architecture for time-series forecasting.
We name our method the Recurrent Graph Evolution Neural Network (ReGENN).
An extensive set of experiments was conducted comparing ReGENN with dozens of ensemble methods and classical statistical ones.
arXiv Detail & Related papers (2020-08-28T20:10:07Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.