SSDNet: State Space Decomposition Neural Network for Time Series
Forecasting
- URL: http://arxiv.org/abs/2112.10251v1
- Date: Sun, 19 Dec 2021 20:35:16 GMT
- Title: SSDNet: State Space Decomposition Neural Network for Time Series
Forecasting
- Authors: Yang Lin, Irena Koprinska, Mashud Rana
- Abstract summary: SSDNet is a novel deep learning approach for time series forecasting.
Transformer architecture is used to learn the temporal patterns and estimate the parameters of the state space model.
We show that SSDNet is an effective method in terms of accuracy and speed, outperforming state-of-the-art deep learning and statistical methods.
- Score: 5.311025156596578
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we present SSDNet, a novel deep learning approach for time
series forecasting. SSDNet combines the Transformer architecture with state
space models to provide probabilistic and interpretable forecasts, including
trend and seasonality components and previous time steps important for the
prediction. The Transformer architecture is used to learn the temporal patterns
and estimate the parameters of the state space model directly and efficiently,
without the need for Kalman filters. We comprehensively evaluate the
performance of SSDNet on five data sets, showing that SSDNet is an effective
method in terms of accuracy and speed, outperforming state-of-the-art deep
learning and statistical methods, and able to provide meaningful trend and
seasonality components.
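The abstract's core idea, a trend plus seasonality state space model whose parameters are estimated directly by a network and rolled forward without a Kalman filter, can be sketched as follows. This is a hypothetical simplification: the function name `structural_forecast`, the fixed level/trend/seasonal transition, and the illustrative parameter values are all assumptions; in SSDNet the state space parameters would be produced per series by the Transformer.

```python
import numpy as np

def structural_forecast(level, trend, seasonal, horizon):
    """Roll a level + trend + seasonal state space model forward `horizon`
    steps without a Kalman filter: the state is propagated deterministically
    and each step's mean forecast is read out of the state."""
    s = np.asarray(seasonal, dtype=float)  # seasonal state, length = period
    preds = []
    for _ in range(horizon):
        y = level + s[0]   # observation = level + current seasonal term
        preds.append(y)
        level += trend     # local linear trend transition
        s = np.roll(s, -1) # rotate the seasonal components
    return np.array(preds)

# Illustrative numbers standing in for network-estimated parameters.
forecast = structural_forecast(level=10.0, trend=0.5,
                               seasonal=[1.0, -1.0, 0.0, 0.0], horizon=8)
```

Because the recursion is deterministic given the estimated state, the trend and seasonal contributions to each forecast remain separately inspectable, which is the source of the interpretability claim.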
Related papers
- Task-Oriented Real-time Visual Inference for IoVT Systems: A Co-design Framework of Neural Networks and Edge Deployment [61.20689382879937]
Task-oriented edge computing addresses the demands of real-time visual inference in IoVT systems by shifting data analysis to the edge.

Existing methods struggle to balance high model performance with low resource consumption.
We propose a novel co-design framework to optimize neural network architecture.
arXiv Detail & Related papers (2024-10-29T19:02:54Z)
- Context-Conditioned Spatio-Temporal Predictive Learning for Reliable V2V Channel Prediction [25.688521281119037]
Vehicle-to-Vehicle (V2V) channel state information (CSI) prediction is challenging and crucial for optimizing downstream tasks.
Traditional prediction approaches focus on four-dimensional (4D) CSI, which includes predictions over time, bandwidth, and antenna (TX and RX) space.
We propose a novel context-conditioned spatio-temporal predictive learning method to capture dependencies within 4D CSI data.
arXiv Detail & Related papers (2024-09-16T04:15:36Z)
- TCCT-Net: Two-Stream Network Architecture for Fast and Efficient Engagement Estimation via Behavioral Feature Signals [58.865901821451295]
We present a novel two-stream feature fusion "Tensor-Convolution and Convolution-Transformer Network" (TCCT-Net) architecture.
To better learn the meaningful patterns in the temporal-spatial domain, we design a "CT" stream that integrates a hybrid convolutional-transformer.
In parallel, to efficiently extract rich patterns from the temporal-frequency domain, we introduce a "TC" stream that uses Continuous Wavelet Transform (CWT) to represent information in a 2D tensor form.
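The "TC" stream's use of a Continuous Wavelet Transform to turn a 1D signal into a 2D tensor can be illustrated with a minimal NumPy sketch. The Morlet mother wavelet and the helper names `morlet` and `cwt_2d` are assumptions for illustration; the actual TCCT-Net pipeline is not specified in this summary.

```python
import numpy as np

def morlet(t, w=5.0):
    # Complex Morlet wavelet: a complex exponential under a Gaussian window.
    return np.exp(1j * w * t) * np.exp(-t**2 / 2)

def cwt_2d(signal, scales, w=5.0):
    """Continuous wavelet transform: correlate the signal with a scaled
    Morlet wavelet at each scale; the result is a 2D (scale x time) tensor."""
    n = len(signal)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        t = np.arange(-n // 2, n // 2) / s
        wavelet = morlet(t, w) / np.sqrt(s)
        coef = np.convolve(signal, np.conj(wavelet)[::-1], mode="same")
        out[i] = np.abs(coef)
    return out

sig = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 256))
tensor = cwt_2d(sig, scales=np.arange(1, 33))  # shape (32, 256)
```

The resulting scale-by-time magnitude map is the kind of 2D time-frequency representation a convolutional stream can then process like an image.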
arXiv Detail & Related papers (2024-04-15T06:01:48Z)
- A-SDM: Accelerating Stable Diffusion through Redundancy Removal and Performance Optimization [54.113083217869516]
In this work, we first explore the computational redundancy part of the network.
We then prune the redundancy blocks of the model and maintain the network performance.
Thirdly, we propose a global-regional interactive (GRI) attention to speed up the computationally intensive attention part.
arXiv Detail & Related papers (2023-12-24T15:37:47Z)
- Disentangling Spatial and Temporal Learning for Efficient Image-to-Video Transfer Learning [59.26623999209235]
We present DiST, which disentangles the learning of spatial and temporal aspects of videos.
The disentangled learning in DiST is highly efficient because it avoids the back-propagation of massive pre-trained parameters.
Extensive experiments on five benchmarks show that DiST delivers better performance than existing state-of-the-art methods by convincing gaps.
arXiv Detail & Related papers (2023-09-14T17:58:33Z)
- Deep Learning for Day Forecasts from Sparse Observations [60.041805328514876]
Deep neural networks offer an alternative paradigm for modeling weather conditions.
MetNet-3 learns from both dense and sparse data sensors and makes predictions up to 24 hours ahead for precipitation, wind, temperature and dew point.
MetNet-3 has high temporal and spatial resolution, up to 2 minutes and 1 km respectively, as well as low operational latency.
arXiv Detail & Related papers (2023-06-06T07:07:54Z)
- Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z)
- HigeNet: A Highly Efficient Modeling for Long Sequence Time Series Prediction in AIOps [30.963758935255075]
In this paper, we propose a highly efficient model named HigeNet to predict the long-time sequence time series.
We show that training time, resource usage and accuracy of the model are found to be significantly better than five state-of-the-art competing models.
arXiv Detail & Related papers (2022-11-13T13:48:43Z)
- Pre-training Enhanced Spatial-temporal Graph Neural Network for Multivariate Time Series Forecasting [13.441945545904504]
We propose a novel framework, in which STGNN is Enhanced by a scalable time series Pre-training model (STEP)
Specifically, we design a pre-training model to efficiently learn temporal patterns from very long-term history time series.
Our framework is capable of significantly enhancing downstream STGNNs, and our pre-training model aptly captures temporal patterns.
arXiv Detail & Related papers (2022-06-18T04:24:36Z)
- Spatiotemporal convolutional network for time-series prediction and causal inference [21.895413699349966]
A neural network computing framework was developed to efficiently and accurately render multistep-ahead predictions of a time series.
The framework has great potential in practical applications in artificial intelligence (AI) or machine learning fields.
arXiv Detail & Related papers (2021-07-03T06:20:43Z)
- ForecastNet: A Time-Variant Deep Feed-Forward Neural Network Architecture for Multi-Step-Ahead Time-Series Forecasting [6.043572971237165]
We propose ForecastNet, which uses a deep feed-forward architecture to provide a time-variant model.
ForecastNet is demonstrated to outperform statistical and deep learning benchmark models on several datasets.
arXiv Detail & Related papers (2020-02-11T01:03:33Z)
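The "time-variant" idea in the ForecastNet summary, a feed-forward model whose parameters differ across forecast steps rather than being shared as in a recurrent network, can be sketched as follows. This is a hypothetical simplification: the class name `TimeVariantForecaster`, the two-layer per-step networks, and the random untrained weights are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

class TimeVariantForecaster:
    """Each forecast step h gets its own weight set, so the mapping from
    input window to output varies with the forecast horizon (time-variant)."""

    def __init__(self, input_len, horizon, hidden=16):
        # One independent pair of weight matrices per forecast step.
        self.W1 = [rng.normal(0, 0.1, (input_len, hidden)) for _ in range(horizon)]
        self.W2 = [rng.normal(0, 0.1, (hidden, 1)) for _ in range(horizon)]

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        out = []
        for w1, w2 in zip(self.W1, self.W2):
            h = np.tanh(x @ w1)      # step-specific hidden layer
            out.append(float(h @ w2))
        return np.array(out)

model = TimeVariantForecaster(input_len=24, horizon=6)
preds = model.predict(np.sin(np.linspace(0, 4, 24)))  # 6 forecast steps
```

A time-invariant model would instead reuse one `(W1, W2)` pair for every step; giving each step its own parameters lets the model specialize to how the input relates to near versus distant horizons.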
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.