Respecting Time Series Properties Makes Deep Time Series Forecasting Perfect
- URL: http://arxiv.org/abs/2207.10941v1
- Date: Fri, 22 Jul 2022 08:34:31 GMT
- Title: Respecting Time Series Properties Makes Deep Time Series Forecasting Perfect
- Authors: Li Shen, Yuning Wei and Yangzhu Wang
- Abstract summary: How to handle time features is the core question for any time series forecasting model.
In this paper, we rigorously analyze three prevalent but deficient or unfounded deep time series forecasting mechanisms.
We propose a novel time series forecasting network, RTNet, on the basis of the aforementioned analysis.
- Score: 3.830797055092574
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: How to handle time features is the core question for any time series
forecasting model. Ironically, it is often ignored or misunderstood by
deep-learning-based models, even state-of-the-art baselines. This behavior makes
them inefficient, untenable and unstable. In this paper, we rigorously analyze
three prevalent but deficient or unfounded deep time series forecasting
mechanisms from the view of time series properties: normalization methods,
multivariate forecasting and input sequence length. Corresponding corollaries
and solutions are given on both empirical and theoretical grounds. We thereby
propose a novel time series forecasting network, RTNet, on the basis of the
aforementioned analysis. It is general enough to be combined with both
supervised and self-supervised forecasting formats. Thanks to the core idea of
respecting time series properties, RTNet shows clearly superior forecasting
performance in either format compared with dozens of other SOTA time series
forecasting baselines on three real-world benchmark datasets. By and large, it
also has lower time complexity and memory usage while achieving better
forecasting accuracy. The source code is available at
https://github.com/OrigamiSL/RTNet.
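
Normalization, one of the three mechanisms scrutinized above, is easy to picture in code. The following is a minimal, hypothetical sketch of the common pattern of normalizing each input window by its own statistics and mapping the forecast back afterwards; it is a generic illustration under my own assumptions (function names, shapes, the naive stand-in model), not RTNet's implementation or the paper's recommended fix.

```python
import numpy as np

def forecast_with_instance_norm(window: np.ndarray, model, eps: float = 1e-8) -> np.ndarray:
    """Normalize one input window by its own statistics, forecast, then de-normalize.

    `model` is any callable mapping a normalized window of shape (L, C)
    to a forecast of shape (H, C); it is a placeholder, not RTNet.
    """
    mean = window.mean(axis=0, keepdims=True)           # per-variate mean of this window only
    std = window.std(axis=0, keepdims=True) + eps       # per-variate std, guarded against zeros
    normalized = (window - mean) / std                  # statistics come solely from the input window
    prediction = model(normalized)                      # forecast in the normalized space
    return prediction * std + mean                      # map the forecast back to the original scale

# Toy usage with a naive "repeat the last value" model standing in for a real network.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    window = rng.normal(size=(96, 7))                   # input length 96, 7 variates
    naive_model = lambda x: np.repeat(x[-1:], 24, axis=0)  # horizon 24
    print(forecast_with_instance_norm(window, naive_model).shape)  # (24, 7)
```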
Related papers
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show it is a strong zero-shot baseline and benefits from further scaling, both in model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z)
- Time Series Forecasting via Semi-Asymmetric Convolutional Architecture with Global Atrous Sliding Window [0.0]
The proposed method in this paper is designed to address the problem of time series forecasting.
Most modern models focus only on a short range of information, which is fatal for problems such as time series forecasting.
We make three main contributions that are experimentally verified to have performance advantages.
arXiv Detail & Related papers (2023-01-31T15:07:31Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are commonly recorded over a short time period, which results in a big gap between deep models and the limited, noisy time series.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Retrieval Based Time Series Forecasting [37.48394754614059]
Time series data appears in a variety of applications such as smart transportation and environmental monitoring.
One of the fundamental problems for time series analysis is time series forecasting.
We show both theoretically and empirically that the uncertainty could be effectively reduced by retrieving relevant time series as references.
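
To make the retrieval idea concrete, here is a minimal, hypothetical sketch of forecasting a series by averaging the continuations of its most similar historical windows. It is my own illustration, not the paper's model: the Euclidean similarity on z-scored windows and the simple averaging of continuations are assumptions.

```python
import numpy as np

def retrieval_forecast(history: np.ndarray, query: np.ndarray, horizon: int, k: int = 3) -> np.ndarray:
    """Average the continuations of the k historical windows most similar to the query window."""
    L = len(query)
    q = (query - query.mean()) / (query.std() + 1e-8)    # z-score the query so shape, not scale, drives matching
    distances, continuations = [], []
    for start in range(len(history) - L - horizon + 1):
        window = history[start:start + L]
        w = (window - window.mean()) / (window.std() + 1e-8)
        distances.append(np.linalg.norm(w - q))           # Euclidean distance between normalized windows
        continuations.append(history[start + L:start + L + horizon])
    nearest = np.argsort(distances)[:k]                   # indices of the k most similar windows
    return np.mean([continuations[i] for i in nearest], axis=0)

# Toy usage on a noisy sine wave: forecast 24 steps from the latest 100 observations.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(2000)
    series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
    print(retrieval_forecast(series[:-100], series[-100:], horizon=24).shape)  # (24,)
```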
arXiv Detail & Related papers (2022-09-27T16:43:55Z)
- Split Time Series into Patches: Rethinking Long-term Series Forecasting with Dateformer [17.454822366228335]
Time is one of the most significant characteristics of time-series, yet has received insufficient attention.
We propose Dateformer, which turns attention to modeling time instead of following the above practice.
Dateformer yields state-of-the-art accuracy with a remarkable 40% relative improvement, and broadens the maximum credible forecasting range to a half-yearly level.
arXiv Detail & Related papers (2022-07-12T08:58:44Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Meta-Forecasting by combining Global Deep Representations with Local Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR).
It adapts to each time series by learning, in closed form, the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
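
Such a closed-form adaptation can be pictured as a ridge-regression readout fitted on top of frozen representations. The sketch below is an illustrative assumption, not the paper's code: the lagged-value "features" merely stand in for real RNN hidden states, and the ridge penalty is arbitrary.

```python
import numpy as np

def closed_form_readout(H: np.ndarray, y: np.ndarray, lam: float = 1.0) -> np.ndarray:
    """Ridge-regression readout w = (H^T H + lam * I)^{-1} H^T y, solved in closed form."""
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ y)

# Toy usage: lagged values stand in for the per-step representations a real RNN would produce.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = np.sin(np.arange(300) / 10) + 0.05 * rng.normal(size=300)
    p = 16                                                           # number of lags used as "features"
    H = np.stack([series[t - p:t] for t in range(p, len(series))])   # one feature vector per step
    y = series[p:]                                                   # matching one-step-ahead targets
    w = closed_form_readout(H, y)                                    # per-series adaptation in closed form
    next_value = float(series[-p:] @ w)                              # forecast the next, unseen step
    print(round(next_value, 3))
```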
arXiv Detail & Related papers (2021-11-05T11:45:02Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When the framework is instantiated with simple linear autoregressive models, it achieves state-of-the-art results on several benchmark datasets.
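
As a rough, hypothetical illustration of such a cluster-then-forecast pipeline (not the paper's implementation: k-means on summary statistics and one shared AR(p) model per cluster are my own simplifying assumptions), see the sketch below.

```python
import numpy as np
from sklearn.cluster import KMeans  # any clustering method would do; k-means is just a stand-in

def fit_ar_coefficients(group: np.ndarray, p: int) -> np.ndarray:
    """Least-squares AR(p) coefficients shared by every series in one cluster."""
    X = np.concatenate([[s[t - p:t] for t in range(p, len(s))] for s in group])
    y = np.concatenate([s[p:] for s in group])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def cluster_and_forecast(series: np.ndarray, n_clusters: int = 3, p: int = 8, horizon: int = 12) -> np.ndarray:
    """Cluster series on simple summary statistics, then forecast each cluster with its own AR(p)."""
    feats = np.stack([series.mean(axis=1), series.std(axis=1)], axis=1)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(feats)
    forecasts = np.zeros((len(series), horizon))
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        if idx.size == 0:
            continue
        w = fit_ar_coefficients(series[idx], p)
        for i in idx:                                     # roll each series' forecast forward
            hist = list(series[i, -p:])
            for _ in range(horizon):
                hist.append(float(np.dot(w, hist[-p:])))
            forecasts[i] = hist[p:]
    return forecasts
```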
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
- Series Saliency: Temporal Interpretation for Multivariate Time Series Forecasting [30.054015098590874]
We present the series saliency framework for temporal interpretation of time series forecasting.
We extract "series images" from sliding windows of the time series and apply saliency map segmentation.
Our framework generates temporal interpretations for the time series forecasting task while producing accurate forecasts.
arXiv Detail & Related papers (2020-12-16T23:48:00Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)