Time Series Forecasting via Semi-Asymmetric Convolutional Architecture
with Global Atrous Sliding Window
- URL: http://arxiv.org/abs/2301.13691v1
- Date: Tue, 31 Jan 2023 15:07:31 GMT
- Title: Time Series Forecasting via Semi-Asymmetric Convolutional Architecture
with Global Atrous Sliding Window
- Authors: Yuanpeng He
- Abstract summary: The proposed method in this paper is designed to address the problem of time series forecasting.
Most modern models focus only on a short range of information, which is fatal for problems such as time series forecasting.
We make three main contributions that are experimentally verified to have performance advantages.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The method proposed in this paper is designed to address the problem of time series forecasting. Although some exquisitely designed models achieve excellent prediction performance, how to extract more useful information and make accurate predictions remains an open issue. Most modern models focus only on a short range of information, which is fatal for problems such as time series forecasting that require capturing long-term characteristics. The main concern of this work is therefore to further mine the relationship between local and global information contained in a time series to produce more precise predictions. To this end, we make three main contributions that are experimentally verified to have performance advantages. First, the original time series is transformed into a difference sequence, which serves as the input to the proposed model. Second, we introduce a global atrous sliding window into the forecasting model; it draws on the concept of fuzzy time series to associate relevant global information with the temporal data within a time period, and uses a central-bidirectional atrous algorithm to capture underlying related features while ensuring the validity and consistency of the captured data. Third, a variant of the widely used asymmetric convolution, called semi-asymmetric convolution, is devised to extract relationships between adjacent elements and their associated global features more flexibly, with adjustable convolution ranges in the vertical and horizontal directions. The proposed model achieves state-of-the-art results on most of the provided time series datasets compared with competitive modern models.
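To make the three contributions more concrete, the sketch below shows one plausible reading of them in PyTorch: first-order differencing of the input series, a central-bidirectional atrous (dilated) window that pairs each time step with spaced-out global context, and a convolution whose vertical and horizontal kernel extents can be chosen independently. The window size, dilation rate, kernel sizes, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming first-order differencing, a 5-step window with
# dilation 4, and a 5x3 kernel; these choices are illustrative only.
import torch
import torch.nn as nn


def difference_sequence(x: torch.Tensor) -> torch.Tensor:
    """Contribution 1: first-order differencing, d[t] = x[t] - x[t-1]."""
    return x[:, 1:] - x[:, :-1]


def global_atrous_window(x: torch.Tensor, window: int = 5, dilation: int = 4) -> torch.Tensor:
    """Contribution 2: for each time step, gather `window` values spaced by
    `dilation` on both sides of the centre step (central-bidirectional),
    pairing every local step with dilated global context."""
    b, t = x.shape
    half = window // 2
    offsets = torch.arange(-half, half + 1) * dilation           # symmetric dilated offsets
    idx = torch.arange(t).unsqueeze(1) + offsets.unsqueeze(0)    # (t, window)
    idx = idx.clamp(0, t - 1)                                    # clamp at the series edges
    return x[:, idx]                                             # (b, t, window)


class SemiAsymmetricConv(nn.Module):
    """Contribution 3: a convolution whose vertical (temporal) and horizontal
    (window) kernel extents are set independently."""

    def __init__(self, in_ch: int, out_ch: int, kv: int = 5, kh: int = 3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=(kv, kh),
                              padding=(kv // 2, kh // 2))

    def forward(self, feat: torch.Tensor) -> torch.Tensor:       # feat: (b, c, t, window)
        return self.conv(feat)


# Toy usage: a batch of 8 series of length 64.
series = torch.randn(8, 64)
diff = difference_sequence(series)                               # (8, 63)
windows = global_atrous_window(diff).unsqueeze(1)                # (8, 1, 63, 5)
features = SemiAsymmetricConv(1, 16)(windows)                    # (8, 16, 63, 5)
```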
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z) - DAM: Towards A Foundation Model for Time Series Forecasting [0.8231118867997028]
We propose a neural model that takes randomly sampled histories and outputs an adjustable basis composition as a continuous function of time.
It involves three key components: (1) a flexible approach for using randomly sampled histories from a long-tail distribution; (2) a transformer backbone trained on these actively sampled histories to produce, as representational output, (3) the basis coefficients of a continuous function of time.
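As a rough illustration of the "adjustable basis composition as a continuous function of time" idea, a model that emits basis coefficients can be queried at arbitrary real-valued times by summing weighted basis functions. The sinusoidal basis, frequency choices, and coefficient shapes below are assumptions for illustration, not DAM's actual design.

```python
import numpy as np


def evaluate_basis_composition(coeffs: np.ndarray, freqs: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Evaluate y(t) = sum_k a_k*sin(2*pi*f_k*t) + b_k*cos(2*pi*f_k*t) at
    arbitrary continuous times t, given model-emitted coefficients."""
    a, b = coeffs[:, 0], coeffs[:, 1]                  # (K,) sine and cosine weights
    phases = 2 * np.pi * np.outer(t, freqs)            # (T, K)
    return np.sin(phases) @ a + np.cos(phases) @ b     # (T,)


# Toy usage: four hypothetical frequencies (weekly .. yearly cycles); the
# coefficients would come from the network, here they are random placeholders.
freqs = np.array([1 / 7, 1 / 30, 1 / 90, 1 / 365])
coeffs = np.random.randn(4, 2)
t_future = np.linspace(100.0, 130.0, 61)               # query any real-valued times
forecast = evaluate_basis_composition(coeffs, freqs, t_future)
```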
arXiv Detail & Related papers (2024-07-25T08:48:07Z) - Context Neural Networks: A Scalable Multivariate Model for Time Series Forecasting [5.5711773076846365]
Real-world time series often exhibit complex interdependencies that cannot be captured in isolation.
This paper introduces the Context Neural Network, an efficient linear complexity approach for augmenting time series models with relevant contextual insights.
arXiv Detail & Related papers (2024-05-12T00:21:57Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - Generative Time Series Forecasting with Diffusion, Denoise, and
Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded over a short time period, which results in a big gap between the capacity of deep models and the limited, noisy time series data.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Respecting Time Series Properties Makes Deep Time Series Forecasting
Perfect [3.830797055092574]
How to handle time features is the core question for any time series forecasting model.
In this paper, we rigorously analyze three prevalent but deficient/unfounded deep time series forecasting mechanisms.
We propose a novel time series forecasting network, i.e. RTNet, on the basis of aforementioned analysis.
arXiv Detail & Related papers (2022-07-22T08:34:31Z) - Split Time Series into Patches: Rethinking Long-term Series Forecasting
with Dateformer [17.454822366228335]
Time is one of the most significant characteristics of time series, yet it has received insufficient attention.
We propose Dateformer, which turns attention to modeling time itself instead of following the above practice.
Dateformer yields state-of-the-art accuracy with a remarkable 40% relative improvement, and broadens the maximum credible forecasting range to a half-yearly level.
arXiv Detail & Related papers (2022-07-12T08:58:44Z) - Towards Spatio-Temporal Aware Traffic Time Series Forecasting--Full
Version [37.09531298150374]
Traffic time series forecasting is challenging because the same time series patterns may vary across time; for example, some periods of a day show stronger temporal correlations than others.
Existing temporal models employ a shared parameter space irrespective of time locations and time periods, implicitly assuming that temporal correlations are similar across locations and stable over time, which does not always hold.
We propose a framework that aims to turn spatio-temporal agnostic models into spatio-temporal aware models.
arXiv Detail & Related papers (2022-03-29T16:44:56Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
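A minimal sketch of a cluster-then-forecast pipeline in this spirit, assuming k-means for the clustering stage and a per-cluster linear autoregressive model for the forecasting stage; the combination stage and all hyperparameters are simplified assumptions rather than the paper's exact framework.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression


def lagged_matrix(series: np.ndarray, p: int):
    """Stack lagged windows (X) and next values (y) for an AR(p) fit."""
    X = np.stack([series[:, i:i + p] for i in range(series.shape[1] - p)], axis=1)
    y = series[:, p:]
    return X.reshape(-1, p), y.reshape(-1)


def cluster_and_forecast(series: np.ndarray, n_clusters: int = 3, p: int = 4) -> np.ndarray:
    """Stage 1: cluster the series; stage 2: fit one AR(p) per cluster;
    return a one-step-ahead forecast for every series."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(series)
    forecasts = np.zeros(series.shape[0])
    for c in range(n_clusters):
        members = series[labels == c]
        X, y = lagged_matrix(members, p)
        model = LinearRegression().fit(X, y)
        forecasts[labels == c] = model.predict(members[:, -p:])
    return forecasts


# Toy usage: 50 random-walk series of length 80.
data = np.random.randn(50, 80).cumsum(axis=1)
print(cluster_and_forecast(data)[:5])
```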
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing, in the spectral domain, the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove the time series' noise.
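The spectral-filtering idea can be caricatured with a plain FFT-based filter that re-weights frequency components; the soft gate over frequency magnitudes below is an illustrative stand-in, not the SA module itself.

```python
import numpy as np


def spectral_filter(x: np.ndarray, keep: int = 8) -> np.ndarray:
    """Move to the frequency domain, softly down-weight all but the `keep`
    strongest frequencies, and transform back."""
    spectrum = np.fft.rfft(x)
    magnitude = np.abs(spectrum)
    threshold = np.sort(magnitude)[-keep]
    weights = 1.0 / (1.0 + np.exp(-(magnitude - threshold)))   # soft gate per frequency
    return np.fft.irfft(spectrum * weights, n=len(x))


# Toy usage: a noisy daily-seasonal series.
t = np.arange(256)
x = np.sin(2 * np.pi * t / 24) + 0.3 * np.random.randn(256)
smoothed = spectral_filter(x)
```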
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
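For the set-level co-occurrence graph mentioned above, one simple construction is to count how often pairs of elements appear together in the same set across a sequence of sets; the item IDs and basket data below are hypothetical.

```python
import numpy as np
from itertools import combinations


def cooccurrence_graph(set_sequence: list, n_items: int) -> np.ndarray:
    """Adjacency matrix A where A[i, j] counts how often items i and j
    appear together in the same set across the sequence of sets."""
    A = np.zeros((n_items, n_items))
    for s in set_sequence:
        for i, j in combinations(sorted(s), 2):
            A[i, j] += 1
            A[j, i] += 1
    return A


# Toy usage: one user's baskets over four time steps (item IDs 0..4).
baskets = [{0, 1, 2}, {1, 2}, {0, 3}, {2, 3, 4}]
print(cooccurrence_graph(baskets, n_items=5))
```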
This list is automatically generated from the titles and abstracts of the papers on this site.