Tsformer: Time series Transformer for tourism demand forecasting
- URL: http://arxiv.org/abs/2107.10977v1
- Date: Thu, 22 Jul 2021 06:33:20 GMT
- Title: Tsformer: Time series Transformer for tourism demand forecasting
- Authors: Siyuan Yi, Xing Chen, Chuanming Tang
- Abstract summary: We propose a time series Transformer (Tsformer) with an Encoder-Decoder architecture for tourism demand forecasting.
The proposed Tsformer encodes long-term dependency with the encoder, captures short-term dependency with the decoder, and simplifies the attention interactions.
Experiments conducted on the Jiuzhaigou valley and Siguniang mountain tourism demand datasets against nine baseline methods indicate that the proposed Tsformer outperforms all baseline models.
- Score: 1.0883306593668278
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: AI-based methods have been widely applied to tourism demand forecasting.
However, current AI-based methods struggle to capture long-term dependency,
and most of them lack interpretability. The Transformer, initially used for
machine translation, shows a remarkable ability to process long-term
dependency. Based on the Transformer, we propose a time series Transformer
(Tsformer) with an Encoder-Decoder architecture for tourism demand
forecasting. The proposed Tsformer encodes long-term dependency with the
encoder, captures short-term dependency with the decoder, and simplifies the
attention interactions while highlighting dominant attention through a series
of attention masking mechanisms. These improvements make the multi-head
attention mechanism process the input sequence according to its temporal
relationships, contributing to better interpretability. In addition, the
context processing ability of the Encoder-Decoder architecture allows the
calendar of the days to be forecasted to be used to enhance forecasting
performance. Experiments conducted on the Jiuzhaigou valley and Siguniang
mountain tourism demand datasets against nine baseline methods indicate that
the proposed Tsformer outperforms all baseline models in both short-term and
long-term tourism demand forecasting tasks. Moreover, ablation studies
demonstrate that adopting the calendar of the days to be forecasted improves
the forecasting performance of the proposed Tsformer. For better
interpretability, the attention weight matrices are visualized; the
visualization indicates that, in short-term forecasting, the Tsformer
concentrates on seasonal features and on days close to the days to be
forecasted.
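The abstract does not include code, but the architecture it describes (an encoder reading a long demand history, a decoder conditioned on calendar features of the days to be forecasted, and masked attention that respects time order) can be sketched briefly. The PyTorch sketch below is illustrative only: the module names, feature sizes, and the simple causal mask are assumptions, not the paper's actual masking mechanisms or hyperparameters.

```python
import torch
import torch.nn as nn

class CalendarTransformer(nn.Module):
    """Minimal encoder-decoder forecaster in the spirit of the Tsformer:
    the encoder reads a long demand history, the decoder reads calendar
    features of the days to be forecasted, and a causal mask keeps
    decoder self-attention aligned with time order. All names and sizes
    are illustrative, not the paper's. Positional encoding is omitted
    for brevity."""

    def __init__(self, d_model=64, n_heads=4, n_calendar=7, horizon=7):
        super().__init__()
        self.horizon = horizon
        self.value_proj = nn.Linear(1, d_model)              # past demand values
        self.calendar_proj = nn.Linear(n_calendar, d_model)  # known-in-advance features
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, history, future_calendar):
        # history: (B, L, 1) daily demand
        # future_calendar: (B, horizon, n_calendar) calendar of days to forecast
        src = self.value_proj(history)
        tgt = self.calendar_proj(future_calendar)
        # Causal mask: decoder position t attends only to positions <= t,
        # a simple stand-in for the paper's attention masking mechanisms.
        mask = nn.Transformer.generate_square_subsequent_mask(self.horizon)
        out = self.transformer(src, tgt, tgt_mask=mask)
        return self.head(out)                                # (B, horizon, 1)

# Usage sketch: forecast 7 days of demand from 90 days of history.
model = CalendarTransformer()
pred = model(torch.randn(8, 90, 1), torch.randn(8, 7, 7))
print(pred.shape)  # torch.Size([8, 7, 1])
```

Feeding the future calendar features in as the decoder input mirrors the abstract's point that the Encoder-Decoder architecture can exploit context that is known in advance of the forecasted days.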
Related papers
- TAT: Temporal-Aligned Transformer for Multi-Horizon Peak Demand Forecasting [51.37167759339485]
We propose Temporal-Aligned Transformer (TAT), a multi-horizon forecaster leveraging a-priori-known context variables for improving predictive performance.
Our model consists of an encoder and decoder, both embedded with a novel Temporal Alignment Attention (TAA) designed to learn context-dependent alignment for peak demand forecasting.
We demonstrate that TAT improves accuracy by up to 30% on peak demand forecasting while maintaining competitive overall performance compared to other state-of-the-art methods.
arXiv Detail & Related papers (2025-07-14T14:51:24Z) - Does Scaling Law Apply in Time Series Forecasting? [2.127584662240465]
We propose Alinear, an ultra-lightweight forecasting model that achieves competitive performance using only k-level parameters.
Experiments on seven benchmark datasets demonstrate that Alinear consistently outperforms large-scale models.
This work challenges the prevailing belief that larger models are inherently better and suggests a paradigm shift toward more efficient time series modeling.
arXiv Detail & Related papers (2025-05-15T11:04:39Z) - Ister: Inverted Seasonal-Trend Decomposition Transformer for Explainable Multivariate Time Series Forecasting [10.32586981170693]
We propose the Inverted Seasonal-Trend Decomposition Transformer (Ister) for explainable multivariate time series forecasting.
We introduce a novel Dot-attention mechanism that improves interpretability, computational efficiency, and predictive accuracy.
Ister enables intuitive visualization of component contributions, shedding light on the model's decision process and enhancing the transparency of prediction results.
arXiv Detail & Related papers (2024-12-25T06:37:19Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Local Attention Mechanism: Boosting the Transformer Architecture for Long-Sequence Time Series Forecasting [8.841114905151152]
Local Attention Mechanism (LAM) is an efficient attention mechanism tailored for time series analysis.
LAM exploits the continuity properties of time series to reduce the number of attention scores computed.
We present an algorithm for implementing LAM in tensor algebra that runs in O(n log n) time and memory.
arXiv Detail & Related papers (2024-10-04T11:32:02Z) - Multiscale Representation Enhanced Temporal Flow Fusion Model for Long-Term Workload Forecasting [19.426131129034115]
This paper proposes a novel framework leveraging self-supervised multiscale representation learning to capture both long-term and near-term workload patterns.
The long-term history is encoded through multiscale representations while the near-term observations are modeled via temporal flow fusion.
arXiv Detail & Related papers (2024-07-29T04:42:18Z) - Adapting to Length Shift: FlexiLength Network for Trajectory Prediction [53.637837706712794]
Trajectory prediction plays an important role in various applications, including autonomous driving, robotics, and scene understanding.
Existing approaches mainly focus on developing compact neural networks to increase prediction precision on public datasets, typically employing a standardized input duration.
We introduce a general and effective framework, the FlexiLength Network (FLN), to enhance the robustness of existing trajectory prediction against varying observation periods.
arXiv Detail & Related papers (2024-03-31T17:18:57Z) - Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series
Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z) - CARD: Channel Aligned Robust Blend Transformer for Time Series
Forecasting [50.23240107430597]
We design a special Transformer, i.e., Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI-type Transformers in time series forecasting.
First, CARD introduces a channel-aligned attention structure that allows it to capture temporal correlations among signals.
Second, in order to efficiently utilize the multi-scale knowledge, we design a token blend module to generate tokens with different resolutions.
Third, we introduce a robust loss function for time series forecasting to alleviate the potential overfitting issue.
arXiv Detail & Related papers (2023-05-20T05:16:31Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and
Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, although the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Temporal Saliency Detection Towards Explainable Transformer-based
Timeseries Forecasting [3.046315755726937]
This paper introduces Temporal Saliency Detection (TSD), an effective approach that builds upon the attention mechanism and applies it to multi-horizon time series prediction.
The TSD approach facilitates multi-resolution analysis of saliency patterns by condensing multiple attention heads, thereby progressively enhancing the forecasting of complex time series data.
arXiv Detail & Related papers (2022-12-15T12:47:59Z) - Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System [46.39662315849883]
We introduce CLMFormer, a novel framework that mitigates redundancy through curriculum learning and a memory-driven decoder.
CLMFormer consistently improves Transformer-based models by up to 30%, demonstrating its effectiveness in long-horizon forecasting.
arXiv Detail & Related papers (2022-07-16T04:05:15Z) - Autoformer: Decomposition Transformers with Auto-Correlation for
Long-Term Series Forecasting [68.86835407617778]
Autoformer is a novel decomposition architecture with an Auto-Correlation mechanism.
In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a relative improvement on six benchmarks.
arXiv Detail & Related papers (2021-06-24T13:43:43Z)