Deep Autoregressive Models with Spectral Attention
- URL: http://arxiv.org/abs/2107.05984v1
- Date: Tue, 13 Jul 2021 11:08:47 GMT
- Title: Deep Autoregressive Models with Spectral Attention
- Authors: Fernando Moreno-Pino, Pablo M. Olmos and Antonio Artés-Rodríguez
- Abstract summary: We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing the embedding of the time series in the spectral domain as realizations of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, one global and one local to the time series, integrate this information into the forecast and perform spectral filtering to remove noise from the time series.
- Score: 74.08846528440024
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is an important problem across many domains, playing
a crucial role in multiple real-world applications. In this paper, we propose a
forecasting architecture that combines deep autoregressive models with a
Spectral Attention (SA) module, which merges global and local frequency domain
information in the model's embedded space. By characterizing the embedding of
the time series in the spectral domain as realizations of a random process, our
method can identify global trends and seasonality patterns. Two spectral
attention models, one global and one local to the time series, integrate this
information into the forecast and perform spectral filtering to remove noise
from the time series. The proposed architecture has a number of useful
properties: it can be effectively incorporated into well-known forecasting
architectures, requiring a low number of parameters and producing
interpretable results that improve forecasting accuracy. We test the Spectral
Attention Autoregressive Model (SAAM) on several well-known forecasting
datasets, consistently demonstrating that our model compares favorably to
state-of-the-art approaches.
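As a rough illustration of the spectral-filtering idea described in the abstract (a minimal numpy sketch, not the paper's implementation; `keep_frac` and the hard frequency cutoff are illustrative assumptions), a noisy series with a trend and a seasonal component can be denoised by zeroing high-frequency bins of its Fourier transform:

```python
import numpy as np

def spectral_filter(x, keep_frac=0.1):
    """Keep only the lowest-frequency components of a 1-D series.

    Transform to the frequency domain, zero out high-frequency bins
    (treated as noise), and transform back. `keep_frac` is an
    illustrative parameter, not taken from the paper.
    """
    spec = np.fft.rfft(x)
    cutoff = max(1, int(len(spec) * keep_frac))
    spec[cutoff:] = 0.0  # discard high-frequency (noisy) bins
    return np.fft.irfft(spec, n=len(x))

# Synthetic series: global trend + seasonality + noise.
t = np.arange(256)
trend = 0.02 * t
season = np.sin(2 * np.pi * t / 32)
noise = np.random.default_rng(0).normal(scale=0.5, size=t.size)
x = trend + season + noise
smooth = spectral_filter(x, keep_frac=0.1)
```

With a period of 32 over 256 samples, the seasonal component sits at frequency index 8, well below the cutoff, so trend and seasonality survive the filter while most of the broadband noise is removed.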
Related papers
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, using a single input/output projection layer while delegating the modeling of diverse time series patterns to the sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z) - Learning Pattern-Specific Experts for Time Series Forecasting Under Patch-level Distribution Shift [30.581736814767606]
Time series forecasting aims to predict future values based on historical data.
Real-world time series often exhibit complex non-uniform distributions with patterns that vary across segments, such as season, operating condition, or semantic meaning.
We propose a novel architecture that leverages pattern-specific experts for more accurate and adaptable time series forecasting.
arXiv Detail & Related papers (2024-10-13T13:35:29Z) - Local Attention Mechanism: Boosting the Transformer Architecture for Long-Sequence Time Series Forecasting [8.841114905151152]
Local Attention Mechanism (LAM) is an efficient attention mechanism tailored for time series analysis.
LAM exploits the continuity properties of time series to reduce the number of attention scores computed.
We present an algorithm for implementing LAM in tensor algebra that runs in O(n log n) time and memory.
arXiv Detail & Related papers (2024-10-04T11:32:02Z) - SpectralEarth: Training Hyperspectral Foundation Models at Scale [47.93167977587301]
We introduce SpectralEarth, a large-scale multi-temporal dataset designed to pretrain hyperspectral foundation models.
We pretrain a series of foundation models on SpectralEarth using state-of-the-art self-supervised learning (SSL) algorithms.
We construct four downstream datasets for land-cover and crop-type mapping, providing benchmarks for model evaluation.
arXiv Detail & Related papers (2024-08-15T22:55:59Z) - TimeSieve: Extracting Temporal Dynamics through Information Bottlenecks [31.10683149519954]
We propose an innovative time series forecasting model TimeSieve.
Our approach employs wavelet transforms to preprocess time series data, effectively capturing multi-scale features.
Our results validate the effectiveness of our approach in addressing the key challenges in time series forecasting.
arXiv Detail & Related papers (2024-06-07T15:58:12Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - RPMixer: Shaking Up Time Series Forecasting with Random Projections for Large Spatial-Temporal Data [33.0546525587517]
We propose an all-Multi-Layer Perceptron (all-MLP) time series forecasting architecture called RPMixer.
Our method capitalizes on the ensemble-like behavior of deep neural networks, where each individual block behaves like a base learner in an ensemble model.
arXiv Detail & Related papers (2024-02-16T07:28:59Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - Embedded feature selection in LSTM networks with multi-objective evolutionary ensemble learning for time series forecasting [49.1574468325115]
We present a novel feature selection method embedded in Long Short-Term Memory networks.
Our approach optimizes the weights and biases of the LSTM in a partitioned manner.
Experimental evaluations on air quality time series data from Italy and southeast Spain demonstrate that our method substantially improves the generalization ability of conventional LSTMs.
arXiv Detail & Related papers (2023-12-29T08:42:10Z) - Towards Spatio-Temporal Aware Traffic Time Series Forecasting--Full Version [37.09531298150374]
Traffic time series forecasting is challenging because patterns in the same time series may vary across time; for example, some periods of a day exhibit stronger temporal correlations than others.
Existing spatio-temporal models employ a shared parameter space irrespective of time locations and time periods, assuming that temporal correlations are similar across locations and stable across time, which may not always hold.
We propose a framework that aims at turning spatio-temporal agnostic models into spatio-temporal aware models.
arXiv Detail & Related papers (2022-03-29T16:44:56Z)
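Several of the entries above reduce attention cost by exploiting locality; LAM, for instance, uses the continuity of time series to restrict each query to nearby time steps. A simplified numpy sketch of windowed self-attention (names and parameters are illustrative, and this naive loop is O(n · window) rather than the paper's O(n log n) tensor-algebra formulation):

```python
import numpy as np

def local_attention(q, k, v, window=8):
    """Windowed self-attention: each position attends only to its
    `window` nearest neighbors on each side.

    Scores computed per query drop from n to at most 2*window + 1,
    giving O(n * window) total instead of O(n^2).
    """
    n, d = q.shape
    out = np.empty_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)  # scaled dot-product
        w = np.exp(scores - scores.max())        # stable softmax
        w /= w.sum()
        out[i] = w @ v[lo:hi]                    # weighted value sum
    return out
```

When `window` covers the full sequence, this reduces to ordinary softmax attention, which makes the restriction easy to sanity-check.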
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.