Spatio-Temporal Wind Speed Forecasting using Graph Networks and Novel
Transformer Architectures
- URL: http://arxiv.org/abs/2208.13585v1
- Date: Mon, 29 Aug 2022 13:26:20 GMT
- Title: Spatio-Temporal Wind Speed Forecasting using Graph Networks and Novel
Transformer Architectures
- Authors: Lars Ødegaard Bentsen, Narada Dilp Warakagoda, Roy Stenbro, Paal Engelstad
- Abstract summary: This study focuses on multi-step spatio-temporal wind speed forecasting for the Norwegian continental shelf.
A graph neural network (GNN) architecture was used to extract spatial dependencies, with different update functions to learn temporal correlations.
This is the first time the LogSparse Transformer and Autoformer have been applied to wind forecasting.
- Score: 1.278093617645299
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To improve the security and reliability of wind energy production, short-term
forecasting has become of utmost importance. This study focuses on multi-step
spatio-temporal wind speed forecasting for the Norwegian continental shelf. A
graph neural network (GNN) architecture was used to extract spatial
dependencies, with different update functions to learn temporal correlations.
These update functions were implemented using different neural network
architectures. One such architecture, the Transformer, has become increasingly
popular for sequence modelling in recent years. Various alterations of the
original architecture have been proposed to better facilitate time-series
forecasting, of which this study focused on the Informer, LogSparse Transformer
and Autoformer. This is the first time the LogSparse Transformer and Autoformer
have been applied to wind forecasting and the first time any of these or the
Informer have been formulated in a spatio-temporal setting for wind
forecasting. By comparing against spatio-temporal Long Short-Term Memory (LSTM)
and Multi-Layer Perceptron (MLP) models, the study showed that the models using
the altered Transformer architectures as update functions in GNNs were able to
outperform these. Furthermore, we propose the Fast Fourier Transformer
(FFTransformer), which is a novel Transformer architecture based on signal
decomposition and consists of two separate streams that analyse trend and
periodic components separately. The FFTransformer and Autoformer were found to
achieve superior results for the 10-minute and 1-hour ahead forecasts, with the
FFTransformer significantly outperforming all other models for the 4-hour ahead
forecasts. Finally, by varying the degree of connectivity for the graph
representations, the study explicitly demonstrates how all models were able to
leverage spatial dependencies to improve local short-term wind speed
forecasting.
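The two-stream idea behind the FFTransformer — separating a signal into trend and periodic components before modelling each stream — can be illustrated with a minimal FFT-based decomposition. This is a sketch only: the top-k frequency selection below (`n_periodic`) is an assumed simplification, not the paper's exact scheme, and in the FFTransformer each stream feeds a separate Transformer branch rather than being returned directly.

```python
import numpy as np

def fft_decompose(x, n_periodic=3):
    """Split a 1-D signal into periodic and trend components by keeping
    the n_periodic largest-magnitude non-DC frequencies (illustrative;
    the FFTransformer's actual decomposition may differ)."""
    freq = np.fft.rfft(x)
    mag = np.abs(freq)
    mag[0] = 0.0                          # exclude the DC term from selection
    keep = np.argsort(mag)[-n_periodic:]  # indices of dominant frequencies
    periodic_freq = np.zeros_like(freq)
    periodic_freq[keep] = freq[keep]
    periodic = np.fft.irfft(periodic_freq, n=len(x))
    trend = x - periodic                  # residual slow-moving component
    return trend, periodic

# Toy wind-speed-like series: slow drift plus a periodic oscillation.
t = np.arange(256)
x = 0.01 * t + np.sin(2 * np.pi * t / 32)
trend, periodic = fft_decompose(x)
```

By construction the two streams recompose the input exactly (`trend + periodic == x`), so the decomposition loses no information; it only routes different frequency content to different model branches.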
Related papers
- PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
Self-attention mechanism in Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating PRE with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
arXiv Detail & Related papers (2024-08-20T01:56:07Z)
- Are Self-Attentions Effective for Time Series Forecasting? [4.990206466948269]
Time series forecasting is crucial for applications across multiple domains and various scenarios.
Recent findings have indicated that simpler linear models might outperform complex Transformer-based approaches.
We introduce a new architecture, Cross-Attention-only Time Series transformer (CATS)
Our model achieves superior performance with the lowest mean squared error and uses fewer parameters compared to existing models.
arXiv Detail & Related papers (2024-05-27T06:49:39Z)
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z)
- iTransformer: Inverted Transformers Are Effective for Time Series Forecasting [62.40166958002558]
We propose iTransformer, which simply applies the attention and feed-forward network on the inverted dimensions.
The iTransformer model achieves state-of-the-art on challenging real-world datasets.
arXiv Detail & Related papers (2023-10-10T13:44:09Z)
- Long-term Wind Power Forecasting with Hierarchical Spatial-Temporal Transformer [112.12271800369741]
Wind power is attracting increasing attention around the world due to its renewable, pollution-free, and other advantages.
Accurate wind power forecasting (WPF) can effectively reduce power fluctuations in power system operations.
Existing methods are mainly designed for short-term predictions and lack effective spatial-temporal feature augmentation.
arXiv Detail & Related papers (2023-05-30T04:03:15Z)
- CARD: Channel Aligned Robust Blend Transformer for Time Series Forecasting [50.23240107430597]
We design a special Transformer, i.e., Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI type Transformer in time series forecasting.
First, CARD introduces a channel-aligned attention structure that allows it to capture temporal correlations among signals.
Second, in order to efficiently utilize the multi-scale knowledge, we design a token blend module to generate tokens with different resolutions.
Third, we introduce a robust loss function for time series forecasting to alleviate the potential overfitting issue.
arXiv Detail & Related papers (2023-05-20T05:16:31Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, despite the high computational cost of the self-attention mechanism.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Multi-Step Short-Term Wind Speed Prediction with Rank Pooling and Fast Fourier Transformation [0.0]
Short-term wind speed prediction is essential for economical wind power utilization.
The real-world wind speed data is typically intermittent and fluctuating, presenting great challenges to existing shallow models.
We present a novel deep hybrid model for multi-step wind speed prediction, namely LR-FFT-RP-MLP/LSTM.
arXiv Detail & Related papers (2022-11-23T14:02:52Z)
- Are Transformers Effective for Time Series Forecasting? [13.268196448051308]
Recently, there has been a surge of Transformer-based solutions for the time series forecasting (TSF) task.
This study investigates whether Transformer-based techniques are the right solutions for long-term time series forecasting.
We find that the relatively higher long-term forecasting accuracy of Transformer-based solutions has little to do with the temporal relation extraction capabilities of the Transformer architecture.
arXiv Detail & Related papers (2022-05-26T17:17:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.