Fast-Powerformer: A Memory-Efficient Transformer for Accurate Mid-Term Wind Power Forecasting
- URL: http://arxiv.org/abs/2504.10923v1
- Date: Tue, 15 Apr 2025 07:09:54 GMT
- Title: Fast-Powerformer: A Memory-Efficient Transformer for Accurate Mid-Term Wind Power Forecasting
- Authors: Mingyi Zhu, Zhaoxin Li, Qiao Lin, Li Ding
- Abstract summary: Wind power forecasting plays a crucial role in enhancing the security, stability, and economic operation of power grids. Due to the high stochasticity of meteorological factors (e.g., wind speed) and significant fluctuations in wind power output, mid-term wind power forecasting faces a dual challenge of maintaining high accuracy and computational efficiency. This paper proposes an efficient and lightweight mid-term wind power forecasting model, termed Fast-Powerformer.
- Score: 4.2707347040807475
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Wind power forecasting (WPF), as a significant research topic within renewable energy, plays a crucial role in enhancing the security, stability, and economic operation of power grids. However, due to the high stochasticity of meteorological factors (e.g., wind speed) and significant fluctuations in wind power output, mid-term wind power forecasting faces a dual challenge of maintaining high accuracy and computational efficiency. To address these issues, this paper proposes an efficient and lightweight mid-term wind power forecasting model, termed Fast-Powerformer. The proposed model is built upon the Reformer architecture, incorporating structural enhancements such as a lightweight Long Short-Term Memory (LSTM) embedding module, an input transposition mechanism, and a Frequency Enhanced Channel Attention Mechanism (FECAM). These improvements enable the model to strengthen temporal feature extraction, optimize dependency modeling across variables, significantly reduce computational complexity, and enhance sensitivity to periodic patterns and dominant frequency components. Experimental results conducted on multiple real-world wind farm datasets demonstrate that the proposed Fast-Powerformer achieves superior prediction accuracy and operational efficiency compared to mainstream forecasting approaches. Furthermore, the model exhibits fast inference speed and low memory consumption, highlighting its considerable practical value for real-world deployment scenarios.
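The Frequency Enhanced Channel Attention Mechanism (FECAM) mentioned in the abstract can be illustrated with a minimal sketch: score each channel by its frequency-domain energy and reweight channels with a softmax over those scores. This is a hedged NumPy illustration of the general idea, not the paper's implementation; the function name, the use of the real FFT (FECAM-style blocks often use a DCT basis), and the temperature parameter are all assumptions.

```python
import numpy as np

def frequency_channel_attention(x, temperature=1.0):
    """Reweight channels by their frequency-domain energy.

    Illustrative sketch of a FECAM-style block (assumed form, not the
    paper's exact design). x: array of shape (seq_len, n_channels).
    Returns the channel-reweighted series, same shape.
    """
    # Per-channel spectrum via the real FFT; drop the DC bin so a
    # constant offset does not dominate the attention scores.
    spectrum = np.abs(np.fft.rfft(x, axis=0))[1:]   # (n_freqs, n_channels)
    energy = spectrum.sum(axis=0)                   # (n_channels,)
    # Softmax over channels turns frequency energy into attention weights.
    logits = energy / temperature
    logits -= logits.max()                          # numerical stability
    weights = np.exp(logits) / np.exp(logits).sum()
    return x * weights[None, :]

# Toy example: a strongly periodic channel vs. low-amplitude noise.
t = np.arange(96)
x = np.stack([np.sin(2 * np.pi * t / 24),           # daily cycle
              0.01 * np.random.randn(96)], axis=1)
y = frequency_channel_attention(x)
```

With this toy input, the periodic channel carries almost all the spectral energy, so the softmax concentrates the attention weight on it and suppresses the noise channel.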
Related papers
- Powerformer: A Transformer with Weighted Causal Attention for Time-series Forecasting [50.298817606660826]
We introduce Powerformer, a novel Transformer variant that replaces noncausal attention weights with causal weights that are reweighted according to a smooth heavy-tailed decay. Our empirical results demonstrate that Powerformer achieves state-of-the-art accuracy on public time-series benchmarks. Our analyses show that the model's locality bias is amplified during training, demonstrating an interplay between time-series data and power-law-based attention.
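The weighted causal attention described above can be sketched as standard scaled dot-product attention whose softmax weights are multiplied by a smooth power-law decay in the time lag and renormalized. The `(1 + lag)^(-alpha)` decay form and the exponent value below are illustrative assumptions, not Powerformer's exact parameterization.

```python
import numpy as np

def powerlaw_causal_attention(q, k, v, alpha=1.0):
    """Causal attention with heavy-tailed time-decayed weights.

    Hedged sketch of power-law-reweighted causal attention: softmax
    scores are multiplied by (1 + lag)^(-alpha) and renormalized.
    q, k, v: arrays of shape (seq_len, d).
    """
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)                                # (T, T)
    lag = np.arange(seq_len)[:, None] - np.arange(seq_len)[None, :]
    causal = lag >= 0                                            # past only
    scores = np.where(causal, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    # Heavy-tailed decay in the time lag, zeroed on future positions.
    weights *= np.where(causal, (1.0 + np.maximum(lag, 0)) ** -alpha, 0.0)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((8, 4))
k = rng.standard_normal((8, 4))
v = rng.standard_normal((8, 4))
out = powerlaw_causal_attention(q, k, v, alpha=1.5)
```

Because the mask is causal, the first time step can only attend to itself, so its output equals its own value vector; larger `alpha` biases every step more strongly toward recent positions.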
arXiv Detail & Related papers (2025-02-10T04:42:11Z)
- Enhanced Photovoltaic Power Forecasting: An iTransformer and LSTM-Based Model Integrating Temporal and Covariate Interactions [16.705621552594643]
Existing models often struggle with capturing the complex relationships between target variables and covariates. We propose a novel model architecture that leverages the iTransformer for feature extraction from target variables. A cross-attention mechanism is integrated to fuse the outputs of both models, followed by a Kolmogorov-Arnold network mapping. Results demonstrate that the proposed model effectively captures seasonal variations in PV power generation and improves forecasting accuracy.
arXiv Detail & Related papers (2024-12-03T09:16:13Z)
- Towards Stabilized and Efficient Diffusion Transformers through Long-Skip-Connections with Spectral Constraints [51.83081671798784]
Diffusion Transformers (DiT) have emerged as a powerful architecture for image and video generation, offering superior quality and scalability. DiT's practical application suffers from inherent dynamic feature instability, leading to error amplification during cached inference. We propose Skip-DiT, a novel DiT variant enhanced with Long-Skip-Connections (LSCs) - the key efficiency component in U-Nets.
arXiv Detail & Related papers (2024-11-26T17:28:10Z)
- Hiformer: Hybrid Frequency Feature Enhancement Inverted Transformer for Long-Term Wind Power Prediction [6.936415534357298]
We propose a novel approach called Hybrid Frequency Feature Enhancement Inverted Transformer (Hiformer)
Hiformer integrates signal decomposition technology with weather feature extraction technique to enhance the modeling of correlations between meteorological conditions and wind power generation.
Compared to the state-of-the-art methods, Hiformer: (i) can improve the prediction accuracy by up to 52.5%; and (ii) can reduce computational time by up to 68.5%.
arXiv Detail & Related papers (2024-10-17T08:00:36Z)
- Parsimony or Capability? Decomposition Delivers Both in Long-term Time Series Forecasting [46.63798583414426]
Long-term time series forecasting (LTSF) represents a critical frontier in time series analysis.
Our study demonstrates, through both analytical and empirical evidence, that decomposition is key to containing excessive model inflation.
Remarkably, by tailoring decomposition to the intrinsic dynamics of time series data, our proposed model outperforms existing benchmarks.
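The decomposition this entry credits with containing model inflation is often a trend/seasonal split. A minimal, generic sketch is a moving-average decomposition, shown below; the paper's own decomposition tailored to the series' intrinsic dynamics may be more elaborate, and the window size here is an assumption.

```python
import numpy as np

def decompose(x, window=24):
    """Split a 1-D series into trend and seasonal/residual parts.

    Generic moving-average decomposition as commonly used in
    decomposition-based long-term forecasters (illustrative sketch,
    not the paper's specific design).
    """
    # Edge-pad so the moving average keeps the original length.
    pad = window // 2
    padded = np.pad(x, (pad, window - 1 - pad), mode="edge")
    kernel = np.ones(window) / window
    trend = np.convolve(padded, kernel, mode="valid")
    return trend, x - trend

t = np.arange(240)
series = 0.05 * t + np.sin(2 * np.pi * t / 24)   # linear trend + daily cycle
trend, seasonal = decompose(series, window=24)
```

By construction the two parts sum back to the original series; the forecaster can then model the smooth trend and the periodic residual with separate, smaller components.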
arXiv Detail & Related papers (2024-01-22T13:15:40Z)
- Global Transformer Architecture for Indoor Room Temperature Forecasting [49.32130498861987]
This work presents a global Transformer architecture for indoor temperature forecasting in multi-room buildings.
It aims at optimizing energy consumption and reducing greenhouse gas emissions associated with HVAC systems.
Notably, this study is the first to apply a Transformer architecture for indoor temperature forecasting in multi-room buildings.
arXiv Detail & Related papers (2023-10-31T14:09:32Z)
- Long-term Wind Power Forecasting with Hierarchical Spatial-Temporal Transformer [112.12271800369741]
Wind power is attracting increasing attention around the world due to its renewable, pollution-free, and other advantages.
Accurate wind power forecasting (WPF) can effectively reduce power fluctuations in power system operations.
Existing methods are mainly designed for short-term predictions and lack effective spatial-temporal feature augmentation.
arXiv Detail & Related papers (2023-05-30T04:03:15Z)
- Enhancing Short-Term Wind Speed Forecasting using Graph Attention and Frequency-Enhanced Mechanisms [17.901334082943077]
GFST-WSF comprises a Transformer architecture for temporal feature extraction and a Graph Attention Network (GAT) for spatial feature extraction.
GAT is specifically designed to capture the complex spatial dependencies among wind speed stations.
The model accounts for the time lag in wind speed correlations between adjacent wind farms caused by geographical factors.
arXiv Detail & Related papers (2023-05-19T08:50:58Z)
- A novel automatic wind power prediction framework based on multi-time scale and temporal attention mechanisms [6.120692237856329]
Wind power generation is characterized by volatility, intermittence, and randomness.
Traditional wind power forecasting systems primarily focus on ultra-short-term or short-term forecasts.
We propose an automatic framework capable of forecasting wind power across multiple time scales.
arXiv Detail & Related papers (2023-02-02T17:03:08Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but their self-attention mechanism incurs high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Augmented Shortcuts for Vision Transformers [49.70151144700589]
We study the relationship between shortcuts and feature diversity in vision transformer models.
We present an augmented shortcut scheme, which inserts additional paths with learnable parameters in parallel on the original shortcuts.
Experiments conducted on benchmark datasets demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2021-06-30T09:48:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.