Benchmarking Pre-Trained Time Series Models for Electricity Price Forecasting
- URL: http://arxiv.org/abs/2506.08113v2
- Date: Wed, 20 Aug 2025 07:59:08 GMT
- Title: Benchmarking Pre-Trained Time Series Models for Electricity Price Forecasting
- Authors: Timothée Hornek, Amir Sartipi, Igor Tchappi, Gilbert Fridgen
- Abstract summary: We benchmark several state-of-the-art pretrained models against established statistical and machine learning (ML) methods for electricity price forecasting. Using 2024 day-ahead auction (DAA) electricity prices from Germany, France, the Netherlands, Austria, and Belgium, we generate daily forecasts with a one-day horizon. Chronos-Bolt and Time-MoE emerge as the strongest among the TSFMs, performing on par with traditional models.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate electricity price forecasting (EPF) is crucial for effective decision-making in power trading on the spot market. While recent advances in generative artificial intelligence (GenAI) and pre-trained large language models (LLMs) have inspired the development of numerous time series foundation models (TSFMs) for time series forecasting, their effectiveness in EPF remains uncertain. To address this gap, we benchmark several state-of-the-art pretrained models--Chronos-Bolt, Chronos-T5, TimesFM, Moirai, Time-MoE, and TimeGPT--against established statistical and machine learning (ML) methods for EPF. Using 2024 day-ahead auction (DAA) electricity prices from Germany, France, the Netherlands, Austria, and Belgium, we generate daily forecasts with a one-day horizon. Chronos-Bolt and Time-MoE emerge as the strongest among the TSFMs, performing on par with traditional models. However, the biseasonal MSTL model, which captures daily and weekly seasonality, stands out for its consistent performance across countries and evaluation metrics, with no TSFM statistically outperforming it.
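The biseasonal structure that makes MSTL competitive here can be illustrated with a minimal pure-NumPy sketch. The synthetic prices and the `biseasonal_naive_forecast` helper below are illustrative stand-ins (an hour-of-week seasonal mean capturing both daily and weekly cycles), not the paper's actual MSTL pipeline.

```python
import numpy as np

# Hypothetical hourly day-ahead prices with a daily (24 h) and a
# weekly (168 h) seasonal component plus noise -- a stand-in for
# real DAA data, not the paper's dataset.
rng = np.random.default_rng(0)
hours = np.arange(24 * 7 * 4)  # four weeks of hourly observations
prices = (
    50
    + 10 * np.sin(2 * np.pi * hours / 24)   # daily cycle
    + 5 * np.sin(2 * np.pi * hours / 168)   # weekly cycle
    + rng.normal(0, 2, hours.size)
)

def biseasonal_naive_forecast(y, horizon=24, weekly=168):
    """Forecast each hour as the mean of the same hour-of-week across
    all past weeks -- a crude proxy for a biseasonal (24 h + 168 h)
    seasonal model, since the weekly profile subsumes the daily one."""
    n_weeks = y.size // weekly
    weekly_profile = y[: n_weeks * weekly].reshape(n_weeks, weekly).mean(axis=0)
    start = y.size % weekly  # position of the next hour within the week
    idx = (start + np.arange(horizon)) % weekly
    return weekly_profile[idx]

forecast = biseasonal_naive_forecast(prices)  # one-day-ahead, 24 hourly values
```

A real MSTL fit would instead decompose the series into trend plus separate daily and weekly seasonal components before forecasting the remainder.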
Related papers
- Day-Ahead Electricity Price Forecasting for Volatile Markets Using Foundation Models with Regularization Strategy [2.6110235246137328]
Time series foundation models (TSFMs) have shown strong performance in general time series forecasting tasks. This paper presents a spike regularization strategy and evaluates a wide range of TSFMs against traditional statistical and DL models. Results demonstrate that TSFMs consistently outperform traditional approaches, achieving up to 37.4% improvement in MAPE.
arXiv Detail & Related papers (2026-02-05T08:20:50Z) - Multi-Horizon Electricity Price Forecasting with Deep Learning in the Australian National Electricity Market [2.2951895147679298]
We propose a novel electricity price forecasting (EPF) framework that extends the forecast horizon to multi-day-ahead. Standard DL models deliver superior performance in most regions, while SOTA time series DL models demonstrate greater robustness to forecast horizon extension.
arXiv Detail & Related papers (2026-02-01T11:08:40Z) - Synapse: Adaptive Arbitration of Complementary Expertise in Time Series Foundational Models [50.877082340479085]
We study how different Time Series Foundational Models (TSFMs) exhibit specialized performance profiles across various forecasting settings. We propose Synapse, a novel arbitration framework for TSFMs. Results demonstrate that Synapse consistently outperforms other popular ensembling techniques as well as individual TSFMs.
arXiv Detail & Related papers (2025-11-07T18:01:51Z) - SEMPO: Lightweight Foundation Models for Time Series Forecasting [45.456949943052116]
SEMPO is a lightweight foundation model that requires pretraining on relatively small-scale data, yet exhibits strong general time series forecasting performance. SEMPO comprises two key modules, including 1) an energy-aware SpEctral decomposition module that substantially improves the utilization of pre-training data. Experiments on two large-scale benchmarks covering 16 datasets demonstrate the superior performance of SEMPO in both zero-shot and few-shot forecasting scenarios.
arXiv Detail & Related papers (2025-10-22T15:58:44Z) - MoFE-Time: Mixture of Frequency Domain Experts for Time-Series Forecasting Models [11.374098795890738]
MoFE-Time integrates time and frequency domain features within a Mixture of Experts (MoE) network. MoFE-Time achieves new state-of-the-art performance, reducing MSE and MAE by 6.95% and 6.02% compared to the representative method Time-MoE. Our method achieves outstanding results on this dataset, underscoring the effectiveness of the MoFE-Time model in practical commercial applications.
arXiv Detail & Related papers (2025-07-09T03:00:56Z) - Timeseries Foundation Models for Mobility: A Benchmark Comparison with Traditional and Deep Learning Models [0.0]
This study evaluates the performance of TimeGPT compared to traditional approaches for predicting city-wide mobility timeseries. Results highlight the potential of foundation models for mobility forecasting while also identifying limitations of our experiments.
arXiv Detail & Related papers (2025-03-31T07:20:31Z) - Powerformer: A Transformer with Weighted Causal Attention for Time-series Forecasting [50.298817606660826]
We introduce Powerformer, a novel Transformer variant that replaces noncausal attention weights with causal weights that are reweighted according to a smooth heavy-tailed decay. Our empirical results demonstrate that Powerformer achieves state-of-the-art accuracy on public time-series benchmarks. Our analyses show that the model's locality bias is amplified during training, demonstrating an interplay between time-series data and power-law-based attention.
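The reweighted causal attention described above can be sketched in a few lines of NumPy. The function name and the `alpha` decay exponent are assumptions for illustration; Powerformer's actual parameterization may differ.

```python
import numpy as np

def power_law_causal_attention(scores, alpha=1.0):
    """Mask attention to be causal and reweight each (query, key) pair
    by a heavy-tailed power-law decay (1 + lag)^(-alpha) in time lag,
    then renormalize rows -- a sketch of the Powerformer idea."""
    T = scores.shape[-1]
    lag = np.arange(T)[:, None] - np.arange(T)[None, :]  # query - key index
    # Zero weight for future keys (lag < 0); smooth decay for past keys.
    decay = np.where(lag >= 0, (1.0 + np.maximum(lag, 0)) ** (-alpha), 0.0)
    weighted = np.exp(scores - scores.max(axis=-1, keepdims=True)) * decay
    return weighted / weighted.sum(axis=-1, keepdims=True)

attn = power_law_causal_attention(np.zeros((4, 4)))
```

With uniform scores, each row concentrates mass on recent past positions, which is the locality bias the paper's analysis highlights.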
arXiv Detail & Related papers (2025-02-10T04:42:11Z) - Financial Fine-tuning a Large Time Series Model [1.2894076331861153]
We evaluate the performance of the latest time series foundation model TimesFM on price prediction. We find that due to the irregular nature of price data, directly applying TimesFM gives unsatisfactory results. We propose to fine-tune TimesFM on financial data for the task of price prediction.
arXiv Detail & Related papers (2024-12-13T05:51:00Z) - FlowTS: Time Series Generation via Rectified Flow [67.41208519939626]
FlowTS is an ODE-based model that leverages rectified flow with straight-line transport in probability space. For the unconditional setting, FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets. For the conditional setting, we have achieved superior performance in solar forecasting.
arXiv Detail & Related papers (2024-11-12T03:03:23Z) - BreakGPT: Leveraging Large Language Models for Predicting Asset Price Surges [55.2480439325792]
This paper introduces BreakGPT, a novel large language model (LLM) architecture adapted specifically for time series forecasting and the prediction of sharp upward movements in asset prices.
We showcase BreakGPT as a promising solution for financial forecasting with minimal training and as a strong competitor for capturing both local and global temporal dependencies.
arXiv Detail & Related papers (2024-11-09T05:40:32Z) - ViTime: Foundation Model for Time Series Forecasting Powered by Vision Intelligence [49.60944381032587]
Time series forecasting (TSF) has great practical value in various fields, including power and energy, transportation, etc. TSF models have long been known to be problem-specific and to lack application generalizability. This paper proposes ViTime, a vision-intelligence-powered TSF framework, for the first time.
arXiv Detail & Related papers (2024-07-10T02:11:01Z) - Efficient mid-term forecasting of hourly electricity load using generalized additive models [0.0]
We propose a novel forecasting method using Generalized Additive Models (GAMs) built from interpretable P-splines. The proposed model is evaluated using load data from 24 European countries over more than 9 years.
arXiv Detail & Related papers (2024-05-27T11:41:41Z) - Probabilistic Forecasting of Real-Time Electricity Market Signals via Interpretable Generative AI [41.99446024585741]
We present WIAE-GPF, a Weak Innovation AutoEncoder-based Generative Probabilistic Forecasting architecture.
A novel learning algorithm with structural convergence guarantees is proposed, ensuring that the generated forecast samples match the ground truth conditional probability distribution.
arXiv Detail & Related papers (2024-03-09T00:41:30Z) - ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z) - Probabilistic Forecasting of Day-Ahead Electricity Prices and their Volatility with LSTMs [0.0]
We present a Long Short-Term Memory (LSTM) model for the German-Luxembourg day-ahead electricity prices.
The recurrent structure of the LSTM allows the model to adapt to trends, while the joint prediction of both mean and standard deviation enables a probabilistic prediction.
Using a physics-inspired approach - superstatistics - to derive an explanation for the statistics of prices, we show that the LSTM model faithfully reproduces both prices and their volatility.
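A standard way to realize the joint mean/std prediction described above is to train with a Gaussian negative log-likelihood. The sketch below shows the loss only (the LSTM itself is omitted), and the function name is illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_nll(y, mu, log_sigma):
    """Per-sample negative log-likelihood of y under N(mu, sigma^2).
    Predicting log_sigma rather than sigma keeps the standard
    deviation positive by construction, so one network head can
    output the mean and another the (log) volatility."""
    sigma = np.exp(log_sigma)
    return 0.5 * np.log(2 * np.pi) + log_sigma + 0.5 * ((y - mu) / sigma) ** 2

# The loss is smaller when the predicted mean matches the observed price.
y = np.array([42.0])                                        # observed price
loss_good = gaussian_nll(y, np.array([42.0]), np.array([0.0]))
loss_bad = gaussian_nll(y, np.array([50.0]), np.array([0.0]))
```

Minimizing this loss jointly calibrates the mean and the predicted volatility, which is what enables the probabilistic forecasts the abstract describes.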
arXiv Detail & Related papers (2023-10-05T06:47:28Z) - Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z) - A Hybrid Model for Forecasting Short-Term Electricity Demand [59.372588316558826]
Currently, the UK electricity market is guided by load (demand) forecasts published every thirty minutes by the regulator.
We present HYENA: a hybrid predictive model that combines feature engineering (selection of the candidate predictor features), mobile-window predictors and LSTM encoder-decoders.
arXiv Detail & Related papers (2022-05-20T22:13:25Z) - No MCMC for me: Amortized sampling for fast and stable training of energy-based models [62.1234885852552]
Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty.
We present a simple method for training EBMs at scale using an entropy-regularized generator to amortize the MCMC sampling.
Next, we apply our estimator to the recently proposed Joint Energy Model (JEM), where we match the original performance with faster and stable training.
arXiv Detail & Related papers (2020-10-08T19:17:20Z) - Ensemble Forecasting for Intraday Electricity Prices: Simulating Trajectories [0.0]
Recent studies have shown that the hourly German Intraday Continuous Market is weak-form efficient.
A probabilistic forecasting of the hourly intraday electricity prices is performed by simulating trajectories in every trading window.
The study aims to forecast the price distribution in the German Intraday Continuous Market in the last 3 hours of trading, but the approach allows for application to other continuous markets, especially in Europe.
arXiv Detail & Related papers (2020-05-04T10:21:20Z)
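Simulating trajectories to obtain a price distribution, as in the last entry above, can be sketched as a residual bootstrap. The point forecast, residuals, and `simulate_trajectories` helper are hypothetical stand-ins for the paper's model-based simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical residuals from some fitted hourly price model and a
# point forecast for the next 3 trading hours -- illustrative only.
residuals = rng.normal(0, 3, 500)
point_forecast = 60 + 5 * np.sin(2 * np.pi * np.arange(3) / 24)

def simulate_trajectories(point, resid, n_sim=1000, rng=rng):
    """Simulate price trajectories by adding bootstrapped residuals to
    a point forecast; empirical quantiles of the trajectories then
    approximate the forecast distribution in each trading window."""
    draws = rng.choice(resid, size=(n_sim, point.size), replace=True)
    return point + draws

trajs = simulate_trajectories(point_forecast, residuals)
q10, q90 = np.quantile(trajs, [0.1, 0.9], axis=0)  # 80% prediction band
```

A model-based variant would simulate from fitted dynamics rather than i.i.d. residuals, but the quantile extraction step is the same.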