State-Space Models Win the IEEE DataPort Competition on Post-covid
Day-ahead Electricity Load Forecasting
- URL: http://arxiv.org/abs/2110.00334v1
- Date: Fri, 1 Oct 2021 11:57:37 GMT
- Title: State-Space Models Win the IEEE DataPort Competition on Post-covid
Day-ahead Electricity Load Forecasting
- Authors: Joseph de Vilmarest (LPSM, EDF R&D OSIRIS), Yannig Goude (LMO, EDF R&D
OSIRIS)
- Abstract summary: We present the winning strategy of an electricity demand forecasting competition.
This competition was organized to design new forecasting methods for unstable periods such as the one starting in Spring 2020.
We rely on state-space models to adapt standard statistical and machine learning models.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present the winning strategy of an electricity demand forecasting
competition. This competition was organized to design new forecasting methods
for unstable periods such as the one starting in Spring 2020. We rely on
state-space models to adapt standard statistical and machine learning models.
We claim that this approach achieves the right compromise between two extremes. On
the one hand, purely time-series models such as autoregressive models are adaptive
in essence but fail to capture dependence on exogenous variables. On the other
hand, machine learning methods can learn complex dependence on explanatory
variables from a historical data set but fail to forecast non-stationary data
accurately. The evaluation period of the competition was an occasion for trial
and error, and we focus on the final forecasting procedure. In particular, a
recent algorithm to adapt the variances of a state-space model was designed
during that period, and we present the results of the final version only. We
nonetheless discuss the day-to-day predictions.
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Learning Augmentation Policies from A Model Zoo for Time Series Forecasting [58.66211334969299]
We introduce AutoTSAug, a learnable data augmentation method based on reinforcement learning.
By augmenting the marginal samples with a learnable policy, AutoTSAug substantially improves forecasting performance.
arXiv Detail & Related papers (2024-09-10T07:34:19Z)
- A non-intrusive machine learning framework for debiasing long-time coarse resolution climate simulations and quantifying rare events statistics [0.0]
Coarse models suffer from inherent bias due to the ignored "sub-grid" scales.
We propose a framework to non-intrusively debias coarse-resolution climate predictions using neural-network (NN) correction operators.
arXiv Detail & Related papers (2024-02-28T17:06:19Z)
- Attention-Based Ensemble Pooling for Time Series Forecasting [55.2480439325792]
We propose a method for pooling that performs a weighted average over candidate model forecasts.
We test this method on two time-series forecasting problems: multi-step forecasting of the dynamics of the non-stationary Lorenz 63 equation, and one-step forecasting of the weekly incident deaths due to COVID-19.
arXiv Detail & Related papers (2023-10-24T22:59:56Z)
- Learning Sample Difficulty from Pre-trained Models for Reliable Prediction [55.77136037458667]
We propose to utilize large-scale pre-trained models to guide downstream model training with sample difficulty-aware entropy regularization.
We simultaneously improve accuracy and uncertainty calibration across challenging benchmarks.
arXiv Detail & Related papers (2023-04-20T07:29:23Z)
- Frugal day-ahead forecasting of multiple local electricity loads by aggregating adaptive models [0.0]
We focus on day-ahead electricity load forecasting of substations of the distribution network in France.
We develop a frugal variant, reducing the number of parameters estimated, to achieve transfer learning.
We highlight the interpretability of the models, which is important for operational applications.
arXiv Detail & Related papers (2023-02-16T10:17:19Z)
- Conformal prediction for the design problem [72.14982816083297]
In many real-world deployments of machine learning, we use a prediction algorithm to choose what data to test next.
In such settings, there is a distinct type of distribution shift between the training and test data.
We introduce a method to quantify predictive uncertainty in such settings.
arXiv Detail & Related papers (2022-02-08T02:59:12Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of the future based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Learning Prediction Intervals for Model Performance [1.433758865948252]
We propose a method to compute prediction intervals for model performance.
We evaluate our approach across a wide range of drift conditions and show substantial improvement over competitive baselines.
arXiv Detail & Related papers (2020-12-15T21:32:03Z)
- Robust Validation: Confident Predictions Even When Distributions Shift [19.327409270934474]
We describe procedures for robust predictive inference, where a model provides uncertainty estimates on its predictions rather than point predictions.
We present a method that produces prediction sets (almost exactly) giving the right coverage level for any test distribution in an $f$-divergence ball around the training population.
An essential component of our methodology is to estimate the amount of expected future data shift and build robustness to it.
arXiv Detail & Related papers (2020-08-10T17:09:16Z)
- Drift-Adjusted And Arbitrated Ensemble Framework For Time Series Forecasting [0.491574468325115]
Time series forecasting is a challenging problem due to the complex and evolving nature of time series data.
No one method is universally effective for all kinds of time series data.
We propose a re-weighting based method to adjust the assigned weights to various forecasters in order to account for such distribution-drift.
arXiv Detail & Related papers (2020-03-16T10:21:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.