NBMLSS: probabilistic forecasting of electricity prices via Neural Basis Models for Location Scale and Shape
- URL: http://arxiv.org/abs/2411.13921v1
- Date: Thu, 21 Nov 2024 08:17:53 GMT
- Title: NBMLSS: probabilistic forecasting of electricity prices via Neural Basis Models for Location Scale and Shape
- Authors: Alessandro Brusaferri, Danial Ramin, Andrea Ballarino
- Abstract summary: We deploy a Neural Basis Model for Location, Scale and Shape that blends the principled interpretability of GAMLSS with a computationally scalable shared basis decomposition.
Experiments have been conducted on multiple market regions, achieving probabilistic forecasting performance comparable to that of distributional neural networks.
- Abstract: Forecasters using flexible neural networks (NNs) in multi-horizon distributional regression setups often struggle to gain detailed insight into the mechanisms that produce the predicted feature-conditioned distribution parameters. In this work, we deploy a Neural Basis Model for Location, Scale and Shape that blends the principled interpretability of GAMLSS with a computationally scalable shared basis decomposition, combined through linear projections that support dedicated stepwise and parameter-wise aggregations of feature shape functions. Experiments conducted on multiple market regions achieve probabilistic forecasting performance comparable to that of distributional neural networks, while providing more insight into model behavior through the learned nonlinear feature-level maps to the distribution parameters across the prediction steps.
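To make the architecture described in the abstract concrete, the following is a minimal sketch of the neural-basis idea: one shared basis network is applied to every scalar feature, and dedicated per-feature, per-parameter linear projections aggregate the basis outputs into additive, GAMLSS-style shape functions. The class name, layer sizes, and the Gaussian output head are illustrative assumptions, not the authors' implementation (which also maintains stepwise projections across the multi-horizon grid).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralBasisLSS(nn.Module):
    """Sketch of a neural basis model for distributional regression.

    One shared basis network is applied to every scalar feature; dedicated
    per-feature, per-parameter linear projections then aggregate the basis
    outputs into additive, GAMLSS-style shape-function contributions.
    """

    def __init__(self, n_features: int, n_basis: int = 32, n_params: int = 2):
        super().__init__()
        # Shared basis: one small MLP applied to each feature independently.
        self.basis = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, n_basis))
        # proj[f, k, p]: weight of basis k of feature f for distribution parameter p.
        self.proj = nn.Parameter(0.01 * torch.randn(n_features, n_basis, n_params))
        self.bias = nn.Parameter(torch.zeros(n_params))

    def forward(self, x):                          # x: (batch, n_features)
        b = self.basis(x.unsqueeze(-1))            # (batch, n_features, n_basis)
        contrib = torch.einsum("bfk,fkp->bfp", b, self.proj)
        theta = contrib.sum(dim=1) + self.bias     # additive GAM-style aggregation
        loc = theta[:, 0]
        scale = F.softplus(theta[:, 1]) + 1e-4     # keep the scale positive
        return loc, scale, contrib                 # contrib holds the shape functions
```

Training would minimize the Gaussian negative log-likelihood of (loc, scale); plotting contrib[:, j, p] against feature j recovers the learned shape function feeding parameter p, which is the kind of feature-level insight the abstract refers to.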
Related papers
- Nonuniform random feature models using derivative information [10.239175197655266]
We propose nonuniform data-driven parameter distributions for neural network initialization based on derivative data of the function to be approximated.
We address the cases of Heaviside and ReLU activation functions, and their smooth approximations (sigmoid and softplus).
We suggest simplifications of these exact densities, based on approximate derivative data at the input points, that allow for very efficient sampling and bring random feature models close to the performance of optimal networks in several scenarios; a sketch of the random-feature setup follows this entry.
arXiv Detail & Related papers (2024-10-03T01:30:13Z)
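A minimal sketch of the random feature setup in the entry above, under illustrative assumptions: the hidden weights are sampled rather than trained, and only the outer linear layer is fit by least squares. The default Gaussian sampler below is a placeholder for the paper's derivative-informed nonuniform densities.

```python
import numpy as np

def fit_random_feature_model(X, y, n_features=200, sampler=None, rng=None):
    """Fit a random feature model: hidden weights (W, b) are *sampled*, not
    trained; only the outer weights a are fit by least squares.

    `sampler` draws the (W, b) pairs; a nonuniform, data-driven sampler
    (e.g. one informed by derivative estimates of the target, as in the
    paper) can be plugged in here.
    """
    rng = rng or np.random.default_rng(0)
    d = X.shape[1]
    if sampler is None:
        W = rng.normal(size=(n_features, d))       # placeholder: plain Gaussian
        b = rng.uniform(-1.0, 1.0, size=n_features)
    else:
        W, b = sampler(n_features, rng)
    H = np.maximum(X @ W.T + b, 0.0)               # ReLU features, (n, n_features)
    a, *_ = np.linalg.lstsq(H, y, rcond=None)      # fit outer weights only
    return lambda Xnew: np.maximum(Xnew @ W.T + b, 0.0) @ a
```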
- Towards Generalizable and Interpretable Motion Prediction: A Deep Variational Bayes Approach [54.429396802848224]
This paper proposes an interpretable generative model for motion prediction with robust generalizability to out-of-distribution cases.
For interpretability, the model achieves the target-driven motion prediction by estimating the spatial distribution of long-term destinations.
Experiments on motion prediction datasets validate that the fitted model can be interpretable and generalizable.
arXiv Detail & Related papers (2024-03-10T04:16:04Z)
- Function-Space Regularization in Neural Networks: A Probabilistic Perspective [51.133793272222874]
We show that we can derive a well-motivated regularization technique that allows explicitly encoding information about desired predictive functions into neural network training.
We evaluate the utility of this regularization technique empirically and demonstrate that the proposed method leads to near-perfect semantic shift detection and highly-calibrated predictive uncertainty estimates.
arXiv Detail & Related papers (2023-12-28T17:50:56Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting nonstationary processes or complex mixtures of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple-hypothesis predictors for regression problems.
It is proved that this structured model can efficiently interpolate the underlying tessellation and approximate the multiple-hypothesis target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- Neural networks for geospatial data [0.0]
NN-GLS is a new neural network estimation algorithm for the non-linear mean in GP models.
We show that NN-GLS admits a representation as a special type of graph neural network (GNN).
Theoretically, we show that NN-GLS will be consistent for irregularly observed spatially correlated data processes.
arXiv Detail & Related papers (2023-04-18T17:52:23Z)
- Variational EP with Probabilistic Backpropagation for Bayesian Neural Networks [0.0]
I propose a novel approach for nonlinear logistic regression using a two-layer neural network (NN) model structure with hierarchical priors on the network weights.
I derive a computationally efficient algorithm, whose complexity scales similarly to an ensemble of independent sparse logistic models.
arXiv Detail & Related papers (2023-03-02T19:09:47Z)
- Deep equilibrium models as estimators for continuous latent variables [10.244213671349225]
We show explicit relationships between neural network architectures and statistical models.
We find that deep equilibrium models compute maximum a posteriori (MAP) estimates for the latents and parameters of the transformation.
Our DEQ feature maps are end-to-end differentiable, enabling fine-tuning for downstream tasks; a fixed-point sketch follows this entry.
arXiv Detail & Related papers (2022-11-11T01:21:34Z)
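To make the deep-equilibrium connection above concrete, here is a minimal sketch, assuming a simple tanh update and plain forward iteration; practical DEQs use root solvers and implicit differentiation, and the MAP reading of the fixed point is the paper's claim, not something visible in this toy code.

```python
import torch
import torch.nn as nn

class DEQLayer(nn.Module):
    """Sketch of a deep equilibrium layer: the output is the fixed point
    z* = f(z*, x), found here by plain forward iteration."""

    def __init__(self, dim: int):
        super().__init__()
        self.lin_z = nn.Linear(dim, dim)
        self.lin_x = nn.Linear(dim, dim)

    def f(self, z, x):
        return torch.tanh(self.lin_z(z) + self.lin_x(x))

    def forward(self, x, n_iter: int = 50):
        z = torch.zeros_like(x)
        for _ in range(n_iter):
            z = self.f(z, x)   # iterate toward the (approximate) equilibrium
        return z               # z* plays the role of the inferred latent
```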
- Correcting Model Bias with Sparse Implicit Processes [0.9187159782788579]
We show that Sparse Implicit Processes (SIP) is capable of correcting model bias when the data-generating mechanism differs strongly from the one implied by the model.
We use synthetic datasets to show that SIP provides predictive distributions that reflect the data better than the exact predictions of the initial, wrongly assumed model.
arXiv Detail & Related papers (2022-07-21T18:00:01Z)
- Probabilistic electric load forecasting through Bayesian Mixture Density Networks [70.50488907591463]
Probabilistic load forecasting (PLF) is a key component in the extended tool-chain required for efficient management of smart energy grids.
We propose a novel PLF approach, framed on Bayesian Mixture Density Networks.
To achieve reliable and computationally scalable estimators of the posterior distributions, both mean-field variational inference and deep ensembles are integrated; a mixture-head sketch follows this entry.
arXiv Detail & Related papers (2020-12-23T16:21:34Z)
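As context for the entry above, a minimal sketch of a Gaussian mixture density head, under illustrative assumptions; the Bayesian machinery the paper adds (mean-field variational inference or deep ensembles over the weights) is noted only in comments.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MDN(nn.Module):
    """Sketch of a Gaussian mixture density network for load forecasting.
    The Bayesian treatment (VI / deep ensembles over the weights) is omitted;
    this is just the mixture output head and its negative log-likelihood."""

    def __init__(self, n_in: int, n_hidden: int = 64, n_comp: int = 3):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU())
        self.head = nn.Linear(n_hidden, 3 * n_comp)  # logits, means, log-scales

    def nll(self, x, y):
        h = self.head(self.body(x))
        logits, mu, log_sigma = h.chunk(3, dim=-1)
        log_w = F.log_softmax(logits, dim=-1)
        comp = torch.distributions.Normal(mu, log_sigma.exp())
        # -log sum_k w_k N(y | mu_k, sigma_k), averaged over the batch
        return -torch.logsumexp(log_w + comp.log_prob(y.unsqueeze(-1)), dim=-1).mean()
```

A deep-ensemble variant would simply train several such networks from different random initializations and average their predictive mixtures.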
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks by gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting; a min-max training sketch follows this entry.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
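The min-max formulation above can be sketched as follows, assuming a scalar instrumental-variable-style conditional moment restriction E[y - f(x) | z] = 0; the objective with the 0.5 * E[g(z)^2] penalty is one standard adversarial-moment variant, not necessarily the exact game in the paper.

```python
import torch
import torch.nn as nn

# f is the structural model, g an adversarial test-function network;
# we solve min_f max_g E[g(z)(y - f(x))] - 0.5 * E[g(z)^2] by alternating
# gradient steps, as in the min-max game described above.
f = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
g = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt_f = torch.optim.Adam(f.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(g.parameters(), lr=1e-3)

def step(x, y, z):
    # ascent step for the adversary g (maximize the penalized moment)
    moment = g(z) * (y - f(x))
    loss_g = -(moment.mean() - 0.5 * g(z).pow(2).mean())
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    # descent step for the model f (minimize the violated moment)
    loss_f = (g(z) * (y - f(x))).mean()
    opt_f.zero_grad(); loss_f.backward(); opt_f.step()

# one illustrative update on synthetic data with instrument z
x = torch.randn(256, 1); z = torch.randn(256, 1)
y = 2.0 * x + 0.1 * torch.randn(256, 1)
step(x, y, z)
```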