Probabilistic Forecasting with Generative Networks via Scoring Rule
Minimization
- URL: http://arxiv.org/abs/2112.08217v3
- Date: Tue, 13 Feb 2024 15:27:28 GMT
- Title: Probabilistic Forecasting with Generative Networks via Scoring Rule
Minimization
- Authors: Lorenzo Pacchiardi, Rilwan Adewoyin, Peter Dueben, Ritabrata Dutta
- Abstract summary: We use generative neural networks to parametrize distributions on high-dimensional spaces by transforming draws from a latent variable.
We train generative networks to minimize a predictive-sequential (or prequential) scoring rule on a recorded temporal sequence of the phenomenon of interest.
Our method outperforms state-of-the-art adversarial approaches, especially in probabilistic calibration.
- Score: 5.5643498845134545
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic forecasting relies on past observations to provide a
probability distribution for a future outcome, which is often evaluated against
the realization using a scoring rule. Here, we perform probabilistic
forecasting with generative neural networks, which parametrize distributions on
high-dimensional spaces by transforming draws from a latent variable.
Generative networks are typically trained in an adversarial framework. In
contrast, we propose to train generative networks to minimize a
predictive-sequential (or prequential) scoring rule on a recorded temporal
sequence of the phenomenon of interest, which is appealing as it corresponds to
the way forecasting systems are routinely evaluated. Adversarial-free
minimization is possible for some scoring rules; hence, our framework avoids
the cumbersome hyperparameter tuning and uncertainty underestimation due to
unstable adversarial training, thus unlocking reliable use of generative
networks in probabilistic forecasting. Further, we prove consistency of the
minimizer of our objective with dependent data, while adversarial training
assumes independence. We perform simulation studies on two chaotic dynamical
models and a benchmark data set of global weather observations; for this last
example, we define scoring rules for spatial data by drawing from the relevant
literature. Our method outperforms state-of-the-art adversarial approaches,
especially in probabilistic calibration, while requiring less hyperparameter
tuning.
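The objective can be illustrated with the energy score, one of the scoring rules that admits adversarial-free minimization. Below is a minimal numpy sketch of its sample-based estimator; the function name and the unbiased pairwise normalization are illustrative choices, not the paper's code:

```python
import numpy as np

def energy_score(samples, obs):
    """Sample estimate of the energy score
    ES(P, y) = E||X - y|| - 0.5 * E||X - X'||,
    where X, X' are independent forecast draws from P and y is the
    realization. Lower is better, and the score is proper."""
    samples = np.asarray(samples, dtype=float)
    obs = np.asarray(obs, dtype=float)
    m = len(samples)
    # distance of each forecast draw to the realization
    term1 = np.mean(np.linalg.norm(samples - obs, axis=1))
    # pairwise distances between draws (diagonal terms are zero,
    # so dividing by m*(m-1) gives the unbiased estimate)
    diffs = samples[:, None, :] - samples[None, :, :]
    term2 = np.sum(np.linalg.norm(diffs, axis=2)) / (m * (m - 1))
    return term1 - 0.5 * term2
```

Because the score is proper, averaging it along the recorded temporal sequence (the prequential objective) and minimizing over the generator's parameters requires no discriminator.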
Related papers
- Tackling Missing Values in Probabilistic Wind Power Forecasting: A
Generative Approach [1.384633930654651]
We propose treating missing values and forecasting targets indifferently and predicting all unknown values simultaneously.
Compared with the traditional "impute, then predict" pipeline, the proposed approach achieves better performance in terms of continuous ranked probability score.
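The continuous ranked probability score used for evaluation can itself be estimated directly from ensemble draws, as the univariate special case of the energy score. A hypothetical sketch (the naive pairwise mean is slightly biased for small ensembles):

```python
import numpy as np

def crps_empirical(samples, y):
    """Sample-based CRPS estimate:
    CRPS(P, y) ~= mean|x_i - y| - 0.5 * mean|x_i - x_j|."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - y))
    term2 = np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - 0.5 * term2
```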
arXiv Detail & Related papers (2024-03-06T11:38:08Z)
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Creating Probabilistic Forecasts from Arbitrary Deterministic Forecasts using Conditional Invertible Neural Networks [0.19573380763700712]
We use a conditional Invertible Neural Network (cINN) to learn the underlying distribution of the data and then combine the uncertainty from this distribution with an arbitrary deterministic forecast.
Our approach enables the simple creation of probabilistic forecasts without complicated statistical loss functions or further assumptions.
arXiv Detail & Related papers (2023-02-03T15:11:39Z)
- Autoregressive Quantile Flows for Predictive Uncertainty Estimation [7.184701179854522]
We propose Autoregressive Quantile Flows, a flexible class of probabilistic models over high-dimensional variables.
These models are instances of autoregressive flows trained using a novel objective based on proper scoring rules.
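The proper-scoring-rule objective for quantile models is related to the classical quantile (pinball) loss, which, averaged over quantile levels, recovers the CRPS. A hypothetical sketch:

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) loss for quantile level tau in (0, 1):
    penalizes under-prediction with weight tau and over-prediction
    with weight (1 - tau)."""
    diff = np.asarray(y, dtype=float) - np.asarray(q_pred, dtype=float)
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))
```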
arXiv Detail & Related papers (2021-12-09T01:11:26Z)
- Probabilistic Time Series Forecasting with Implicit Quantile Networks [0.7249731529275341]
We combine an autoregressive recurrent neural network to model temporal dynamics with Implicit Quantile Networks to learn a large class of distributions over a time-series target.
Our approach is favorable in terms of point-wise prediction accuracy as well as in estimating the underlying temporal distribution.
arXiv Detail & Related papers (2021-07-08T10:37:24Z)
- Quantifying Uncertainty in Deep Spatiotemporal Forecasting [67.77102283276409]
We describe two types of forecasting problems: regular grid-based and graph-based.
We analyze UQ methods from both the Bayesian and the frequentist points of view, cast in a unified framework via statistical decision theory.
Through extensive experiments on real-world road network traffic, epidemics, and air quality forecasting tasks, we reveal the statistical and computational trade-offs for different UQ methods.
arXiv Detail & Related papers (2021-05-25T14:35:46Z)
- Interpretable Social Anchors for Human Trajectory Forecasting in Crowds [84.20437268671733]
We propose a neural network-based system to predict human trajectory in crowds.
We learn interpretable rule-based intents, and then utilise the expressibility of neural networks to model the scene-specific residual.
Our architecture is tested on the interaction-centric benchmark TrajNet++.
arXiv Detail & Related papers (2021-05-07T09:22:34Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
However, they are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
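The entropy-raising idea can be illustrated by mixing an overconfident categorical prediction towards the label prior; `alpha` and the function names are illustrative stand-ins, not the paper's method for choosing where or how much to temper:

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a categorical distribution."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p + 1e-12))

def temper_towards_prior(probs, prior, alpha=0.5):
    """Raise predictive entropy by interpolating the predicted
    class probabilities towards the label prior with weight alpha."""
    probs = np.asarray(probs, dtype=float)
    prior = np.asarray(prior, dtype=float)
    mixed = (1.0 - alpha) * probs + alpha * prior
    return mixed / mixed.sum()
```

Mixing with a higher-entropy prior can only spread probability mass, so the tempered prediction is less confident than the original.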
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on their history.
We propose a deep state space model for probabilistic time series forecasting in which the non-linear emission and transition models are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
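The state-space construction in the last entry can be mimicked with a toy Monte Carlo sketch in which fixed nonlinearities stand in for the learned emission and transition networks; all names and parameter values here are illustrative:

```python
import numpy as np

def transition(z):
    # stand-in for the learned non-linear transition network f
    return np.tanh(0.9 * z)

def emission(z):
    # stand-in for the learned non-linear emission network g
    return 2.0 * z

def sample_forecast(z0, horizon, n_paths=100, noise=0.1, seed=0):
    """Monte Carlo forecast from a toy state space model:
    z_t = f(z_{t-1}) + eps_t,  y_t = g(z_t) + eta_t.
    The empirical distribution of y_t across paths forms the
    probabilistic forecast at each horizon step."""
    rng = np.random.default_rng(seed)
    paths = np.empty((n_paths, horizon))
    for i in range(n_paths):
        z = z0
        for t in range(horizon):
            z = transition(z) + noise * rng.standard_normal()
            paths[i, t] = emission(z) + noise * rng.standard_normal()
    return paths
```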
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.