Conditional Forecasts and Proper Scoring Rules for Reliable and Accurate Performative Predictions
- URL: http://arxiv.org/abs/2510.21335v1
- Date: Fri, 24 Oct 2025 10:59:21 GMT
- Title: Conditional Forecasts and Proper Scoring Rules for Reliable and Accurate Performative Predictions
- Authors: Philip Boeken, Onno Zoeter, Joris M. Mooij
- Abstract summary: We show that conditioning forecasts on covariates that separate them from the outcome renders the target distribution forecast-invariant. We identify two solutions: (i) in decision-theoretic settings, elicitation of correct and incentive-compatible forecasts is possible if forecasts are separating; (ii) scoring with unbiased estimates of the divergence between the forecast and the induced distribution of the target variable yields correct forecasts. Our results expose fundamental limits of classical forecast evaluation and offer new tools for reliable and accurate forecasting in performative settings.
- Score: 1.1087735229999816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Performative predictions are forecasts which influence the outcomes they aim to predict, undermining the existence of correct forecasts and standard methods of elicitation and estimation. We show that conditioning forecasts on covariates that separate them from the outcome renders the target distribution forecast-invariant, guaranteeing well-posedness of the forecasting problem. However, even under this condition, classical proper scoring rules fail to elicit correct forecasts. We prove a general impossibility result and identify two solutions: (i) in decision-theoretic settings, elicitation of correct and incentive-compatible forecasts is possible if forecasts are separating; (ii) scoring with unbiased estimates of the divergence between the forecast and the induced distribution of the target variable yields correct forecasts. Applying these insights to parameter estimation, conditional forecasts and proper scoring rules enable performatively stable estimation of performatively correct parameters, resolving the issues raised by Perdomo et al. (2020). Our results expose fundamental limits of classical forecast evaluation and offer new tools for reliable and accurate forecasting in performative settings.
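To make the failure mode concrete, here is a toy sketch (our own construction, not a model from the paper): a binary outcome whose probability depends on the published forecast. Minimizing the expected Brier score over reports does not recover the performatively correct (fixed-point) forecast, whereas minimizing the divergence between the forecast and the induced distribution does.

```python
# Toy performative setup (our assumption, not the paper's model):
# publishing forecast f shifts the outcome probability to p(f).
def induced_prob(f):
    return 0.2 + 0.5 * f

# The performatively correct forecast is the fixed point f* = p(f*),
# here f* = 0.4 (solving f = 0.2 + 0.5 f).
f_star = 0.4

def expected_brier(f):
    # Expected Brier score of report f under the distribution it induces.
    p = induced_prob(f)
    return p * (1 - f) ** 2 + (1 - p) * f ** 2

grid = [i / 1000 for i in range(1001)]
brier_opt = min(grid, key=expected_brier)
# Scoring with the (squared) divergence between the forecast and the
# induced distribution instead recovers the fixed point.
div_opt = min(grid, key=lambda f: (f - induced_prob(f)) ** 2)

print(brier_opt, div_opt)  # the Brier-optimal report is not f*; div_opt is
```

The Brier-optimal report drifts away from the fixed point because the report itself moves the target distribution, which is exactly the impossibility the paper formalizes.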
Related papers
- Distribution-informed Online Conformal Prediction [53.674678995825666]
We propose Conformal Optimistic Prediction (COP), an online conformal prediction algorithm incorporating underlying data pattern into the update rule. COP produces tighter prediction sets when predictable pattern exists, while retaining valid coverage guarantees even when estimates are inaccurate. We prove that COP can achieve valid coverage and construct shorter prediction intervals than other baselines.
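COP's specific update rule is defined in the paper; as background, the split-conformal baseline that online methods like COP refine can be sketched as follows (a generic sketch with the standard finite-sample quantile correction):

```python
import math

def split_conformal_halfwidth(residuals, alpha=0.1):
    # Half-width of a split-conformal prediction interval: the
    # ceil((n+1)(1-alpha))-th smallest absolute calibration residual.
    scores = sorted(abs(r) for r in residuals)
    n = len(scores)
    k = math.ceil((n + 1) * (1 - alpha)) - 1  # 0-indexed rank
    return scores[min(k, n - 1)]

# The interval for a new point is then [y_hat - q, y_hat + q].
q = split_conformal_halfwidth([i / 10 for i in range(1, 100)], alpha=0.1)
```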
arXiv Detail & Related papers (2025-12-08T17:51:49Z) - Probabilistic bias adjustment of seasonal predictions of Arctic Sea Ice Concentration [0.0]
Seasonal prediction systems often show biases and complex spatio-temporal forecast errors. We introduce a probabilistic error correction framework based on a conditional Variational Autoencoder model. We show that the adjusted forecasts are better calibrated to the observational distribution and have smaller errors than climatological-mean-adjusted forecasts.
arXiv Detail & Related papers (2025-10-10T22:17:29Z) - Truthful Elicitation of Imprecise Forecasts [11.153198087930756]
We propose a framework for scoring imprecise forecasts -- forecasts given as a set of beliefs. We show that truthful elicitation of imprecise forecasts is achievable using proper scoring rules randomized over the aggregation procedure.
arXiv Detail & Related papers (2025-03-20T17:53:35Z) - Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores [52.92618442300405]
It is impossible to achieve exact, distribution-free conditional coverage in finite samples. We propose an alternative conformal prediction algorithm that targets coverage where it matters most.
arXiv Detail & Related papers (2025-01-17T12:01:56Z) - Consistency Checks for Language Model Forecasters [54.62507816753479]
We measure the performance of forecasters in terms of the consistency of their predictions on different logically-related questions. We build an automated evaluation system that generates a set of base questions, instantiates consistency checks from these questions, elicits predictions from the forecaster, and measures the consistency of the predictions.
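A minimal example of such a check (our own illustration: negation consistency, one of the simplest logically-related question pairs):

```python
def negation_consistent(p_a, p_not_a, tol=0.05):
    # A coherent forecaster's probabilities for a question and its
    # negation should sum to (approximately) 1.
    return abs(p_a + p_not_a - 1.0) <= tol

print(negation_consistent(0.7, 0.3))  # consistent
print(negation_consistent(0.7, 0.5))  # violates the check
```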
arXiv Detail & Related papers (2024-12-24T16:51:35Z) - Hybrid Forecasting of Geopolitical Events [71.73737011120103]
SAGE is a hybrid forecasting system that combines human- and machine-generated forecasts. The system aggregates human and machine forecasts, weighting both for propinquity and assessed skill. We show that skilled forecasters who had access to machine-generated forecasts outperformed those who only viewed historical data.
arXiv Detail & Related papers (2024-12-14T22:09:45Z) - Bin-Conditional Conformal Prediction of Fatalities from Armed Conflict [0.5312303275762104]
We introduce bin-conditional conformal prediction (BCCP), which enhances standard conformal prediction by ensuring consistent coverage rates across user-defined subsets. Compared to standard conformal prediction, BCCP offers improved local coverage, though this comes at the cost of slightly wider prediction intervals.
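The core idea can be sketched as grouping calibration points by user-defined bins and computing a separate split-conformal quantile per bin (a sketch of the idea only; the paper's exact construction may differ):

```python
import math
from collections import defaultdict

def bin_conditional_quantiles(preds, residuals, bin_edges, alpha=0.1):
    # Group calibration residuals by which bin the prediction falls into,
    # then compute the standard split-conformal quantile within each bin.
    bins = defaultdict(list)
    for p, r in zip(preds, residuals):
        bins[sum(p >= e for e in bin_edges)].append(abs(r))
    quantiles = {}
    for b, scores in bins.items():
        scores.sort()
        n = len(scores)
        k = math.ceil((n + 1) * (1 - alpha)) - 1
        quantiles[b] = scores[min(k, n - 1)]
    return quantiles

# Toy data: low predictions have small errors, high predictions large ones,
# so the per-bin quantiles (and hence interval widths) differ.
qs = bin_conditional_quantiles([0.1] * 50 + [0.9] * 50,
                               [0.1] * 50 + [1.0] * 50,
                               bin_edges=[0.5])
```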
arXiv Detail & Related papers (2024-10-18T14:41:42Z) - Self-Calibrating Conformal Prediction [16.606421967131524]
We introduce Self-Calibrating Conformal Prediction to deliver calibrated point predictions alongside prediction intervals with finite-sample validity conditional on these predictions.
We show that our method improves calibrated interval efficiency through model calibration and offers a practical alternative to feature-conditional validity.
arXiv Detail & Related papers (2024-02-11T21:12:21Z) - Predictive Inference with Feature Conformal Prediction [80.77443423828315]
We propose feature conformal prediction, which extends the scope of conformal prediction to semantic feature spaces.
From a theoretical perspective, we demonstrate that feature conformal prediction provably outperforms regular conformal prediction under mild assumptions.
Our approach could be combined with not only vanilla conformal prediction, but also other adaptive conformal prediction methods.
arXiv Detail & Related papers (2022-10-01T02:57:37Z) - Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to quantify forecasting uncertainty using Bayesian approximation, capturing uncertainty that deterministic approaches fail to capture.
The effect of dropout weights and long-term prediction on future state uncertainty has been studied.
arXiv Detail & Related papers (2022-05-04T04:23:38Z) - Optimal reconciliation with immutable forecasts [9.25906680708985]
We formulate a reconciliation methodology that keeps forecasts of a pre-specified subset of variables unchanged, or "immutable".
We prove that our approach preserves unbiasedness in base forecasts.
Our method can also account for correlations between base forecasting errors and ensure non-negativity of forecasts.
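For a one-level hierarchy, the simplest instance of the idea looks like this (a minimal sketch under our own equal-weight assumption, not the paper's general projection):

```python
def reconcile_with_immutable_total(total, children):
    # Hold the total forecast immutable and spread the aggregation gap
    # equally over the child forecasts (a least-squares adjustment).
    gap = total - sum(children)
    adjustment = gap / len(children)
    return [c + adjustment for c in children]

reconciled = reconcile_with_immutable_total(100.0, [40.0, 50.0])
print(reconciled)  # children now sum to the immutable total
```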
arXiv Detail & Related papers (2022-04-20T05:23:31Z) - Evaluation of Machine Learning Techniques for Forecast Uncertainty Quantification [0.13999481573773068]
Ensemble forecasting is, so far, the most successful approach to produce relevant forecasts along with an estimation of their uncertainty.
Main limitations of ensemble forecasting are the high computational cost and the difficulty to capture and quantify different sources of uncertainty.
In this work, proof-of-concept model experiments are conducted to examine the performance of ANNs trained to predict a corrected state of the system and the state uncertainty using only a single deterministic forecast as input.
arXiv Detail & Related papers (2021-11-29T16:52:17Z) - CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.