Calibrated Multivariate Distributional Regression with Pre-Rank Regularization
- URL: http://arxiv.org/abs/2601.22895v1
- Date: Fri, 30 Jan 2026 12:13:47 GMT
- Title: Calibrated Multivariate Distributional Regression with Pre-Rank Regularization
- Authors: Aya Laajil, Elnura Zhalieva, Naomi Desobry, Souhaib Ben Taieb
- Abstract summary: We propose a regularization-based calibration method that enforces multivariate calibration during training of regression models. We introduce a novel PCA-based pre-rank that projects predictions onto principal directions of the predictive distribution.
- Score: 3.721528851694675
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The goal of probabilistic prediction is to issue predictive distributions that are as informative as possible, subject to being calibrated. Despite substantial progress in the univariate setting, achieving multivariate calibration remains challenging. Recent work has introduced pre-rank functions, scalar projections of multivariate forecasts and observations, as flexible diagnostics for assessing specific aspects of multivariate calibration, but their use has largely been limited to post-hoc evaluation. We propose a regularization-based calibration method that enforces multivariate calibration during training of multivariate distributional regression models using pre-rank functions. We further introduce a novel PCA-based pre-rank that projects predictions onto principal directions of the predictive distribution. Through simulation studies and experiments on 18 real-world multi-output regression datasets, we show that the proposed approach substantially improves multivariate pre-rank calibration without compromising predictive accuracy, and that the PCA pre-rank reveals dependence-structure misspecifications that are not detected by existing pre-ranks.
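The PCA-based pre-rank can be pictured concretely: project an ensemble of predictive samples and the observation onto the leading principal direction of the predictive distribution, then take the rank of the projected observation among the projected samples; under calibration, these ranks are uniform across test cases. The following NumPy sketch is illustrative only, not the authors' implementation, and the toy Gaussian setup is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_prerank(pred_samples, y_obs):
    """Rank of the observation among predictive samples after
    projecting onto the first principal direction of the samples."""
    centered = pred_samples - pred_samples.mean(axis=0)
    # Leading principal direction of the predictive ensemble.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    proj_samples = pred_samples @ direction
    proj_obs = y_obs @ direction
    # Rank lies in {0, ..., m}; uniform over test cases iff the
    # forecasts are calibrated with respect to this pre-rank.
    return int(np.sum(proj_samples < proj_obs))

# Calibrated toy case: observations drawn from the same 2-D Gaussian
# as the predictive ensemble, so ranks should look uniform.
m, d = 100, 2
ranks = []
for _ in range(2000):
    samples = rng.normal(size=(m, d))
    obs = rng.normal(size=d)
    ranks.append(pca_prerank(samples, obs))
ranks = np.array(ranks)
print(ranks.min() >= 0, ranks.max() <= m)
```

A histogram of such ranks (a multivariate analogue of a PIT histogram) would be flat for a calibrated model; the regularizer in the paper penalizes deviations from this uniformity during training.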
Related papers
- Enforcing Calibration in Multi-Output Probabilistic Regression with Pre-rank Regularization [4.065502917666599]
We introduce a general regularization framework to enforce multivariate calibration during training for arbitrary pre-rank functions. We show that our methods significantly improve calibration across all pre-rank functions without sacrificing predictive accuracy.
arXiv Detail & Related papers (2025-10-24T09:16:12Z)
- Improved probabilistic regression using diffusion models [16.918373481904755]
We propose a novel diffusion-based framework for probabilistic regression that learns predictive distributions in a nonparametric way. We investigate different noise parameterizations, analyze their trade-offs, and evaluate our framework across a broad range of regression tasks. For several experiments, our approach shows superior performance against existing baselines, while delivering calibrated uncertainty estimates.
arXiv Detail & Related papers (2025-10-06T08:36:05Z)
- Multivariate Latent Recalibration for Conditional Normalizing Flows [6.932606401614012]
Latent recalibration (LR) learns a transformation of the latent space with finite-sample bounds on latent calibration. LR consistently improves latent calibration error and the negative log-likelihood of the recalibrated models.
arXiv Detail & Related papers (2025-05-22T13:08:20Z)
- Pre-validation Revisited [79.92204034170092]
We show properties and benefits of pre-validation in prediction, inference and error estimation by simulations and applications. We propose not only an analytical distribution of the test statistic for the pre-validated predictor under certain models, but also a generic bootstrap procedure to conduct inference.
arXiv Detail & Related papers (2025-05-21T00:20:14Z)
- Domain-adaptive and Subgroup-specific Cascaded Temperature Regression for Out-of-distribution Calibration [16.930766717110053]
We propose a novel meta-set-based cascaded temperature regression method for post-hoc calibration.
We partition each meta-set into subgroups based on predicted category and confidence level, capturing diverse uncertainties.
A regression network is then trained to derive category-specific and confidence-level-specific scaling, achieving calibration across meta-sets.
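The cascaded, subgroup-specific scheme above builds on standard post-hoc temperature scaling, in which a scalar divides the logits before the softmax and is fit to minimize validation negative log-likelihood. As a hedged illustration of that base operation only (the toy data and the grid search are assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(logits, T=1.0):
    z = logits / T
    z -= z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    p = softmax(logits, T)
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 46)):
    """Grid-search the scalar temperature minimizing validation NLL."""
    return min(grid, key=lambda T: nll(logits, labels, T))

# Overconfident toy logits: the true signal is weak, but the logits
# are scaled up 4x, so the fitted temperature should exceed 1.
n, k = 2000, 5
labels = rng.integers(0, k, size=n)
base = rng.normal(size=(n, k))
base[np.arange(n), labels] += 1.0   # weak class signal
logits = 4.0 * base                 # overconfident scaling
T = fit_temperature(logits, labels)
print(T > 1.0)
```

The paper's method replaces the single global temperature with a regression network that predicts a temperature per subgroup (defined by predicted category and confidence level), so the calibration adapts across meta-sets.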
arXiv Detail & Related papers (2024-02-14T14:35:57Z)
- Calibration by Distribution Matching: Trainable Kernel Calibration Metrics [56.629245030893685]
We introduce kernel-based calibration metrics that unify and generalize popular forms of calibration for both classification and regression.
These metrics admit differentiable sample estimates, making it easy to incorporate a calibration objective into empirical risk minimization.
We provide intuitive mechanisms to tailor calibration metrics to a decision task, and to enforce accurate loss estimation and no-regret decisions.
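One way such a differentiable calibration objective can be built, sketched here under illustrative assumptions (Gaussian marginal predictions, an RBF kernel, plain NumPy standing in for an autodiff framework), is a kernel MMD between probability integral transform (PIT) values and uniform reference draws; this is a hedged sketch of the idea, not the paper's exact metric:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(1)

def rbf(a, b, bw=0.1):
    # Pairwise RBF kernel matrix between two 1-D samples.
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bw ** 2))

def mmd2(x, y, bw=0.1):
    """Biased squared MMD between two 1-D samples; zero iff the
    kernel mean embeddings coincide."""
    return rbf(x, x, bw).mean() + rbf(y, y, bw).mean() - 2 * rbf(x, y, bw).mean()

def pit_calibration_penalty(y, mu, sigma, n_ref=512):
    """PIT values of Gaussian predictions vs. uniform draws:
    a small MMD indicates well-calibrated marginals."""
    z = 0.5 * (1 + np.vectorize(erf)((y - mu) / (sigma * np.sqrt(2))))
    u = rng.uniform(size=n_ref)
    return mmd2(z, u)

# Calibrated predictions yield a smaller penalty than overconfident
# ones, whose PIT values pile up near 0 and 1.
n = 500
y = rng.normal(size=n)
good = pit_calibration_penalty(y, np.zeros(n), np.ones(n))
bad = pit_calibration_penalty(y, np.zeros(n), 0.3 * np.ones(n))
print(good < bad)
```

Because the penalty is a smooth function of the predicted means and scales, a term like this can be added to an empirical risk objective and minimized jointly with the predictive loss, which is the mechanism the paper exploits.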
arXiv Detail & Related papers (2023-10-31T06:19:40Z)
- Selective Nonparametric Regression via Testing [54.20569354303575]
We develop an abstention procedure via testing the hypothesis on the value of the conditional variance at a given point.
Unlike existing methods, the proposed one accounts not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor.
arXiv Detail & Related papers (2023-09-28T13:04:11Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty that accounts for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Sharp Calibrated Gaussian Processes [58.94710279601622]
State-of-the-art approaches for designing calibrated models rely on inflating the Gaussian process posterior variance.
We present a calibration approach that generates predictive quantiles using a computation inspired by the vanilla Gaussian process posterior variance.
Our approach is shown to yield a calibrated model under reasonable assumptions.
arXiv Detail & Related papers (2023-02-23T12:17:36Z)
- Individual Calibration with Randomized Forecasting [116.2086707626651]
We show that calibration for individual samples is possible in the regression setup if the predictions are randomized.
We design a training objective to enforce individual calibration and use it to train randomized regression functions.
arXiv Detail & Related papers (2020-06-18T05:53:10Z)
- Variational Variance: Simple, Reliable, Calibrated Heteroscedastic Noise Variance Parameterization [3.553493344868413]
We propose critiques, in the form of posterior predictive checks (PPCs), to test predictive mean and variance calibration and the predictive distribution's ability to generate sensible data.
We find that our solution, which treats heteroscedastic variance variationally, sufficiently regularizes variance to pass these PPCs.
arXiv Detail & Related papers (2020-06-08T19:58:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.