Federated Conditional Conformal Prediction via Generative Models
- URL: http://arxiv.org/abs/2510.13297v2
- Date: Mon, 20 Oct 2025 08:59:25 GMT
- Title: Federated Conditional Conformal Prediction via Generative Models
- Authors: Rui Xu, Xingyuan Chen, Wenxing Huang, Minxuan Huang, Yun Xie, Weiyan Chen, Sihong Xie
- Abstract summary: Conformal Prediction (CP) provides distribution-free uncertainty quantification. Federated Conditional Conformal Prediction (Fed-CCP) aims for conditional coverage that adapts to local data heterogeneity. Experiments on real datasets demonstrate that Fed-CCP achieves more adaptive prediction sets.
- Score: 12.463514743585515
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conformal Prediction (CP) provides distribution-free uncertainty quantification by constructing prediction sets that guarantee coverage of the true labels. This reliability makes CP valuable for high-stakes federated learning scenarios such as multi-center healthcare. However, standard CP assumes i.i.d. data, which is violated in federated settings where client distributions differ substantially. Existing federated CP methods address this by maintaining marginal coverage on each client, but such guarantees often fail to reflect input-conditional uncertainty. In this work, we propose Federated Conditional Conformal Prediction (Fed-CCP) via generative models, which aims for conditional coverage that adapts to local data heterogeneity. Fed-CCP leverages generative models, such as normalizing flows or diffusion models, to approximate conditional data distributions without requiring the sharing of raw data. This enables each client to locally calibrate conformal scores that reflect its unique uncertainty, while preserving global consistency through federated aggregation. Experiments on real datasets demonstrate that Fed-CCP achieves more adaptive prediction sets.
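To make the calibration step described in the abstract concrete, the sketch below shows standard split conformal prediction for regression, the baseline procedure that Fed-CCP builds on. This is a minimal illustration of marginal coverage only, not the paper's federated or conditional method; the function name and the synthetic data are assumptions for demonstration.

```python
import numpy as np

def split_conformal_interval(cal_preds, cal_labels, test_pred, alpha=0.1):
    """Split conformal prediction interval for a regression model.

    Uses absolute residuals on a held-out calibration set as
    nonconformity scores and returns an interval with marginal
    coverage >= 1 - alpha (under exchangeability).
    """
    n = len(cal_labels)
    scores = np.abs(cal_labels - cal_preds)  # nonconformity scores
    # Finite-sample-corrected quantile level: ceil((n+1)(1-alpha)) / n
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q_hat = np.quantile(scores, q_level, method="higher")
    return test_pred - q_hat, test_pred + q_hat

# Synthetic calibration set: model predictions plus noisy labels
rng = np.random.default_rng(0)
cal_preds = rng.normal(size=99)
cal_labels = cal_preds + rng.normal(scale=0.5, size=99)

lo, hi = split_conformal_interval(cal_preds, cal_labels, test_pred=2.0)
```

Fed-CCP departs from this baseline in two ways: the calibration quantile is made input-conditional via a generative model of the local data distribution, and the clients aggregate their calibration information through federated communication instead of pooling raw residuals.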
Related papers
- ST-BCP: Tightening Coverage Bound for Backward Conformal Prediction via Non-Conformity Score Transformation [18.272247805086284]
Conformal Prediction (CP) provides a statistical framework for uncertainty quantification that constructs prediction sets with coverage guarantees. BCP inverts this paradigm by enforcing a predefined upper bound on set size and estimating the resulting coverage guarantee. We introduce ST-BCP, a novel method that applies a data-dependent transformation of nonconformity scores to narrow the coverage gap.
arXiv Detail & Related papers (2026-02-02T07:18:35Z) - Uncertainty Quantification for Named Entity Recognition via Full-Sequence and Subsequence Conformal Prediction [0.0]
We introduce a general framework for adapting sequence-labeling-based NER models to produce uncertainty-aware prediction sets. Prediction sets are collections of full-sentence labelings guaranteed to contain the correct labeling with a user-specified confidence level.
arXiv Detail & Related papers (2026-01-13T18:00:08Z) - Distribution-informed Online Conformal Prediction [53.674678995825666]
We propose Conformal Optimistic Prediction (COP), an online conformal prediction algorithm that incorporates the underlying data pattern into the update rule. COP produces tighter prediction sets when a predictable pattern exists, while retaining valid coverage guarantees even when estimates are inaccurate. We prove that COP achieves valid coverage and constructs shorter prediction intervals than other baselines.
arXiv Detail & Related papers (2025-12-08T17:51:49Z) - FedCF: Fair Federated Conformal Prediction [4.145290936792853]
We extend the Conformal Fairness (CF) framework to the Federated Learning setting and discuss how to audit a federated model for fairness. We empirically validate our framework by conducting experiments on several datasets spanning multiple domains.
arXiv Detail & Related papers (2025-09-26T20:35:22Z) - COIN: Uncertainty-Guarding Selective Question Answering for Foundation Models with Provable Risk Guarantees [51.5976496056012]
COIN is an uncertainty-guarding selection framework that calibrates statistically valid thresholds to filter a single generated answer per question. COIN estimates the empirical error rate on a calibration set and applies confidence interval methods to establish a high-probability upper bound on the true error rate. We demonstrate COIN's robustness in risk control, strong test-time power in retaining admissible answers, and predictive efficiency under limited calibration data.
arXiv Detail & Related papers (2025-06-25T07:04:49Z) - Distributed Conformal Prediction via Message Passing [33.306901198295016]
Conformal Prediction (CP) offers a robust post-hoc calibration framework. We propose two message-passing-based approaches for achieving reliable inference via CP.
arXiv Detail & Related papers (2025-01-24T14:47:42Z) - Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores [52.92618442300405]
It is impossible to achieve exact, distribution-free conditional coverage in finite samples. We propose an alternative conformal prediction algorithm that targets coverage where it matters most.
arXiv Detail & Related papers (2025-01-17T12:01:56Z) - Certifiably Byzantine-Robust Federated Conformal Prediction [49.23374238798428]
We introduce Rob-FCP, a novel framework that performs robust federated conformal prediction, effectively countering malicious clients.
We empirically demonstrate the robustness of Rob-FCP against diverse proportions of malicious clients under a variety of Byzantine attacks.
arXiv Detail & Related papers (2024-06-04T04:43:30Z) - Efficient Conformal Prediction under Data Heterogeneity [79.35418041861327]
Conformal Prediction (CP) stands out as a robust framework for uncertainty quantification.
Existing approaches for tackling non-exchangeability lead to methods that are not computable beyond the simplest examples.
This work introduces a new efficient approach to CP that produces provably valid confidence sets for fairly general non-exchangeable data distributions.
arXiv Detail & Related papers (2023-12-25T20:02:51Z) - Federated Conformal Predictors for Distributed Uncertainty Quantification [83.50609351513886]
Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning.
In this paper, we extend conformal prediction to the federated learning setting.
We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction framework.
arXiv Detail & Related papers (2023-05-27T19:57:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.