MD-split+: Practical Local Conformal Inference in High Dimensions
- URL: http://arxiv.org/abs/2107.03280v1
- Date: Wed, 7 Jul 2021 15:19:16 GMT
- Title: MD-split+: Practical Local Conformal Inference in High Dimensions
- Authors: Benjamin LeRoy and David Zhao
- Abstract summary: MD-split+ is a practical local conformal approach that creates X partitions based on localized model performance.
We discuss how our local partitions philosophically align with expected behavior from an unattainable conditional conformal inference approach.
- Score: 0.5439020425819
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantifying uncertainty in model predictions is a common goal for
practitioners seeking more than just point predictions. One tool for
uncertainty quantification that requires minimal assumptions is conformal
inference, which can help create probabilistically valid prediction regions for
black box models. Classical conformal prediction only provides marginal
validity, whereas in many situations locally valid prediction regions are
desirable. Deciding how best to partition the feature space X when applying
localized conformal prediction is still an open question. We present MD-split+,
a practical local conformal approach that creates X partitions based on
localized model performance of conditional density estimation models. Our
method handles complex real-world data settings where such models may be
misspecified, and scales to high-dimensional inputs. We discuss how our local
partitions philosophically align with expected behavior from an unattainable
conditional conformal inference approach. We also empirically compare our
method against other local conformal approaches.
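As a rough illustration of the partition-then-calibrate idea, the sketch below applies split conformal calibration separately within each cell of a given feature-space partition. It is a minimal sketch using absolute residuals as conformity scores and an arbitrary partition labelling; MD-split+'s actual partitions, which are driven by localized performance of conditional density estimation models, are not reproduced here.

```python
# Minimal sketch of partition-based local split conformal prediction.
# Conformity score: absolute residual |y - mu_hat(x)|. The partition labels are
# taken as given; MD-split+'s partitions (built from localized performance of
# conditional density estimators) are NOT implemented here.
import numpy as np

def local_split_conformal(cell_calib, resid_calib, cell_test, alpha=0.1):
    """Return half-widths of locally calibrated intervals for the test points.

    cell_calib  : (n,) int array, partition cell of each calibration point
    resid_calib : (n,) array, |y_i - mu_hat(x_i)| on the calibration set
    cell_test   : (m,) int array, partition cell of each test point
    """
    half_widths = np.full(len(cell_test), np.inf)  # cells unseen in calibration stay trivial
    for c in np.unique(cell_calib):
        scores = resid_calib[cell_calib == c]
        n_c = len(scores)
        # finite-sample corrected quantile level for coverage within the cell
        level = min(1.0, np.ceil((n_c + 1) * (1 - alpha)) / n_c)
        half_widths[cell_test == c] = np.quantile(scores, level)
    return half_widths  # prediction interval: mu_hat(x) +/- half_width
```

With a trained regressor mu_hat, the interval for a test point x is mu_hat(x) ± half_width; the choice of partition is precisely where methods such as MD-split+ differ from simpler localized schemes.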
Related papers
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
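One generic way to pair a conditional distribution estimate with conformal calibration, sketched below, is to use the estimated conditional density as the conformity score and keep every candidate response whose density clears a conformally chosen cut-off. This is a hedged illustration of the general density-based construction, not necessarily the method of the paper above; `f_hat` is a hypothetical fitted density estimator.

```python
# Sketch: density-based conformal prediction sets from an estimated conditional
# density f_hat(x, y). `f_hat` is a hypothetical fitted CDE; any estimator with
# this call signature would do. Illustrative only.
import numpy as np

def density_level_sets(f_hat, X_calib, y_calib, X_test, y_grid, alpha=0.1):
    """Boolean mask of shape (len(X_test), len(y_grid)) marking each prediction set."""
    # conformity score: estimated density at the observed response
    scores = np.array([f_hat(x, y) for x, y in zip(X_calib, y_calib)])
    n = len(scores)
    # conformal cut-off: roughly the alpha-quantile of the calibration scores
    k = int(np.floor((n + 1) * alpha))
    t = -np.inf if k == 0 else np.sort(scores)[k - 1]
    # keep every candidate y whose estimated density clears the cut-off
    return np.array([[f_hat(x, y) >= t for y in y_grid] for x in X_test])
```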
- From Conformal Predictions to Confidence Regions [1.4272411349249627]
We introduce CCR, which employs a combination of conformal prediction intervals for the model outputs to establish confidence regions for model parameters.
We present coverage guarantees that hold under minimal assumptions on the noise and remain valid in the finite-sample regime.
Our approach is applicable to both split conformal prediction and black-box methodologies, including full or cross-conformal approaches.
arXiv Detail & Related papers (2024-05-28T21:33:12Z)
- Guarantee Regions for Local Explanations [29.429229877959663]
We propose an anchor-based algorithm for identifying regions in which local explanations are guaranteed to be correct.
Our method produces an interpretable feature-aligned box where the prediction of the local surrogate model is guaranteed to match the predictive model.
arXiv Detail & Related papers (2024-02-20T06:04:44Z)
- Regression Trees for Fast and Adaptive Prediction Intervals [2.6763498831034043]
We present a family of methods to calibrate prediction intervals for regression problems with local coverage guarantees.
We create a partition by training regression trees and Random Forests on conformity scores.
Our proposal is versatile, as it applies to various conformity scores and prediction settings.
arXiv Detail & Related papers (2024-02-12T01:17:09Z)
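The tree-based partition described above can be sketched as follows: fit a regression tree to conformity scores on the calibration set, treat its leaves as partition cells, and calibrate one quantile per leaf. The sketch assumes scikit-learn and absolute residuals as scores; it is illustrative, not the authors' implementation.

```python
# Sketch: partition via a regression tree fit on conformity scores, then
# calibrate one conformal quantile per leaf. Illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def tree_partition_intervals(X_calib, resid_calib, X_test, alpha=0.1, min_leaf=100):
    # leaves group points whose conformity scores behave similarly
    tree = DecisionTreeRegressor(min_samples_leaf=min_leaf).fit(X_calib, resid_calib)
    leaf_calib, leaf_test = tree.apply(X_calib), tree.apply(X_test)
    half_widths = np.empty(len(X_test))
    for leaf in np.unique(leaf_calib):
        scores = resid_calib[leaf_calib == leaf]
        n_leaf = len(scores)
        level = min(1.0, np.ceil((n_leaf + 1) * (1 - alpha)) / n_leaf)
        half_widths[leaf_test == leaf] = np.quantile(scores, level)
    return half_widths  # prediction interval: mu_hat(x) +/- half_width
```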
- Distribution-Free Conformal Joint Prediction Regions for Neural Marked Temporal Point Processes [4.324839843326325]
We develop more reliable methods for uncertainty quantification in neural TPP models via the framework of conformal prediction.
A primary objective is to generate a distribution-free joint prediction region for an event's arrival time and mark, with a finite-sample marginal coverage guarantee.
arXiv Detail & Related papers (2024-01-09T15:28:29Z)
- Multi-Modal Conformal Prediction Regions with Simple Structures by Optimizing Convex Shape Templates [19.504348671777006]
Conformal prediction is a statistical tool for producing prediction regions for machine learning models that are valid with high probability.
A key component of conformal prediction algorithms is a non-conformity score function that quantifies how different a model's prediction is from the unknown ground truth value.
We propose a method that optimizes parameterized shape template functions over calibration data, which results in non-conformity score functions that produce prediction regions with minimum volume.
arXiv Detail & Related papers (2023-12-12T17:00:13Z)
- Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error, we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines, allowing reliable black-box posterior inference.
arXiv Detail & Related papers (2023-10-20T10:20:45Z)
- RbX: Region-based explanations of prediction models [69.3939291118954]
Region-based explanations (RbX) is a model-agnostic method to generate local explanations of scalar outputs from a black-box prediction model.
RbX is guaranteed to satisfy a "sparsity axiom," which requires that features which do not enter into the prediction model are assigned zero importance.
arXiv Detail & Related papers (2022-10-17T03:38:06Z)
- Predictive Inference with Feature Conformal Prediction [80.77443423828315]
We propose feature conformal prediction, which extends the scope of conformal prediction to semantic feature spaces.
From a theoretical perspective, we demonstrate that feature conformal prediction provably outperforms regular conformal prediction under mild assumptions.
Our approach could be combined with not only vanilla conformal prediction, but also other adaptive conformal prediction methods.
arXiv Detail & Related papers (2022-10-01T02:57:37Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when using them in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
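For the ensemble-based branch mentioned above, a common recipe (illustrative only, not specific to this paper) is to average predictions across independently trained models and use their disagreement as an uncertainty estimate; the `.predict` interface below is an assumed convention.

```python
# Sketch: ensemble-based uncertainty from K independently trained predictors.
# Generic illustration of the "ensemble based" branch; not this paper's architecture.
import numpy as np

def ensemble_predict(models, X):
    """models: list of fitted regressors exposing .predict(X) (assumed interface)."""
    preds = np.stack([m.predict(X) for m in models])  # shape (K, n)
    mean = preds.mean(axis=0)                          # point prediction
    std = preds.std(axis=0, ddof=1)                    # disagreement as an uncertainty proxy
    return mean, std
```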
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.