Conformal changepoint localization
- URL: http://arxiv.org/abs/2602.06267v1
- Date: Thu, 05 Feb 2026 23:50:53 GMT
- Title: Conformal changepoint localization
- Authors: Rohan Hore, Aaditya Ramdas
- Abstract summary: We study the problem of offline changepoint localization in a distribution-free setting. We produce a finite-sample confidence set for the index at which the change occurs without making any other assumptions. Experiments suggest that CONCH delivers precise confidence sets even in challenging settings involving images or text.
- Score: 36.191356601153146
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the problem of offline changepoint localization in a distribution-free setting. One observes a vector of data with a single changepoint, assuming that the data before and after the changepoint are iid (or more generally exchangeable) from arbitrary and unknown distributions. The goal is to produce a finite-sample confidence set for the index at which the change occurs without making any other assumptions. Existing methods often rely on parametric assumptions, tail conditions, or asymptotic approximations, or only produce point estimates. In contrast, our distribution-free algorithm, CONformal CHangepoint localization (CONCH), only leverages exchangeability arguments to construct confidence sets with finite sample coverage. By proving a conformal Neyman-Pearson lemma, we derive principled score functions that yield informative (small) sets. Moreover, with such score functions, the normalized length of the confidence set shrinks to zero under weak assumptions. We also establish a universality result showing that any distribution-free changepoint localization method must be an instance of CONCH. Experiments suggest that CONCH delivers precise confidence sets even in challenging settings involving images or text.
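The abstract's recipe, testing each candidate index via an exchangeability argument and inverting the tests into a confidence set, can be illustrated with a toy permutation analogue. This is a simplified sketch under an assumed Gaussian mean-shift model with a naive CUSUM-distance statistic; it is not the authors' CONCH algorithm or its Neyman-Pearson-derived score functions.

```python
import numpy as np

def cusum_argmax(x):
    """Index k in 1..n-1 maximizing sqrt(k(n-k)/n) * |mean(x[:k]) - mean(x[k:])|."""
    n = len(x)
    k = np.arange(1, n)
    head = np.cumsum(x)[:-1]
    stat = np.sqrt(k * (n - k) / n) * np.abs(head / k - (x.sum() - head) / (n - k))
    return int(np.argmax(stat)) + 1

def changepoint_confidence_set(x, alpha=0.1, n_perm=199, seed=0):
    """Keep every candidate k whose hypothesis 'the change occurs at k'
    survives a within-segment permutation test at level alpha: under that
    hypothesis each segment is exchangeable, so permuting within the two
    segments preserves the null distribution of the statistic."""
    rng = np.random.default_rng(seed)
    n, khat, conf_set = len(x), cusum_argmax(x), []
    for k in range(1, n):
        obs = abs(khat - k)  # distance of the CUSUM point estimate from k
        exceed = sum(
            abs(cusum_argmax(np.concatenate(
                [rng.permutation(x[:k]), rng.permutation(x[k:])])) - k) >= obs
            for _ in range(n_perm))
        if (1 + exceed) / (1 + n_perm) > alpha:
            conf_set.append(k)
    return conf_set

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 20), rng.normal(3.0, 1.0, 20)])
print(changepoint_confidence_set(x))
```

Note that the CUSUM point estimate itself is always retained (its observed distance is zero, so no permutation can fall strictly below it), which is why a confidence set rather than a point estimate comes out of the inversion.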
Related papers
- Theoretical guarantees for change localization using conformal p-values [36.191356601153146]
We provide theoretical guarantees for a method to achieve distribution-free changepoint localization with finite-sample validity. We also present various finite-sample properties of the conformal $p$-value in the distribution-change setup. Our contributions offer a comprehensive and theoretically principled approach to distribution-free changepoint inference.
arXiv Detail & Related papers (2025-10-09T19:05:47Z) - Post-detection inference for sequential changepoint localization [29.43493007296859]
We develop a framework to construct confidence sets for the unknown changepoint using only the data observed up to a data-dependent stopping time at which an arbitrary sequential detection algorithm declares a change. Our framework is nonparametric, making no assumptions on the composite post-change class, the observation space, or the sequential detection procedure used, and is nonasymptotically valid.
arXiv Detail & Related papers (2025-02-10T02:01:30Z) - SoftCVI: Contrastive variational inference with self-generated soft labels [2.5398014196797614]
Variational inference and Markov chain Monte Carlo methods are the predominant tools for posterior inference. We introduce Soft Contrastive Variational Inference (SoftCVI), which allows a family of variational objectives to be derived through a contrastive estimation framework. We find that SoftCVI can be used to form objectives that are stable to train and mass-covering, frequently outperforming inference with other variational approaches.
arXiv Detail & Related papers (2024-07-22T14:54:12Z) - Predicting Out-of-Domain Generalization with Neighborhood Invariance [59.05399533508682]
We propose a measure of a classifier's output invariance in a local transformation neighborhood.
Our measure is simple to calculate, does not depend on the test point's true label, and can be applied even in out-of-domain (OOD) settings.
In experiments on benchmarks in image classification, sentiment analysis, and natural language inference, we demonstrate a strong and robust correlation between our measure and actual OOD generalization.
arXiv Detail & Related papers (2022-07-05T14:55:16Z) - Deep learning model solves change point detection for multiple change types [69.77452691994712]
Change point detection aims to catch abrupt shifts in a data distribution.
We propose an approach that works in the multiple-distributions scenario.
arXiv Detail & Related papers (2022-04-15T09:44:21Z) - E-detectors: a nonparametric framework for sequential change detection [86.15115654324488]
We develop a fundamentally new and general framework for sequential change detection.
Our procedures come with clean, nonasymptotic bounds on the average run length.
We show how to design their mixtures in order to achieve both statistical and computational efficiency.
arXiv Detail & Related papers (2022-03-07T17:25:02Z) - Stochastic Saddle Point Problems with Decision-Dependent Distributions [0.6091702876917279]
This paper focuses on saddle point problems with decision-dependent distributions in both the static and time-varying settings.
We introduce the notion of equilibrium points -- which are saddle points for the stationary minimax problem.
We show that primal-dual algorithms converge to saddle points in a similar fashion.
arXiv Detail & Related papers (2022-01-07T03:36:41Z) - Cross-validation for change-point regression: pitfalls and solutions [0.0]
We show that the problems of cross-validation with squared error loss are more severe and can lead to systematic under- or over-estimation of the number of change-points.
We propose two simple approaches to remedy these issues, the first involving the use of absolute error rather than squared error loss.
We show these conditions are satisfied for least squares estimation using new results on its performance when supplied with the incorrect number of change-points.
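The held-out comparison this entry describes can be sketched as follows: a piecewise-constant fit is chosen on even-indexed points and scored on odd-indexed points under absolute error. The interleaved split, the exact dynamic program, and the synthetic data are illustrative assumptions, not the paper's precise procedure.

```python
import numpy as np

def fit_segments(x, n_seg):
    """Exact dynamic program: piecewise-constant fit with n_seg segments
    minimizing within-segment squared error; returns segment bounds [i, j)."""
    n = len(x)
    cs = np.concatenate([[0.0], np.cumsum(x)])
    cs2 = np.concatenate([[0.0], np.cumsum(x ** 2)])

    def sse(i, j):  # squared-error cost of fitting one mean to x[i:j]
        s = cs[j] - cs[i]
        return cs2[j] - cs2[i] - s * s / (j - i)

    cost = np.full((n_seg + 1, n + 1), np.inf)
    back = np.zeros((n_seg + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for m in range(1, n_seg + 1):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                c = cost[m - 1, i] + sse(i, j)
                if c < cost[m, j]:
                    cost[m, j], back[m, j] = c, i
    bounds, j = [], n
    for m in range(n_seg, 0, -1):
        i = back[m, j]
        bounds.append((i, j))
        j = i
    return bounds[::-1]

def cv_loss(x, n_seg, loss):
    """Fit on even-indexed points, evaluate `loss` on the interleaved
    odd-indexed points (each test point inherits its segment's mean)."""
    train, test = x[::2], x[1::2]
    total, count = 0.0, 0
    for i, j in fit_segments(train, n_seg):
        seg_test = test[i:min(j, len(test))]
        total += loss(seg_test - train[i:j].mean()).sum()
        count += len(seg_test)
    return total / count

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(4.0, 1.0, 30)])
losses = {m: cv_loss(x, m, np.abs) for m in (1, 2, 3, 4)}
print(min(losses, key=losses.get))
```

Swapping `np.abs` for `np.square` reproduces the squared-error variant whose pitfalls the paper analyzes.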
arXiv Detail & Related papers (2021-12-06T18:23:12Z) - Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization [93.78811018928583]
This paper provides a framework to analyze the convergence of federated heterogeneous optimization algorithms.
We propose FedNova, a normalized averaging method that eliminates objective inconsistency while preserving fast error convergence.
arXiv Detail & Related papers (2020-07-15T05:01:23Z) - Distribution-free binary classification: prediction sets, confidence intervals and calibration [106.50279469344937]
We study three notions of uncertainty quantification -- calibration, confidence intervals and prediction sets -- for binary classification in the distribution-free setting.
We derive confidence intervals for binned probabilities for both fixed-width and uniform-mass binning.
As a consequence of our 'tripod' theorems, these confidence intervals for binned probabilities lead to distribution-free calibration.
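A minimal sketch of fixed-width binning with distribution-free per-bin confidence intervals, assuming a Hoeffding bound with a union-bound correction over bins rather than the paper's exact construction; the scores and labels are synthetic.

```python
import numpy as np

def binned_calibration_intervals(scores, labels, n_bins=10, alpha=0.05):
    """Fixed-width binning of scores in [0, 1]; each bin gets a
    distribution-free (though conservative) Hoeffding interval for the
    conditional probability of label 1, simultaneously valid at level alpha."""
    bins = np.clip((scores * n_bins).astype(int), 0, n_bins - 1)
    intervals = {}
    for b in range(n_bins):
        y = labels[bins == b]
        if len(y) == 0:
            continue  # empty bins carry no information
        m = y.mean()
        # Hoeffding half-width at level alpha / n_bins (union bound over bins)
        h = np.sqrt(np.log(2 * n_bins / alpha) / (2 * len(y)))
        intervals[b] = (max(0.0, m - h), min(1.0, m + h))
    return intervals

rng = np.random.default_rng(0)
scores = rng.uniform(0.0, 1.0, 5000)
labels = rng.binomial(1, scores)  # perfectly calibrated synthetic scores
ivals = binned_calibration_intervals(scores, labels)
```

Shrinking half-widths as bin counts grow is what turns these intervals into a distribution-free calibration guarantee in the sense described above.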
arXiv Detail & Related papers (2020-06-18T14:17:29Z) - Optimal Change-Point Detection with Training Sequences in the Large and Moderate Deviations Regimes [72.68201611113673]
This paper investigates a novel offline change-point detection problem from an information-theoretic perspective.
We assume that the underlying pre- and post-change distributions are not known and can only be learned from the available training sequences.
arXiv Detail & Related papers (2020-03-13T23:39:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.