Quickest Change Detection with Confusing Change
- URL: http://arxiv.org/abs/2405.00842v1
- Date: Wed, 1 May 2024 20:10:06 GMT
- Title: Quickest Change Detection with Confusing Change
- Authors: Yu-Zhen Janice Chen, Jinhang Zuo, Venugopal V. Veeravalli, Don Towsley
- Abstract summary: This work studies a QCD problem where the change is either a bad change, which we aim to detect, or a confusing change, which is not of interest.
We propose novel CuSum-based detection procedures, S-CuSum and J-CuSum, leveraging two CuSum statistics.
For both S-CuSum and J-CuSum, we provide analytical performance guarantees and validate them by numerical results.
- Score: 26.769246781414545
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In the problem of quickest change detection (QCD), a change occurs at some unknown time in the distribution of a sequence of independent observations. This work studies a QCD problem where the change is either a bad change, which we aim to detect, or a confusing change, which is not of interest. Our objective is to detect a bad change as quickly as possible while avoiding raising a false alarm during the pre-change period or after a confusing change. We identify a specific set of pre-change, bad change, and confusing change distributions that pose challenges beyond the capabilities of standard Cumulative Sum (CuSum) procedures. We propose novel CuSum-based detection procedures, S-CuSum and J-CuSum, which leverage two CuSum statistics and offer solutions applicable across all kinds of pre-change, bad change, and confusing change distributions. For both S-CuSum and J-CuSum, we provide analytical performance guarantees and validate them by numerical results. Furthermore, both procedures are computationally efficient as they only require simple recursive updates.
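The abstract does not specify how S-CuSum and J-CuSum combine their two statistics, so the sketch below shows only the standard recursive CuSum update that both procedures build on. The Gaussian pre- and post-change distributions are an illustrative assumption, not the paper's setting.

```python
def cusum_update(w, x, llr):
    """One recursive CuSum update: W_{n+1} = max(W_n + llr(x), 0)."""
    return max(w + llr(x), 0.0)

def llr_gauss(x, mu0=0.0, mu1=1.0, sigma=1.0):
    """Log-likelihood ratio for a hypothetical Gaussian mean shift,
    pre-change N(mu0, sigma^2) vs post-change N(mu1, sigma^2)."""
    return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)

def detect(stream, threshold):
    """Return the first time n at which the CuSum statistic crosses
    the threshold, or None if it never does."""
    w = 0.0
    for n, x in enumerate(stream, start=1):
        w = cusum_update(w, x, llr_gauss)
        if w >= threshold:
            return n
    return None
```

A two-statistic procedure in the spirit of the paper would run one such recursion against the bad-change distribution and another against the confusing-change distribution, raising an alarm only when the evidence favors the bad change; the exact combination rule is given in the paper itself.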
Related papers
- Segment Any Change [64.23961453159454]
We propose a new type of change detection model that supports zero-shot prediction and generalization on unseen change types and data distributions.
AnyChange is built on the segment anything model (SAM) via our training-free adaptation method, bitemporal latent matching.
We also propose a point query mechanism to enable AnyChange's zero-shot object-centric change detection capability.
arXiv Detail & Related papers (2024-02-02T07:17:39Z)
- Reducing sequential change detection to sequential estimation [42.460619457560334]
We describe a simple reduction from sequential change detection to sequential estimation using confidence sequences.
We prove that the average run length is at least $1/\alpha$, resulting in a change detection scheme with minimal structural assumptions.
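One way to instantiate the stated reduction, as a minimal sketch: monitor an anytime-valid confidence sequence for the running mean and declare a change when it excludes the known pre-change mean. The Hoeffding-style radius, the union bound over time, and the assumption that observations lie in [0, 1] are all illustrative choices, not the paper's exact construction; since the sequence errs with probability at most alpha under no change, this is consistent with an average run length of at least $1/\alpha$.

```python
import math

def detect_change(stream, mu0, alpha=0.05):
    """Declare a change at the first time a level-alpha confidence
    sequence for the running mean excludes the pre-change mean mu0.
    Observations are assumed to lie in [0, 1]."""
    s = 0.0
    for n, x in enumerate(stream, start=1):
        s += x
        mean = s / n
        # Hoeffding radius at per-time error alpha / (n * (n + 1)),
        # which sums to alpha over all n (a simple union bound).
        radius = math.sqrt(math.log(2 * n * (n + 1) / alpha) / (2 * n))
        if abs(mean - mu0) > radius:
            return n
    return None
```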
arXiv Detail & Related papers (2023-09-16T23:48:47Z)
- Sequential change detection via backward confidence sequences [40.79325752924537]
We present a simple reduction from sequential estimation to sequential changepoint detection.
We provide strong nonasymptotic guarantees on the frequency of false alarms and detection delay.
arXiv Detail & Related papers (2023-02-06T03:03:24Z)
- E-detectors: a nonparametric framework for sequential change detection [86.15115654324488]
We develop a fundamentally new and general framework for sequential change detection.
Our procedures come with clean, nonasymptotic bounds on the average run length.
We show how to design their mixtures in order to achieve both statistical and computational efficiency.
arXiv Detail & Related papers (2022-03-07T17:25:02Z)
- Cross-validation for change-point regression: pitfalls and solutions [0.0]
We show that the problems of cross-validation with squared error loss are more severe and can lead to systematic under- or over-estimation of the number of change-points.
We propose two simple approaches to remedy these issues, the first involving the use of absolute error rather than squared error loss.
We show these conditions are satisfied for least squares estimation using new results on its performance when supplied with the incorrect number of change-points.
arXiv Detail & Related papers (2021-12-06T18:23:12Z)
- Change Point Detection in Time Series Data using Autoencoders with a Time-Invariant Representation [69.34035527763916]
Change point detection (CPD) aims to locate abrupt property changes in time series data.
Recent CPD methods have demonstrated the potential of using deep learning techniques, but often lack the ability to identify more subtle changes in the autocorrelation statistics of the signal.
We employ an autoencoder-based methodology with a novel loss function, through which the autoencoders learn a partially time-invariant representation that is tailored for CPD.
arXiv Detail & Related papers (2020-08-21T15:03:21Z)
- Balancing Rates and Variance via Adaptive Batch-Size for Stochastic Optimization Problems [120.21685755278509]
In this work, we seek to balance the fact that an attenuating step-size is required for exact convergence with the fact that a constant step-size learns faster, up to a persistent error.
Rather than fixing the minibatch size or the step-size at the outset, we propose to allow these parameters to evolve adaptively.
arXiv Detail & Related papers (2020-07-02T16:02:02Z)
- Robust Sampling in Deep Learning [62.997667081978825]
Deep learning requires regularization mechanisms to reduce overfitting and improve generalization.
We address this problem with a new regularization method based on distributionally robust optimization.
During training, samples are selected according to their accuracy, so that the worst-performing samples contribute the most to the optimization.
arXiv Detail & Related papers (2020-06-04T09:46:52Z)
- Optimal Change-Point Detection with Training Sequences in the Large and Moderate Deviations Regimes [72.68201611113673]
This paper investigates a novel offline change-point detection problem from an information-theoretic perspective.
We assume that the underlying pre- and post-change distributions are not known and can only be learned from the available training sequences.
arXiv Detail & Related papers (2020-03-13T23:39:40Z)
- Permutation Inference for Canonical Correlation Analysis [0.7646713951724012]
We show that a simple permutation test for canonical correlations leads to inflated error rates in the presence of nuisance variables.
Even in the absence of nuisance variables, a simple permutation test for CCA leads to excess error rates for all canonical correlations other than the first.
Here we show that transforming the residuals to a lower dimensional basis where exchangeability holds results in a valid permutation test.
arXiv Detail & Related papers (2020-02-24T02:47:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.