Sequential change detection via backward confidence sequences
- URL: http://arxiv.org/abs/2302.02544v1
- Date: Mon, 6 Feb 2023 03:03:24 GMT
- Title: Sequential change detection via backward confidence sequences
- Authors: Shubhanshu Shekhar, Aaditya Ramdas
- Abstract summary: We present a simple reduction from sequential estimation to sequential changepoint detection.
We provide strong nonasymptotic guarantees on the frequency of false alarms and detection delay.
- Score: 40.79325752924537
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a simple reduction from sequential estimation to sequential
changepoint detection (SCD). In short, suppose we are interested in detecting
changepoints in some parameter or functional $\theta$ of the underlying
distribution. We demonstrate that if we can construct a confidence sequence
(CS) for $\theta$, then we can also successfully perform SCD for $\theta$. This
is accomplished by checking if two CSs -- one forwards and the other backwards
-- ever fail to intersect. Since the literature on CSs has been rapidly
evolving recently, the reduction provided in this paper immediately solves
several old and new change detection problems. Further, our "backward CS",
constructed by reversing time, is new and potentially of independent interest.
We provide strong nonasymptotic guarantees on the frequency of false alarms and
detection delay, and demonstrate numerical effectiveness on several problems.
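As a concrete illustration of the reduction, here is a minimal sketch in Python (not the authors' code): a stylized fixed-sample Hoeffding-type interval stands in for a genuine time-uniform confidence sequence, and an alarm is raised at the first time a forward interval on a prefix and a backward interval on the complementary (time-reversed) suffix fail to intersect. The function names and the interval width are illustrative assumptions.

```python
import numpy as np

ALPHA = 0.05  # target error level (illustrative)

def interval(x):
    # Stylized Hoeffding-type interval for the mean of 1-sub-Gaussian data.
    # A genuine confidence sequence needs a time-uniform (anytime-valid)
    # width; this fixed-sample width is only a stand-in for the sketch.
    w = np.sqrt(2.0 * np.log(2.0 / ALPHA) / len(x))
    m = float(np.mean(x))
    return m - w, m + w

def detect_change(stream):
    """Alarm at the first time t where, for some split s < t, the 'forward'
    interval built on x_1..x_s and the 'backward' interval built on the
    recent suffix x_{s+1}..x_t fail to intersect."""
    x = []
    for t, obs in enumerate(stream, start=1):
        x.append(obs)
        for s in range(1, t):
            lo_f, hi_f = interval(x[:s])
            lo_b, hi_b = interval(x[s:])
            if hi_f < lo_b or hi_b < lo_f:  # empty intersection -> change
                return t
    return None
```

On a stream whose mean jumps, the alarm fires shortly after the jump; the quadratic scan over splits is for clarity only and would be cached in practice.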
Related papers
- Quickest Change Detection with Confusing Change [26.769246781414545]
This work studies a QCD problem where the change is either a bad change, which we aim to detect, or a confusing change, which is not of interest.
We propose novel CuSum-based detection procedures, S-CuSum and J-CuSum, leveraging two CuSum statistics.
For both S-CuSum and J-CuSum, we provide analytical performance guarantees and validate them by numerical results.
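For context, the classic single-statistic CuSum recursion that procedures like S-CuSum and J-CuSum build on can be sketched as follows (a textbook Gaussian-shift version, not the paper's procedures; parameter names are illustrative):

```python
import numpy as np

def cusum(stream, mu0, mu1, sigma=1.0, threshold=8.0):
    """Textbook CuSum for a known Gaussian mean shift mu0 -> mu1:
    accumulate the one-step log-likelihood ratio, clip the running
    statistic below at 0, and alarm when it crosses the threshold."""
    W = 0.0
    for t, x in enumerate(stream, start=1):
        # Log-likelihood ratio of one observation: N(mu1, sigma^2) vs N(mu0, sigma^2)
        llr = ((mu1 - mu0) * x - (mu1 ** 2 - mu0 ** 2) / 2.0) / sigma ** 2
        W = max(0.0, W + llr)  # restart from 0 whenever evidence turns negative
        if W >= threshold:
            return t  # alarm time
    return None
```

Raising the threshold lengthens the average run length to false alarm at the cost of a larger detection delay.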
arXiv Detail & Related papers (2024-05-01T20:10:06Z) - Reducing sequential change detection to sequential estimation [42.460619457560334]
We describe a simple reduction from sequential change detection to sequential estimation using confidence sequences.
We prove that the average run length is at least $1/\alpha$, resulting in a change detection scheme with minimal structural assumptions.
arXiv Detail & Related papers (2023-09-16T23:48:47Z) - Discounted Thompson Sampling for Non-Stationary Bandit Problems [13.656518163592349]
Non-stationary multi-armed bandit (NS-MAB) problems have recently received significant attention.
We propose Discounted Thompson Sampling (DS-TS) with Gaussian priors to address both abruptly and smoothly changing non-stationary settings.
Our algorithm passively adapts to changes by incorporating a discounted factor into Thompson Sampling.
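A minimal sketch of the discounting idea, assuming Gaussian rewards and an (illustrative) improper flat prior; class and parameter names are not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

class DiscountedTS:
    """Thompson Sampling with exponential discounting (factor gamma):
    older rewards are down-weighted so the Gaussian posterior can track
    a drifting or abruptly changing arm mean."""
    def __init__(self, n_arms, gamma=0.95, obs_var=1.0):
        self.gamma = gamma
        self.obs_var = obs_var
        self.s = np.zeros(n_arms)  # discounted sum of rewards per arm
        self.n = np.zeros(n_arms)  # discounted pull count per arm

    def select(self):
        # Posterior mean s/n with variance obs_var/n; arms with no
        # (discounted) data get an optimistic infinite draw.
        n_safe = np.maximum(self.n, 1e-9)
        samples = np.where(
            self.n > 0,
            rng.normal(self.s / n_safe, np.sqrt(self.obs_var / n_safe)),
            np.inf,
        )
        return int(np.argmax(samples))

    def update(self, arm, reward):
        self.s *= self.gamma  # passively forget old evidence on all arms
        self.n *= self.gamma
        self.s[arm] += reward
        self.n[arm] += 1.0
```

Because the discounted count of an unplayed arm decays, its posterior widens and the sampler is eventually forced to re-explore it, which is how the passive adaptation works.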
arXiv Detail & Related papers (2023-05-18T05:29:52Z) - Sequential Gradient Descent and Quasi-Newton's Method for Change-Point
Analysis [0.348097307252416]
One common approach to detecting change-points is minimizing a cost function over possible numbers and locations of change-points.
This paper introduces a new sequential method (SE) that can be coupled with gradient descent (SeGD) and quasi-Newton's method (SeN) to compute the cost value efficiently.
arXiv Detail & Related papers (2022-10-21T20:30:26Z) - E-detectors: a nonparametric framework for sequential change detection [86.15115654324488]
We develop a fundamentally new and general framework for sequential change detection.
Our procedures come with clean, nonasymptotic bounds on the average run length.
We show how to design their mixtures in order to achieve both statistical and computational efficiency.
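A toy single-$\lambda$ e-detector for an upward mean shift in sub-Gaussian data (the paper's framework mixes over $\lambda$ and is far more general; this sketch is an assumption-laden simplification):

```python
import numpy as np

def e_detector(stream, lam=1.0, alpha=0.001):
    """Toy e-detector: keep a Shiryaev-Roberts-style sum of e-processes,
    one started at each past time. Each one-step e-value has expectation
    <= 1 under no change for 1-sub-Gaussian data, so alarming when the
    running sum crosses 1/alpha controls the average run length."""
    M = 0.0
    for t, x in enumerate(stream, start=1):
        e = np.exp(lam * x - lam ** 2 / 2.0)  # one-step e-value
        M = (M + 1.0) * e  # start a fresh e-process and update all old ones
        if M >= 1.0 / alpha:
            return t
    return None
```

The recursion keeps the sum of all restarted e-processes in a single scalar, which is what makes such detectors computationally cheap.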
arXiv Detail & Related papers (2022-03-07T17:25:02Z) - Change Point Detection in Time Series Data using Autoencoders with a
Time-Invariant Representation [69.34035527763916]
Change point detection (CPD) aims to locate abrupt property changes in time series data.
Recent CPD methods have demonstrated the potential of deep learning techniques, but often lack the ability to identify subtler changes in the autocorrelation statistics of the signal.
We employ an autoencoder-based methodology with a novel loss function, through which the autoencoders learn a partially time-invariant representation tailored for CPD.
arXiv Detail & Related papers (2020-08-21T15:03:21Z) - On the Almost Sure Convergence of Stochastic Gradient Descent in
Non-Convex Problems [75.58134963501094]
This paper analyzes the trajectories of stochastic gradient descent (SGD) in non-convex problems.
We show that SGD avoids strict saddle points/manifolds with probability $1$ under a broad family of step-size policies.
arXiv Detail & Related papers (2020-06-19T14:11:26Z) - Optimal Change-Point Detection with Training Sequences in the Large and
Moderate Deviations Regimes [72.68201611113673]
This paper investigates a novel offline change-point detection problem from an information-theoretic perspective.
We assume that the underlying pre- and post-change distributions are unknown and can only be learned from the available training sequences.
arXiv Detail & Related papers (2020-03-13T23:39:40Z)
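A toy plug-in version of this setup, assuming Gaussian data: fit the unknown pre- and post-change laws from the training sequences, then pick the split maximizing the joint log-likelihood of the two segments (illustrative only, not the paper's information-theoretic analysis):

```python
import numpy as np

def gauss_loglik(x, mu, sd):
    # Sum of N(mu, sd^2) log-densities over the observations in x.
    return -0.5 * float(np.sum(((x - mu) / sd) ** 2 + np.log(2 * np.pi * sd ** 2)))

def locate_change(x, pre_train, post_train):
    """Plug-in offline change-point estimate: fit a Gaussian to each
    training sequence (standing in for the unknown pre-/post-change laws),
    then choose the split k maximizing the joint log-likelihood of
    x_1..x_k under the pre-change fit and x_{k+1}..x_n under the post-change fit."""
    mu0, sd0 = float(pre_train.mean()), float(pre_train.std(ddof=1))
    mu1, sd1 = float(post_train.mean()), float(post_train.std(ddof=1))
    n = len(x)
    scores = [gauss_loglik(x[:k], mu0, sd0) + gauss_loglik(x[k:], mu1, sd1)
              for k in range(1, n)]
    return int(np.argmax(scores)) + 1  # estimated number of pre-change samples
```

With longer training sequences the fitted models approach the true distributions, which is the regime the paper's large- and moderate-deviations analysis studies.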
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.