Deep learning model solves change point detection for multiple change types
- URL: http://arxiv.org/abs/2204.07403v1
- Date: Fri, 15 Apr 2022 09:44:21 GMT
- Title: Deep learning model solves change point detection for multiple change types
- Authors: Alexander Stepikin, Evgenia Romanenkova, Alexey Zaytsev
- Abstract summary: Change point detection aims to catch an abrupt disorder in data distribution.
We propose an approach that works in the multiple-distributions scenario.
- Score: 69.77452691994712
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Change point detection aims to catch an abrupt disorder in data
distribution. Common approaches assume that there are only two fixed
distributions for the data: one before and another after a change point.
Real-world data are richer than this assumption allows: there can be multiple
different distributions before and after a change. We propose an approach that
works in this multiple-distributions scenario. Our approach learns
representations for semi-structured data that are suitable for change point
detection, where a common classifier-based approach fails. Moreover, our model
is more robust when predicting change points. The datasets used for
benchmarking are sequences of images with and without change points in them.
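No code accompanies this listing; as a loose sketch of the representation-based detection idea (the function names and the simulated embeddings below are hypothetical, standing in for the output of the paper's learned encoder on an image sequence), one can score each time step by the distance between the mean embeddings of the adjacent windows:

```python
import numpy as np

def change_scores(embeddings: np.ndarray, window: int = 10) -> np.ndarray:
    """Score each time step by the distance between the mean embedding
    of the preceding window and that of the following window."""
    T = len(embeddings)
    scores = np.zeros(T)
    for t in range(window, T - window):
        left = embeddings[t - window:t].mean(axis=0)
        right = embeddings[t:t + window].mean(axis=0)
        scores[t] = np.linalg.norm(left - right)  # peaks at a change point
    return scores

# Toy demo: two blocks drawn from different distributions, standing in
# for learned embeddings of images before and after a change point.
rng = np.random.default_rng(0)
emb = np.vstack([rng.normal(0, 1, (100, 32)), rng.normal(2, 1, (100, 32))])
print("detected change at t =", int(np.argmax(change_scores(emb))))
```

With multiple pre- and post-change distributions, this scoring still applies as long as the learned representation maps each distribution to a distinct region of the embedding space.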
Related papers
- Causal Discovery-Driven Change Point Detection in Time Series [32.424281626708336]
Change point detection in time series seeks to identify times when the probability distribution of a time series changes.
In practical applications, we may be interested only in certain components of the time series, exploring abrupt changes in their distributions.
arXiv Detail & Related papers (2024-07-10T00:54:42Z)
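Only the abstract is available here; as a generic illustration of checking selected components of a multivariate series for abrupt distribution changes (a sliding-window two-sample KS test, not the paper's causal-discovery machinery), a sketch might look like:

```python
import numpy as np
from scipy.stats import ks_2samp

def component_change_pvalues(x: np.ndarray, components, window: int = 50):
    """For each selected component, compare the samples before and after
    every time step with a two-sample KS test; a small p-value flags a
    likely change in that component's distribution."""
    T = x.shape[0]
    pvals = {c: np.ones(T) for c in components}
    for c in components:
        for t in range(window, T - window):
            pvals[c][t] = ks_2samp(x[t - window:t, c], x[t:t + window, c]).pvalue
    return pvals

rng = np.random.default_rng(1)
series = rng.normal(size=(400, 3))
series[200:, 0] += 3.0                       # shift only in component 0
p = component_change_pvalues(series, components=[0, 1])
print("component 0 changes near t =", int(np.argmin(p[0])))
```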
- Stylist: Style-Driven Feature Ranking for Robust Novelty Detection [8.402607231390606]
We propose a formalization that separates semantic or content changes, which are relevant to our task, from style changes, which are irrelevant.
Within this formalization, we define robust novelty detection as the task of finding semantic changes while being robust to style distributional shifts.
We show that our selection manages to remove features responsible for spurious correlations and improve novelty detection performance.
arXiv Detail & Related papers (2023-10-05T17:58:32Z)
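A hedged toy version of style-driven feature ranking follows; the per-feature KS statistic below is an assumed stand-in for Stylist's actual ranking score:

```python
import numpy as np
from scipy.stats import ks_2samp

def rank_features_by_style_shift(env_a: np.ndarray, env_b: np.ndarray) -> np.ndarray:
    """Rank feature dimensions by how much their marginal distribution
    differs between two environments; strongly shifting features are
    likely style-driven and can be dropped for robust novelty detection."""
    shifts = np.array([ks_2samp(env_a[:, j], env_b[:, j]).statistic
                       for j in range(env_a.shape[1])])
    return np.argsort(shifts)        # most stable (content-like) features first

rng = np.random.default_rng(2)
a = rng.normal(size=(500, 8))
b = rng.normal(size=(500, 8))
b[:, 3] += 2.0                               # feature 3 is "style": it shifts
order = rank_features_by_style_shift(a, b)
print("kept features:", sorted(order[:7]))
print("dropped (style) feature:", int(order[-1]))   # expected: 3
```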
- Domain Adaptive Synapse Detection with Weak Point Annotations [63.97144211520869]
We present AdaSyn, a framework for domain adaptive synapse detection with weak point annotations.
In the WASPSYN challenge at ISBI 2023, our method ranked first.
arXiv Detail & Related papers (2023-08-31T05:05:53Z)
- Change Point Detection with Conceptors [0.6526824510982799]
Offline change point detection retrospectively locates change points in a time series.
Many nonparametric methods that target i.i.d. mean and variance changes fail in the presence of nonlinear temporal dependence.
We propose using a conceptor matrix to learn the characteristic dynamics of a baseline training window with arbitrary dependence structure.
The associated echo state network acts as a featurizer of the data, and change points are identified from the nature of the interactions between the features and their relationship to the baseline state.
arXiv Detail & Related papers (2023-08-11T16:32:00Z)
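A minimal sketch of the conceptor idea, assuming the standard construction C = R (R + aperture^-2 I)^-1 over echo state network states; the deviation score here is a simplified stand-in for the paper's actual detection statistic:

```python
import numpy as np

rng = np.random.default_rng(3)
n, aperture = 50, 10.0
W_in = rng.normal(0, 0.5, n)                       # input weights
W = rng.normal(0, 1.0, (n, n))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()      # echo state property

def run_esn(u):
    """Drive the random echo state network with a 1-D input signal."""
    x, states = np.zeros(n), []
    for ut in u:
        x = np.tanh(W @ x + W_in * ut)
        states.append(x.copy())
    return np.array(states)

# Conceptor of the baseline window: C = R (R + aperture^-2 I)^-1.
X = run_esn(np.sin(0.2 * np.arange(300)))
R = X.T @ X / len(X)
C = R @ np.linalg.inv(R + np.eye(n) / aperture**2)

# Score new data by how far each state falls outside the conceptor.
test = np.concatenate([np.sin(0.2 * np.arange(200)),
                       np.sin(0.7 * np.arange(200))])  # dynamics change at t=200
Z = run_esn(test)
score = np.linalg.norm(Z - Z @ C.T, axis=1)
print("largest deviation near t =", int(np.argmax(score[100:])) + 100)
```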
- Project and Probe: Sample-Efficient Domain Adaptation by Interpolating Orthogonal Features [119.22672589020394]
We propose a lightweight, sample-efficient approach that learns a diverse set of features and adapts to a target distribution by interpolating these features.
Our experiments on four datasets, with multiple distribution shift settings for each, show that Pro² improves performance by 5-15% when given limited target data.
arXiv Detail & Related papers (2023-02-10T18:58:03Z)
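A rough illustration of the project-then-probe recipe; plain PCA stands in for Pro²'s learned diverse projection, and all of the data here is synthetic:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)

# Stand-in for frozen backbone features on the source and target domains.
X_source = rng.normal(size=(2000, 64))
y_target = rng.integers(0, 2, size=32)               # only 32 labeled target points
X_target = rng.normal(size=(32, 64)) + 0.8 * y_target[:, None]

# "Project": compress to a small orthogonal feature basis.
proj = PCA(n_components=8).fit(X_source)

# "Probe": a tiny linear head interpolates the projected features and,
# with so few parameters, can be fit from very little target data.
probe = LogisticRegression().fit(proj.transform(X_target), y_target)
print("probe train accuracy:", probe.score(proj.transform(X_target), y_target))
```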
- Predicting with Confidence on Unseen Distributions [90.68414180153897]
We connect the literature on domain adaptation and predictive uncertainty to predict model accuracy on challenging unseen distributions.
We find that the difference of confidences (DoC) of a classifier's predictions successfully estimates the classifier's performance change over a variety of shifts.
We specifically investigate the distinction between synthetic and natural distribution shifts and observe that, despite its simplicity, DoC consistently outperforms other quantifications of distributional difference.
arXiv Detail & Related papers (2021-07-07T15:50:18Z)
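The difference-of-confidences idea is simple enough to sketch directly; the estimator below subtracts the drop in mean max-softmax confidence from the source accuracy (synthetic logits and hypothetical names throughout):

```python
import numpy as np

def mean_confidence(logits: np.ndarray) -> float:
    """Average max-softmax confidence of a classifier's predictions."""
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return float(probs.max(axis=1).mean())

def doc_accuracy_estimate(acc_source: float,
                          logits_source: np.ndarray,
                          logits_shifted: np.ndarray) -> float:
    """Difference-of-confidences estimate: the drop in mean confidence
    under a shift approximates the drop in accuracy."""
    doc = mean_confidence(logits_source) - mean_confidence(logits_shifted)
    return acc_source - doc

rng = np.random.default_rng(5)
logits_src = rng.normal(0, 3, (1000, 10))      # confident in-distribution logits
logits_shift = rng.normal(0, 1, (1000, 10))    # flatter logits under shift
print("estimated shifted accuracy:",
      doc_accuracy_estimate(0.92, logits_src, logits_shift))
```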
- Principled change point detection via representation learning [0.6047855579999899]
We introduce a principled differentiable loss function that considers the specificity of the CPD task.
We propose an end-to-end method for the training of deep representation learning CPD models.
arXiv Detail & Related papers (2021-06-04T17:04:13Z)
- Detection of data drift and outliers affecting machine learning model performance over time [5.319802998033767]
Drift is distribution change between the training and deployment data.
We wish to detect these changes but cannot measure accuracy without labels for the deployment data.
We instead detect drift indirectly by nonparametrically testing the distribution of model prediction confidence for changes.
arXiv Detail & Related papers (2020-12-16T20:50:12Z)
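A minimal sketch of label-free drift detection on prediction confidences, assuming a two-sample KS test as the nonparametric check (the paper's exact test may differ):

```python
import numpy as np
from scipy.stats import ks_2samp

def confidence_drift_test(conf_train: np.ndarray,
                          conf_deploy: np.ndarray,
                          alpha: float = 0.01) -> bool:
    """Nonparametric two-sample test on the distributions of model
    prediction confidences; no deployment labels are needed."""
    stat, p = ks_2samp(conf_train, conf_deploy)
    return p < alpha                      # True -> drift detected

rng = np.random.default_rng(6)
conf_train = rng.beta(8, 2, size=2000)    # mostly confident predictions
conf_deploy = rng.beta(4, 3, size=2000)   # confidences degrade after drift
print("drift detected:", confidence_drift_test(conf_train, conf_deploy))
```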
- Meta-Learned Confidence for Few-shot Learning [60.6086305523402]
A popular transductive inference technique for few-shot metric-based approaches is to update the prototype of each class with the mean of the most confident query examples.
We propose to meta-learn the confidence of each query sample in order to assign optimal weights to unlabeled queries.
We validate our few-shot learning model with meta-learned confidence on four benchmark datasets.
arXiv Detail & Related papers (2020-02-27T10:22:17Z)
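A toy version of the confidence-weighted prototype update; the fixed distance-based softmax weighting below is a simplified stand-in for the meta-learned confidence:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def update_prototypes(prototypes: np.ndarray,
                      queries: np.ndarray,
                      temperature: float = 1.0) -> np.ndarray:
    """Transductive prototype refinement: weight each unlabeled query by
    a confidence score and fold it into the class prototype. The paper
    meta-learns these weights; here they are a fixed softmax over
    negative squared distances to the current prototypes."""
    d = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    w = softmax(-d / temperature, axis=1)            # (n_query, n_class)
    new = (w.T @ queries) / (w.sum(axis=0)[:, None] + 1e-12)
    return 0.5 * prototypes + 0.5 * new              # blend old and new

rng = np.random.default_rng(7)
protos = np.array([[0.0, 0.0], [3.0, 3.0]])
queries = np.vstack([rng.normal(0.3, 0.5, (20, 2)),
                     rng.normal(2.7, 0.5, (20, 2))])
print(update_prototypes(protos, queries))
```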
- Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data [87.61504710345528]
We propose two strategies for freeing a neural network from tuning with OoD data, while improving its OoD detection performance.
Specifically, we propose a decomposed confidence scoring function as well as a modified input pre-processing method.
Our further analysis on a larger scale image dataset shows that the two types of distribution shifts, specifically semantic shift and non-semantic shift, present a significant difference.
arXiv Detail & Related papers (2020-02-26T04:18:25Z)
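A sketch of the decomposed confidence head in the spirit of Generalized ODIN, where the logits take the form h(x) / g(x) with g a learned positive scalar; the exact layer choices below are assumptions, and the modified input pre-processing is omitted:

```python
import torch
import torch.nn as nn

class DecomposedConfidenceHead(nn.Module):
    """Decomposed confidence scoring: logits = h(x) / g(x), where h maps
    features to per-class evidence and g produces a positive scalar per
    input. This is a simplified sketch of the idea, not the paper's code."""
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.h = nn.Linear(feat_dim, num_classes)     # class evidence
        self.g = nn.Sequential(nn.Linear(feat_dim, 1),
                               nn.BatchNorm1d(1),
                               nn.Sigmoid())          # positive scaling

    def forward(self, feats):
        return self.h(feats) / self.g(feats)

head = DecomposedConfidenceHead(feat_dim=128, num_classes=10)
feats = torch.randn(4, 128)                # stand-in backbone features
logits = head(feats)
ood_score = logits.max(dim=1).values       # low max score suggests OoD input
print(ood_score)
```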
This list is automatically generated from the titles and abstracts of the papers on this site.