Usage of specific attention improves change point detection
- URL: http://arxiv.org/abs/2204.08175v1
- Date: Mon, 18 Apr 2022 06:05:50 GMT
- Title: Usage of specific attention improves change point detection
- Authors: Anna Dmitrienko, Evgenia Romanenkova, Alexey Zaytsev
- Abstract summary: We investigate different attention mechanisms for the change point detection task and propose a specific form of attention tailored to it.
We show that using this special form of attention outperforms state-of-the-art results.
- Score: 1.0723143072368782
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: The change point is a moment of an abrupt alteration in the data
distribution. Current methods for change point detection are based on recurrent
neural methods suitable for sequential data. However, recent works show that
transformers based on attention mechanisms perform better than standard
recurrent models for many tasks. The benefit is most noticeable for longer
sequences. In this paper, we investigate different attention mechanisms for the
change point detection task and propose a specific form of attention tailored to
the task at hand. We show that using this special form of attention outperforms
state-of-the-art results.
Related papers
- Localized Gaussians as Self-Attention Weights for Point Clouds Correspondence [92.07601770031236]
We investigate semantically meaningful patterns in the attention heads of an encoder-only Transformer architecture.
We find that fixing the attention weights not only accelerates the training process but also enhances the stability of the optimization.
arXiv Detail & Related papers (2024-09-20T07:41:47Z)
- Enhancing Changepoint Detection: Penalty Learning through Deep Learning Techniques [2.094821665776961]
This study introduces a novel deep learning method for predicting penalty parameters.
It leads to demonstrably improved changepoint detection accuracy on large benchmark supervised labeled datasets.
arXiv Detail & Related papers (2024-08-01T18:10:05Z)
- Sub-Adjacent Transformer: Improving Time Series Anomaly Detection with Reconstruction Error from Sub-Adjacent Neighborhoods [22.49176231245093]
We present the Sub-Adjacent Transformer with a novel attention mechanism for unsupervised time series anomaly detection.
By focusing the attention on the sub-adjacent areas, we make the reconstruction of anomalies more challenging.
The Sub-Adjacent Transformer achieves state-of-the-art performance across six real-world anomaly detection benchmarks.
arXiv Detail & Related papers (2024-04-27T08:08:17Z)
- Change points detection in crime-related time series: an on-line fuzzy approach based on a shape space representation [0.0]
We propose an on-line method for detecting and querying change points in crime-related time series.
The method is able to accurately detect change points at very low computational costs.
arXiv Detail & Related papers (2023-12-18T10:49:03Z)
- Sequential Attention Source Identification Based on Feature Representation [88.05527934953311]
This paper proposes a sequence-to-sequence based localization framework called Temporal-sequence based Graph Attention Source Identification (TGASI) based on an inductive learning idea.
Notably, the inductive learning idea ensures that TGASI can detect sources in new scenarios without requiring additional prior knowledge.
arXiv Detail & Related papers (2023-06-28T03:00:28Z)
- Invariant Causal Mechanisms through Distribution Matching [86.07327840293894]
In this work we provide a causal perspective and a new algorithm for learning invariant representations.
Empirically we show that this algorithm works well on a diverse set of tasks and in particular we observe state-of-the-art performance on domain generalization.
arXiv Detail & Related papers (2022-06-23T12:06:54Z)
- Deep learning model solves change point detection for multiple change types [69.77452691994712]
Change point detection aims to catch an abrupt disorder in the data distribution.
We propose an approach that works in the multiple-distributions scenario.
arXiv Detail & Related papers (2022-04-15T09:44:21Z)
- Learning Sinkhorn divergences for supervised change point detection [24.30834981766022]
We present a novel change point detection framework that uses true change point instances as supervision for learning a ground metric.
Our method can be used to learn a sparse metric which can be useful for both feature selection and interpretation.
arXiv Detail & Related papers (2022-02-08T17:11:40Z)
- Self-Attention Neural Bag-of-Features [103.70855797025689]
We build on the recently introduced 2D-Attention and reformulate the attention learning methodology.
We propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information.
arXiv Detail & Related papers (2022-01-26T17:54:14Z)
- WATCH: Wasserstein Change Point Detection for High-Dimensional Time Series Data [4.228718402877829]
Change point detection methods have the ability to discover changes in an unsupervised fashion.
We propose WATCH, a novel Wasserstein distance-based change point detection approach.
An extensive evaluation shows that WATCH is capable of accurately identifying change points and outperforming state-of-the-art methods.
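WATCH itself is multivariate and considerably more elaborate; as a loose one-dimensional sketch of the underlying idea (window size and threshold below are illustrative assumptions, not WATCH's parameters), the Wasserstein-1 distance between two adjacent sliding windows can be thresholded to flag a change:

```python
import numpy as np

def w1(a, b):
    """Wasserstein-1 distance between two equal-size 1-D empirical samples,
    computed via the sorted-sample formula."""
    return np.abs(np.sort(a) - np.sort(b)).mean()

def wasserstein_cpd(x, window=20, threshold=1.0):
    """Flag indices t where the last `window` points differ in distribution
    from the preceding `window` points by more than `threshold`."""
    flags = []
    for t in range(2 * window, len(x)):
        past = x[t - 2 * window:t - window]
        recent = x[t - window:t]
        if w1(past, recent) > threshold:
            flags.append(t)
    return flags

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
flags = wasserstein_cpd(x)
print(flags[0])  # first flagged index, shortly after the change at t = 100
```

Once enough post-change points enter the recent window, the distance between the two windows exceeds the threshold and the change is flagged.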
arXiv Detail & Related papers (2022-01-18T16:55:29Z)
- Deep learning approaches to Earth Observation change detection [0.0]
We present two approaches to change detection (semantic segmentation and classification) that both exploit convolutional neural networks to achieve good results.
arXiv Detail & Related papers (2021-07-13T14:34:59Z)
- Change Point Detection in Time Series Data using Autoencoders with a Time-Invariant Representation [69.34035527763916]
Change point detection (CPD) aims to locate abrupt property changes in time series data.
Recent CPD methods demonstrated the potential of using deep learning techniques, but often lack the ability to identify more subtle changes in the autocorrelation statistics of the signal.
We employ an autoencoder-based methodology with a novel loss function, through which the used autoencoders learn a partially time-invariant representation that is tailored for CPD.
arXiv Detail & Related papers (2020-08-21T15:03:21Z)
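The paper's autoencoder is nonlinear and trained with a custom partially time-invariant loss; as a much simpler stand-in, a linear "autoencoder" (a PCA projection of windows, which is the optimum of a linear autoencoder) already illustrates the idea of flagging a change where consecutive window representations diverge (window size and number of components are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
# toy series with a mean shift at t = 200
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(3, 1, 200)])

# slice into non-overlapping windows (one window per row)
window = 20
W = x.reshape(-1, window)

# linear "autoencoder": project centered windows onto top-2 principal components
Wc = W - W.mean(axis=0)
_, _, Vt = np.linalg.svd(Wc, full_matrices=False)
Z = Wc @ Vt[:2].T

# dissimilarity between consecutive window encodings peaks at the regime change
d = np.linalg.norm(np.diff(Z, axis=0), axis=1)
change_window = int(d.argmax()) + 1  # index of the first post-change window
print(change_window * window)        # estimated change location in samples
```

Encodings of windows drawn from the same regime stay close, so the largest jump between consecutive encodings marks the boundary between regimes.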
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information (including all generated summaries) and is not responsible for any consequences of its use.