Deep Attentive Time Warping
- URL: http://arxiv.org/abs/2309.06720v1
- Date: Wed, 13 Sep 2023 04:49:49 GMT
- Title: Deep Attentive Time Warping
- Authors: Shinnosuke Matsuo, Xiaomeng Wu, Gantugs Atarsaikhan, Akisato Kimura,
Kunio Kashino, Brian Kenji Iwana, Seiichi Uchida
- Abstract summary: We propose a neural network model for task-adaptive time warping.
We use the attention model, called the bipartite attention model, to develop an explicit time warping mechanism.
Unlike other learnable models using DTW for warping, our model predicts all local correspondences between two time series.
- Score: 22.411355064531143
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Similarity measures for time series are a core component of time
series classification. To handle nonlinear time distortions, Dynamic Time Warping
(DTW) has been widely used. However, DTW is not learnable and suffers from a
trade-off between robustness against time distortion and discriminative power.
In this paper, we propose a neural network model for task-adaptive time
warping. Specifically, we use the attention model, called the bipartite
attention model, to develop an explicit time warping mechanism with greater
distortion invariance. Unlike other learnable models using DTW for warping, our
model predicts all local correspondences between two time series and is trained
based on metric learning, which enables it to learn the optimal data-dependent
warping for the target task. We also propose to induce pre-training of our
model by DTW to improve the discriminative power. Extensive experiments
demonstrate the superior effectiveness of our model over DTW and its
state-of-the-art performance in online signature verification.
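The trade-off the abstract describes starts from classic DTW, which aligns two sequences with a dynamic-programming recurrence over all local correspondences. A minimal pure-Python sketch (the function name and the absolute-difference cost are illustrative choices, not taken from the paper):

```python
def dtw_distance(x, y):
    """Classic (non-learnable) DTW between two 1-D sequences.

    Recurrence: D[i][j] = cost(x[i-1], y[j-1])
                          + min(D[i-1][j], D[i][j-1], D[i-1][j-1]).
    """
    n, m = len(x), len(y)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])  # illustrative local cost
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

a = [0.0, 1.0, 2.0, 1.0, 0.0]
b = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]  # same shape, delayed by one step
print(dtw_distance(a, a))  # 0.0
print(dtw_distance(a, b))  # 0.0 -- DTW absorbs the time shift
```

The second call illustrates the distortion invariance the abstract refers to: DTW absorbs the shift entirely, which is also why it can erase discriminative timing differences; the proposed model instead learns task-dependent correspondences via metric learning.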
Related papers
- TimeDART: A Diffusion Autoregressive Transformer for Self-Supervised Time Series Representation [47.58016750718323]
We propose TimeDART, a novel self-supervised time series pre-training framework.
TimeDART unifies two powerful generative paradigms to learn more transferable representations.
We conduct extensive experiments on public datasets for time series forecasting and classification.
arXiv Detail & Related papers (2024-10-08T06:08:33Z)
- Adaptive Training Meets Progressive Scaling: Elevating Efficiency in Diffusion Models [52.1809084559048]
We propose a novel two-stage divide-and-conquer training strategy termed TDC Training.
It groups timesteps based on task similarity and difficulty, assigning highly customized denoising models to each group, thereby enhancing the performance of diffusion models.
While the two-stage design avoids training each model separately, the total training cost is even lower than training a single unified denoising model.
arXiv Detail & Related papers (2023-12-20T03:32:58Z)
- Approximating DTW with a convolutional neural network on EEG data [9.409281517596396]
We propose a fast and differentiable approximation of Dynamic Time Warping (DTW).
We show that our method achieves at least the same level of accuracy as the main DTW approximations, with higher computational efficiency.
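The work above uses a CNN for its approximation; a simpler, widely known way to make the DTW recurrence differentiable is to replace the hard minimum with a smooth soft-min, as in soft-DTW (Cuturi and Blondel, 2017). A minimal sketch for contrast, not the cited paper's method (the squared-error cost and function names are illustrative):

```python
import math

def soft_min(vals, gamma):
    # Smooth minimum: -gamma * log(sum(exp(-v / gamma))), computed stably
    # by shifting by the true minimum; differentiable in its inputs.
    m = min(vals)
    return m - gamma * math.log(sum(math.exp(-(v - m) / gamma) for v in vals))

def soft_dtw(x, y, gamma=0.1):
    """Soft-DTW: the classic recurrence with min replaced by soft_min.

    As gamma -> 0 this recovers ordinary DTW; larger gamma gives a
    smoother (and more easily trainable) objective.
    """
    n, m = len(x), len(y)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2  # illustrative local cost
            D[i][j] = cost + soft_min(
                [D[i - 1][j], D[i][j - 1], D[i - 1][j - 1]], gamma
            )
    return D[n][m]
```

With a small `gamma`, `soft_dtw` on identical sequences is close to zero, while clearly different sequences still receive a large dissimilarity.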
arXiv Detail & Related papers (2023-01-30T13:27:47Z)
- Gait Recognition in the Wild with Multi-hop Temporal Switch [81.35245014397759]
Gait recognition in the wild is a more practical problem that has attracted the attention of the multimedia and computer vision communities.
This paper presents a novel multi-hop temporal switch method to achieve effective temporal modeling of gait patterns in real-world scenes.
arXiv Detail & Related papers (2022-09-01T10:46:09Z)
- Robust Time Series Dissimilarity Measure for Outlier Detection and Periodicity Detection [16.223509730658513]
We propose a novel time series dissimilarity measure named RobustDTW to reduce the effects of noises and outliers.
Specifically, RobustDTW estimates the trend and optimizes the time warp in an alternating manner using our temporal graph trend filtering.
Experiments on real-world datasets demonstrate the superior performance of RobustDTW compared to DTW variants in both outlier time series detection and periodicity detection.
arXiv Detail & Related papers (2022-06-07T00:49:16Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Voice2Series: Reprogramming Acoustic Models for Time Series Classification [65.94154001167608]
Voice2Series is a novel end-to-end approach that reprograms acoustic models for time series classification.
We show that V2S either outperforms or is tied with state-of-the-art methods on 20 tasks, and improves their average accuracy by 1.84%.
arXiv Detail & Related papers (2021-06-17T07:59:15Z)
- Attention to Warp: Deep Metric Learning for Multivariate Time Series [28.540348999309547]
This paper proposes a novel neural network-based approach for robust yet discriminative time series classification and verification.
We experimentally demonstrate the superiority of the proposed approach over previous non-parametric and deep models.
arXiv Detail & Related papers (2021-03-28T07:54:01Z)
- Learning Discriminative Prototypes with Dynamic Time Warping [49.03785686097989]
We propose Discriminative Prototype DTW (DP-DTW), a novel method to learn class-specific discriminative prototypes for temporal recognition tasks.
DP-DTW shows superior performance compared to conventional DTWs on time series classification benchmarks.
arXiv Detail & Related papers (2021-03-17T06:11:11Z)
- A Case-Study on the Impact of Dynamic Time Warping in Time Series Regression [2.639737913330821]
We show that Dynamic Time Warping (DTW) is effective in improving accuracy on a regression task when only a single wavelength is considered.
When combined with k-Nearest Neighbour, DTW has the added advantage that it can reveal similarities and differences between samples at the level of the time-series.
However, in the problem we consider here, data are available across a spectrum of wavelengths.
arXiv Detail & Related papers (2020-10-11T15:21:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.