Dynamic Time Warping based Adversarial Framework for Time-Series Domain
- URL: http://arxiv.org/abs/2207.04308v1
- Date: Sat, 9 Jul 2022 17:23:00 GMT
- Title: Dynamic Time Warping based Adversarial Framework for Time-Series Domain
- Authors: Taha Belkhouja, Yan Yan, Janardhan Rao Doppa
- Abstract summary: We propose a novel framework for the time-series domain referred to as Dynamic Time Warping for Adversarial Robustness (DTW-AR).
We develop a principled algorithm justified by theoretical analysis to efficiently create diverse adversarial examples using random alignment paths.
Experiments on diverse real-world benchmarks show the effectiveness of DTW-AR to fool DNNs for time-series data and to improve their robustness using adversarial training.
- Score: 32.45387153404849
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite the rapid progress on research in adversarial robustness of deep
neural networks (DNNs), there is little principled work for the time-series
domain. Since time-series data arises in diverse applications including mobile
health, finance, and smart grid, it is important to verify and improve the
robustness of DNNs for the time-series domain. In this paper, we propose a
novel framework for the time-series domain, referred to as Dynamic Time
Warping for Adversarial Robustness (DTW-AR), using the dynamic time warping
measure. Theoretical and empirical evidence is provided to demonstrate the
effectiveness of DTW over the standard Euclidean distance metric employed in
prior methods for the image domain. We develop a principled algorithm justified
by theoretical analysis to efficiently create diverse adversarial examples
using random alignment paths. Experiments on diverse real-world benchmarks show
the effectiveness of DTW-AR to fool DNNs for time-series data and to improve
their robustness using adversarial training. The source code of DTW-AR
algorithms is available at https://github.com/tahabelkhouja/DTW-AR
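For intuition, here is a minimal, self-contained sketch (not the authors' DTW-AR implementation) of the two ingredients the abstract mentions: the standard dynamic-programming DTW measure, and a randomly sampled warping path whose cost upper-bounds the DTW distance between a clean series and a perturbed one. The toy series `x`, the noise level, and all helper names are illustrative assumptions.

```python
import numpy as np

def dtw_distance(x, y):
    """Standard O(n*m) dynamic-programming DTW between two 1-D series."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

def random_alignment_path(n, m, rng):
    """Sample a monotone warping path from (0, 0) to (n-1, m-1)."""
    path, i, j = [(0, 0)], 0, 0
    while (i, j) != (n - 1, m - 1):
        moves = []
        if i < n - 1 and j < m - 1:
            moves.append((i + 1, j + 1))   # diagonal step
        if i < n - 1:
            moves.append((i + 1, j))       # advance in x only
        if j < m - 1:
            moves.append((i, j + 1))       # advance in y only
        i, j = moves[rng.integers(len(moves))]
        path.append((i, j))
    return path

def path_cost(x, y, path):
    """Cost along one fixed path; DTW is the minimum over all such paths,
    so any fixed path's cost upper-bounds the DTW distance."""
    return np.sqrt(sum((x[i] - y[j]) ** 2 for i, j in path))

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 4 * np.pi, 50))        # toy clean series
x_adv = x + 0.1 * rng.standard_normal(50)        # hypothetical perturbed series
path = random_alignment_path(len(x), len(x_adv), rng)
print(dtw_distance(x, x_adv), path_cost(x, x_adv, path))
```

Because DTW minimizes over all such monotone paths, keeping the cost along any fixed random path below a budget also keeps the DTW distance below that budget, which is the intuition behind generating diverse adversarial examples from random alignment paths.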
Related papers
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- OTW: Optimal Transport Warping for Time Series [75.69837166816501]
Dynamic Time Warping (DTW) has become the pragmatic choice for measuring distance between time series.
It suffers from unavoidable quadratic time complexity when the optimal alignment matrix needs to be computed exactly.
We introduce a new metric for time series data based on the Optimal Transport framework, called Optimal Transport Warping (OTW)
arXiv Detail & Related papers (2023-06-01T12:45:00Z)
- Deep Declarative Dynamic Time Warping for End-to-End Learning of Alignment Paths [54.53208538517505]
This paper addresses learning end-to-end models for time series data that include a temporal alignment step via dynamic time warping (DTW)
We propose a DTW layer based around bi-level optimisation and deep declarative networks, which we name DecDTW.
We show that this property is particularly useful for applications where downstream loss functions are defined on the optimal alignment path itself.
arXiv Detail & Related papers (2023-03-19T21:58:37Z)
- Approximating DTW with a convolutional neural network on EEG data [9.409281517596396]
We propose a fast and differentiable approximation of Dynamic Time Warping (DTW).
We show that our methods achieve at least the same level of accuracy as other main DTW approximations, with higher computational efficiency.
arXiv Detail & Related papers (2023-01-30T13:27:47Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
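As a rough illustration of the INR idea in this entry (not the HyperTime hypernetwork itself), the sketch below fits a small coordinate MLP that maps a time stamp to a value; the architecture, activation, and training settings are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

# Toy target series x(t) sampled on a regular grid of time coordinates in [0, 1].
t = torch.linspace(0, 1, 200).unsqueeze(1)
x = torch.sin(2 * math.pi * 5 * t) + 0.05 * torch.randn_like(t)

# Implicit neural representation: a small MLP mapping a time coordinate to a value.
# (The activation choice matters in practice; the paper compares several options.)
inr = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)

opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(inr(t), x)
    loss.backward()
    opt.step()

# Once fitted, the representation can be queried at arbitrary time points,
# which is what makes the encoding resolution-independent.
x_fine = inr(torch.linspace(0, 1, 1000).unsqueeze(1))
```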
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Training Robust Deep Models for Time-Series Domain: Novel Algorithms and Theoretical Analysis [32.45387153404849]
We propose a novel framework referred to as RObust Training for Time-Series (RO-TS) to create robust DNNs for time-series classification tasks.
We show the generality and advantages of our formulation using the summation structure over time-series alignments.
Our experiments on real-world benchmarks demonstrate that RO-TS creates more robust DNNs when compared to adversarial training.
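For context on the adversarial-training baseline this entry compares against, here is a generic one-step (FGSM-style) adversarial training step. It is a standard sketch, not the RO-TS formulation (which uses a summation structure over time-series alignments); the model, optimizer, and epsilon are placeholders.

```python
import torch
import torch.nn as nn

def fgsm_example(model, x, y, eps, loss_fn):
    """One-step L-infinity adversarial example (FGSM); a generic stand-in
    for an attack, used only to illustrate the adversarial-training loop."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss_fn(model(x_adv), y).backward()
    return (x_adv + eps * x_adv.grad.sign()).detach()

def adversarial_training_step(model, opt, x, y, eps=0.1):
    """Train on adversarial versions of the batch (placeholder model/optimizer)."""
    loss_fn = nn.CrossEntropyLoss()
    x_adv = fgsm_example(model, x, y, eps, loss_fn)
    opt.zero_grad()
    loss = loss_fn(model(x_adv), y)
    loss.backward()
    opt.step()
    return loss.item()
```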
arXiv Detail & Related papers (2022-07-09T17:21:03Z)
- Robust Time Series Dissimilarity Measure for Outlier Detection and Periodicity Detection [16.223509730658513]
We propose a novel time series dissimilarity measure named RobustDTW to reduce the effects of noises and outliers.
Specifically, RobustDTW estimates the trend and optimizes the time warp in an alternating manner, using our designed temporal graph trend filtering.
Experiments on real-world datasets demonstrate the superior performance of RobustDTW compared to DTW variants in both outlier time series detection and periodicity detection.
arXiv Detail & Related papers (2022-06-07T00:49:16Z)
- Averaging Spatio-temporal Signals using Optimal Transport and Soft Alignments [110.79706180350507]
We show that our proposed loss can be used to define spatio-temporal barycenters as Fréchet means.
Experiments on handwritten letters and brain imaging data confirm our theoretical findings.
arXiv Detail & Related papers (2022-03-11T09:46:22Z)
- Matrix Profile XXII: Exact Discovery of Time Series Motifs under DTW [1.282368486390644]
We present the first scalable exact method to discover time series motifs under Dynamic Time Warping.
Our algorithm can admissibly prune up to 99.99% of the DTW computations.
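Admissible pruning of DTW computations is typically done with a cheap lower bound. The sketch below shows the well-known LB_Keogh bound for band-constrained DTW as an illustration (not necessarily the exact bound used in this paper), assuming equal-length series and a Sakoe-Chiba band of radius `r`.

```python
import numpy as np

def lb_keogh(q, c, r):
    """LB_Keogh lower bound on band-constrained DTW(q, c), band radius r.

    The bound never exceeds the true (band-constrained) DTW distance, so a
    candidate whose bound already exceeds the best-so-far distance can be
    skipped without computing the full quadratic-time DTW (admissible pruning).
    """
    n = len(q)
    total = 0.0
    for i in range(n):
        lo, hi = max(0, i - r), min(n, i + r + 1)
        upper, lower = c[lo:hi].max(), c[lo:hi].min()
        if q[i] > upper:
            total += (q[i] - upper) ** 2
        elif q[i] < lower:
            total += (q[i] - lower) ** 2
    return np.sqrt(total)
```

A search loop would skip the full DTW computation whenever `lb_keogh(q, c, r)` already exceeds the best distance found so far.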
arXiv Detail & Related papers (2020-09-16T19:35:43Z)
- SRDCNN: Strongly Regularized Deep Convolution Neural Network Architecture for Time-series Sensor Signal Classification Tasks [4.950427992960756]
We present SRDCNN: a Strongly Regularized Deep Convolution Neural Network (DCNN) based architecture for time-series classification tasks.
The novelty of the proposed approach is that the network weights are regularized by both L1 and L2 norm penalties.
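The regularization idea in this entry is easy to state in code: the sketch below adds elastic-net style L1 and L2 weight penalties to a classification loss. The coefficients and loss function are illustrative assumptions, not the SRDCNN settings.

```python
import torch.nn as nn

def regularized_loss(model, logits, targets, l1=1e-5, l2=1e-4):
    """Cross-entropy plus L1 and L2 penalties on all network parameters
    (elastic-net style; the coefficients here are illustrative)."""
    ce = nn.functional.cross_entropy(logits, targets)
    l1_pen = sum(p.abs().sum() for p in model.parameters())
    l2_pen = sum(p.pow(2).sum() for p in model.parameters())
    return ce + l1 * l1_pen + l2 * l2_pen
```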
arXiv Detail & Related papers (2020-07-14T08:42:39Z) - GraN: An Efficient Gradient-Norm Based Detector for Adversarial and
Misclassified Examples [77.99182201815763]
Deep neural networks (DNNs) are vulnerable to adversarial examples and other data perturbations.
GraN is a time- and parameter-efficient method that is easily adaptable to any DNN.
GraN achieves state-of-the-art performance on numerous problem set-ups.
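As a rough sketch of gradient-norm based detection (a simplification, not the exact GraN procedure), one can score a test batch by the norm of the loss gradient taken at the model's own prediction and flag unusually large scores; all names below are placeholders.

```python
import torch
import torch.nn as nn

def grad_norm_score(model, x):
    """Score a batch by the norm of the loss gradient at the model's own prediction.

    Unusually large scores can flag adversarial or misclassified inputs; the
    decision threshold would be calibrated on held-out clean data.
    """
    model.zero_grad()
    logits = model(x)
    loss = nn.functional.cross_entropy(logits, logits.argmax(dim=1))
    loss.backward()
    sq = sum(p.grad.pow(2).sum() for p in model.parameters() if p.grad is not None)
    return torch.sqrt(sq)
```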
arXiv Detail & Related papers (2020-04-20T10:09:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.