Classification of multivariate weakly-labelled time-series with
attention
- URL: http://arxiv.org/abs/2102.08245v1
- Date: Tue, 16 Feb 2021 16:05:38 GMT
- Title: Classification of multivariate weakly-labelled time-series with
attention
- Authors: Surayez Rahman, Chang Wei Tan
- Abstract summary: Weakly labelled time-series are time-series containing noise and significant redundancies.
This paper proposes an approach that exploits the context relevance of subsequences to improve classification accuracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This research identifies a gap in weakly-labelled multivariate time-series
classification (TSC), where state-of-the-art TSC models do not perform well.
Weakly labelled time-series are time-series containing noise and significant
redundancies. In response to this gap, this paper proposes an approach that
exploits the relevance of a subsequence's context, drawn from the preceding
subsequences, to improve classification accuracy. To achieve this,
state-of-the-art Attention algorithms are evaluated in combination with the top
CNN models for TSC (FCN and ResNet), in a CNN-LSTM architecture. Attention is a
popular strategy for
context extraction with exceptional performance in modern sequence-to-sequence
tasks. This paper shows how attention algorithms can be used for improved
weakly labelled TSC by evaluating models on a multivariate EEG time-series
dataset obtained using commercial Emotiv headsets from participants
performing various activities while driving. These time-series are segmented
into sub-sequences and labelled to allow supervised TSC.
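The abstract only names the building blocks, so here is a minimal, hedged sketch of the kind of pipeline it describes: an FCN-style convolutional encoder applied to each subsequence, an LSTM over the resulting subsequence embeddings, and a simple additive attention over the preceding context before classification. The layer sizes, the attention variant, the class count, and the 14-channel/256-sample EEG shapes are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch only: an FCN-style per-subsequence encoder feeding an LSTM with
# additive attention over preceding subsequences. All names, sizes, and the
# attention variant are assumptions, not the paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FCNEncoder(nn.Module):
    """FCN-style encoder: three Conv1d blocks + global average pooling over time."""

    def __init__(self, in_channels: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=8, padding="same"),
            nn.BatchNorm1d(hidden), nn.ReLU(),
            nn.Conv1d(hidden, hidden * 2, kernel_size=5, padding="same"),
            nn.BatchNorm1d(hidden * 2), nn.ReLU(),
            nn.Conv1d(hidden * 2, hidden, kernel_size=3, padding="same"),
            nn.BatchNorm1d(hidden), nn.ReLU(),
        )

    def forward(self, x):                  # x: (batch, channels, time)
        return self.net(x).mean(dim=-1)    # (batch, hidden)


class AttnCnnLstmClassifier(nn.Module):
    """Encodes a run of subsequences, attends over their context, classifies the last one."""

    def __init__(self, in_channels: int, n_classes: int, hidden: int = 128):
        super().__init__()
        self.encoder = FCNEncoder(in_channels, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)             # simple additive scoring
        self.head = nn.Linear(hidden * 2, n_classes)

    def forward(self, subseqs):                      # (batch, n_subseq, channels, time)
        b, n, c, t = subseqs.shape
        feats = self.encoder(subseqs.reshape(b * n, c, t)).reshape(b, n, -1)
        states, _ = self.lstm(feats)                 # (batch, n_subseq, hidden)
        scores = self.attn(states).squeeze(-1)       # (batch, n_subseq)
        weights = F.softmax(scores, dim=-1)
        context = (weights.unsqueeze(-1) * states).sum(dim=1)   # attention-weighted context
        return self.head(torch.cat([context, states[:, -1]], dim=-1))


# Toy usage: batches of 5 consecutive EEG subsequences, 14 channels, 256 samples each.
model = AttnCnnLstmClassifier(in_channels=14, n_classes=4)
logits = model(torch.randn(2, 5, 14, 256))
print(logits.shape)  # torch.Size([2, 4])
```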
Related papers
- ECRTime: Ensemble Integration of Classification and Retrieval for Time Series Classification [6.058649579669944]
Experimental results on 112 UCR datasets demonstrate that ECR is state-of-the-art (SOTA) compared to existing deep learning-based methods.
ECRTime surpasses the currently most accurate deep learning classifier, InceptionTime, in terms of accuracy.
arXiv Detail & Related papers (2024-07-20T03:17:23Z) - TSI-Bench: Benchmarking Time Series Imputation [52.27004336123575]
TSI-Bench is a comprehensive benchmark suite for time series imputation utilizing deep learning techniques.
The TSI-Bench pipeline standardizes experimental settings to enable fair evaluation of imputation algorithms.
TSI-Bench innovatively provides a systematic paradigm to tailor time series forecasting algorithms for imputation purposes.
arXiv Detail & Related papers (2024-06-18T16:07:33Z) - Concrete Dense Network for Long-Sequence Time Series Clustering [4.307648859471193]
Time series clustering is fundamental in data analysis for discovering temporal patterns.
Deep temporal clustering methods have sought to integrate the canonical k-means objective into the end-to-end training of neural networks.
LoSTer is a novel dense autoencoder architecture for the long-sequence time series clustering problem.
arXiv Detail & Related papers (2024-05-08T12:31:35Z) - A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series, based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - FormerTime: Hierarchical Multi-Scale Representations for Multivariate
Time Series Classification [53.55504611255664]
FormerTime is a hierarchical representation model for improving the classification capacity for the multivariate time series classification task.
It exhibits three merits: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism.
arXiv Detail & Related papers (2023-02-20T07:46:14Z) - Enhancing Multivariate Time Series Classifiers through Self-Attention
and Relative Positioning Infusion [4.18804572788063]
Time Series Classification (TSC) is an important and challenging task for many visual computing applications.
We propose two novel attention blocks that can enhance deep learning-based TSC approaches.
We show that adding the proposed attention blocks improves base models' average accuracy by up to 3.6%.
arXiv Detail & Related papers (2023-02-13T20:50:34Z) - DCSF: Deep Convolutional Set Functions for Classification of
Asynchronous Time Series [5.339109578928972]
An asynchronous time series is a time series in which the channels are observed asynchronously and independently of one another.
This paper proposes a novel framework, that is highly scalable and memory efficient, for the asynchronous time series classification task.
We explore convolutional neural networks, which are well researched for the closely related problem of classifying regularly sampled and fully observed time series.
arXiv Detail & Related papers (2022-08-24T08:47:36Z) - Towards Similarity-Aware Time-Series Classification [51.2400839966489]
We study time-series classification (TSC), a fundamental task of time-series data mining.
We propose Similarity-Aware Time-Series Classification (SimTSC), a framework that models similarity information with graph neural networks (GNNs).
arXiv Detail & Related papers (2022-01-05T02:14:57Z) - Learnable Dynamic Temporal Pooling for Time Series Classification [22.931314501371805]
We present a dynamic temporal pooling (DTP) technique that reduces the temporal size of hidden representations by aggregating the features at the segment-level.
For the partition of a whole series into multiple segments, we utilize dynamic time warping (DTW) to align each time point in a temporal order with the prototypical features of the segments.
The DTP layer combined with a fully-connected layer helps to extract further discriminative features considering their temporal position within an input time series.
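As a rough illustration of the segment-level pooling idea only (not the paper's learnable DTP layer), the sketch below assigns each time step to one of k ordered segment prototypes with a monotonic DTW-style alignment and then average-pools the features within each segment; prototype learning and the trainable classifier on top are omitted, and the function name, shapes, and cost function are assumptions.

```python
# Hedged numpy sketch of DTW-style segment assignment + segment-level average pooling.
import numpy as np


def dtp_pool(hidden: np.ndarray, prototypes: np.ndarray) -> np.ndarray:
    """hidden: (T, d) per-time-step features; prototypes: (k, d) ordered segment
    prototypes; returns (k, d) segment-pooled features. Assumes T >= k."""
    T, _ = hidden.shape
    k = prototypes.shape[0]
    # Squared-distance cost between every time step and every prototype.
    dist = ((hidden[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (T, k)
    # DP table: cost[t, j] = best cost of an alignment whose step t maps to segment j,
    # with segment indices non-decreasing over time (so every segment gets >= 1 step).
    cost = np.full((T, k), np.inf)
    cost[0, 0] = dist[0, 0]
    for t in range(1, T):
        for j in range(min(t + 1, k)):
            stay = cost[t - 1, j]
            advance = cost[t - 1, j - 1] if j > 0 else np.inf
            cost[t, j] = dist[t, j] + min(stay, advance)
    # Backtrack from (T-1, k-1) to recover each time step's segment assignment.
    assign = np.zeros(T, dtype=int)
    j = k - 1
    assign[T - 1] = j
    for t in range(T - 2, -1, -1):
        if j > 0 and cost[t, j - 1] <= cost[t, j]:
            j -= 1
        assign[t] = j
    # Average-pool the hidden features that fall into each segment.
    return np.stack([hidden[assign == s].mean(axis=0) for s in range(k)])


# Toy usage: 100 time steps with 16-dim features pooled into 4 segments.
pooled = dtp_pool(np.random.randn(100, 16), np.random.randn(4, 16))
print(pooled.shape)  # (4, 16)
```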
arXiv Detail & Related papers (2021-04-02T08:58:44Z) - Benchmarking Multivariate Time Series Classification Algorithms [69.12151492736524]
Time Series Classification (TSC) involves building predictive models for a discrete target variable from ordered, real-valued attributes.
Over recent years, a new set of TSC algorithms have been developed which have made significant improvement over the previous state of the art.
We review recently proposed bespoke MTSC algorithms based on deep learning, shapelets and bag of words approaches.
arXiv Detail & Related papers (2020-07-26T15:56:40Z)