Ego-Network Transformer for Subsequence Classification in Time Series Data
- URL: http://arxiv.org/abs/2311.02561v1
- Date: Sun, 5 Nov 2023 04:21:42 GMT
- Title: Ego-Network Transformer for Subsequence Classification in Time Series Data
- Authors: Chin-Chia Michael Yeh, Huiyuan Chen, Yujie Fan, Xin Dai, Yan Zheng,
Vivian Lai, Junpeng Wang, Zhongfang Zhuang, Liang Wang, Wei Zhang, Eamonn
Keogh
- Abstract summary: Real-world time series data often contain foreground subsequences intertwined with background subsequences.
We propose a novel subsequence classification method that represents each subsequence as an ego-network.
Our method outperforms the baseline on 104 out of 158 datasets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series classification is a widely studied problem in the field of time
series data mining. Previous research has predominantly focused on scenarios
where relevant or foreground subsequences have already been extracted, with
each subsequence corresponding to a single label. However, real-world time
series data often contain foreground subsequences that are intertwined with
background subsequences. Successfully classifying these relevant subsequences
requires not only distinguishing between different classes but also accurately
identifying the foreground subsequences amidst the background. To address this
challenge, we propose a novel subsequence classification method that represents
each subsequence as an ego-network, providing crucial nearest neighbor
information to the model. The ego-networks of all subsequences collectively
form a time series subsequence graph, and we introduce an algorithm to
efficiently construct this graph. Furthermore, we have demonstrated the
significance of enforcing temporal consistency in the prediction of adjacent
subsequences for the subsequence classification problem. To evaluate the
effectiveness of our approach, we conducted experiments using 128 univariate
and 30 multivariate time series datasets. The experimental results demonstrate
the superior performance of our method compared to alternative approaches.
Specifically, our method outperforms the baseline on 104 out of 158 datasets.
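The core data structure described in the abstract, an ego-network per subsequence whose neighbors supply nearest-neighbor information, can be illustrated with a brute-force sketch. This is not the paper's algorithm (the authors introduce an efficient graph-construction method); the window size, the choice of k, and the trivial-match exclusion rule below are illustrative assumptions:

```python
import numpy as np

def ego_network_graph(series, window, k=3):
    """Sketch: build a k-nearest-neighbor ego-network for every
    sliding-window subsequence of a univariate series."""
    n = len(series) - window + 1
    # Extract and z-normalize all subsequences.
    subs = np.array([series[i:i + window] for i in range(n)], dtype=float)
    subs = (subs - subs.mean(axis=1, keepdims=True)) \
        / (subs.std(axis=1, keepdims=True) + 1e-8)
    # Pairwise Euclidean distances between normalized subsequences
    # (O(n^2) time and memory; fine for a sketch, not for long series).
    dist = np.linalg.norm(subs[:, None, :] - subs[None, :, :], axis=2)
    # Exclude trivial matches: a subsequence and windows overlapping it.
    for i in range(n):
        lo, hi = max(0, i - window + 1), min(n, i + window)
        dist[i, lo:hi] = np.inf
    # Each row's k smallest distances define that subsequence's ego-network;
    # the union of all ego-networks forms the subsequence graph.
    return {i: list(np.argsort(dist[i])[:k]) for i in range(n)}
```

Feeding each subsequence together with its ego-network neighbors to a model is what lets the classifier see nearest-neighbor context when separating foreground from background subsequences.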
Related papers
- Does It Look Sequential? An Analysis of Datasets for Evaluation of Sequential Recommendations
  Sequential recommender systems aim to use the order of interactions in a user's history to predict future interactions.
  To evaluate sequential recommenders properly, it is crucial to use datasets that exhibit a sequential structure.
  We apply several methods based on randomly shuffling each user's sequence of interactions to assess the strength of sequential structure across 15 datasets.
  arXiv Detail & Related papers (2024-08-21T21:40:07Z)
- Temporally Grounding Instructional Diagrams in Unconstrained Videos
  We study the challenging problem of simultaneously localizing a sequence of queries from instructional diagrams in a video.
  Most existing methods focus on grounding one query at a time, ignoring the inherent structure among queries.
  We propose composite queries constructed by exhaustively pairing up the visual content features of the step diagrams.
  We demonstrate the effectiveness of our approach on the IAW dataset for grounding step diagrams and on the YouCook2 benchmark for grounding natural language queries.
  arXiv Detail & Related papers (2024-07-16T05:44:30Z)
- Capturing Temporal Components for Time Series Classification
  This work introduces a compositional representation learning approach trained on statistically coherent components extracted from sequential data.
  Based on a multi-scale change space, an unsupervised approach is proposed to segment the sequential data into chunks with similar statistical properties.
  A sequence-based encoder model is trained in a multi-task setting to learn compositional representations from these temporal components for time series classification.
  arXiv Detail & Related papers (2024-06-20T16:15:21Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling
  Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
  This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
  arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Seq-HyGAN: Sequence Classification via Hypergraph Attention Network
  Sequence classification has a wide range of real-world applications in different domains, such as genome classification in health and anomaly detection in business.
  The lack of explicit features in sequence data makes it difficult for machine learning models to learn effective representations.
  We propose a novel Hypergraph Attention Network model, namely Seq-HyGAN.
  arXiv Detail & Related papers (2023-03-04T11:53:33Z)
- Uniform Sequence Better: Time Interval Aware Data Augmentation for Sequential Recommendation
  Sequential recommendation is an important task that predicts the next item to access based on a sequence of items.
  Most existing works learn user preference as the transition pattern from the previous item to the next one, ignoring the time interval between the two items.
  We propose to augment sequence data from the perspective of time intervals, which has not been studied in the literature.
  arXiv Detail & Related papers (2022-12-16T03:13:43Z)
- COSTI: a New Classifier for Sequences of Temporal Intervals
  We develop a novel method for classification that operates directly on sequences of temporal intervals.
  The proposed method maintains a high level of accuracy and obtains better performance while avoiding the shortcomings of operating on transformed data.
  arXiv Detail & Related papers (2022-04-28T12:55:06Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting
  We propose a three-stage framework for forecasting high-dimensional time-series data.
  Our framework is highly general, allowing any time-series forecasting and clustering method to be used in each step.
  When instantiated with simple linear autoregressive models, we achieve state-of-the-art results on several benchmark datasets.
  arXiv Detail & Related papers (2021-10-26T20:41:19Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach
  Time series data are ubiquitous in domains such as climate, economics, and health care.
  A recent conceptual approach relies on mapping time series to complex networks, whose analysis can be used to characterize different types of time series.
  arXiv Detail & Related papers (2021-10-11T13:46:28Z)
- Parallel Attention Network with Sequence Matching for Video Grounding
  Given a video, video grounding aims to retrieve a temporal moment that semantically corresponds to a language query.
  We propose a Parallel Attention Network with Sequence matching (SeqPAN) to address the challenges of this task.
  arXiv Detail & Related papers (2021-05-18T12:43:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.