Quantized symbolic time series approximation
- URL: http://arxiv.org/abs/2411.15209v1
- Date: Wed, 20 Nov 2024 10:32:22 GMT
- Title: Quantized symbolic time series approximation
- Authors: Erin Carson, Xinye Chen, Cheng Kang
- Abstract summary: We present a new quantization-based ABBA symbolic approximation technique, QABBA.
QABBA exhibits improved storage efficiency while retaining the original speed and accuracy of symbolic reconstruction.
An application of QABBA with large language models (LLMs) for time series regression is also presented.
- Abstract: Time series are ubiquitous in numerous science and engineering domains, e.g., signal processing, bioinformatics, and astronomy. Previous work has verified the efficacy of symbolic time series representation in a variety of engineering applications due to its storage efficiency and numerosity reduction. The most recent symbolic aggregate approximation technique, ABBA, has been shown to preserve essential shape information of time series and improve downstream applications, e.g., neural network inference for prediction and anomaly detection in time series. Motivated by the emergence of high-performance hardware that enables efficient computation for low bit-width representations, we present a new quantization-based ABBA symbolic approximation technique, QABBA, which exhibits improved storage efficiency while retaining the original speed and accuracy of symbolic reconstruction. We prove an upper bound on the error arising from quantization and discuss how the number of bits should be chosen to balance this error against others. An application of QABBA with large language models (LLMs) for time series regression is also presented, and its utility is investigated. By representing the symbolic chain of patterns in a time series, QABBA not only avoids training embeddings from scratch, but also achieves a new state of the art on the Monash regression dataset. The symbolic approximation thus offers a more efficient way to fine-tune LLMs on time series regression tasks spanning various application domains. We further present extensive experiments across well-established datasets to demonstrate the advantages of QABBA for symbolic approximation.
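The abstract does not spell out the quantization scheme, so the following is only a minimal sketch of the general idea: uniformly quantizing ABBA-style segment parameters (e.g., increments) to a fixed bit width. The function names, the uniform quantizer, and the toy data are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def quantize_increments(increments, bits=8):
    """Uniformly quantize segment increments to `bits`-bit integer codes
    over the observed range (illustrative, not the paper's exact scheme)."""
    lo, hi = float(increments.min()), float(increments.max())
    levels = 2**bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = np.round((increments - lo) / scale).astype(np.uint32)
    return codes, lo, scale

def dequantize(codes, lo, scale):
    """Map integer codes back to approximate increment values."""
    return lo + codes.astype(np.float64) * scale

# Toy example: increments as produced by a polygonal chain approximation.
inc = np.random.default_rng(0).normal(size=200)
codes, lo, scale = quantize_increments(inc, bits=6)
recon = dequantize(codes, lo, scale)
# For a uniform quantizer the worst-case error is half a step, scale / 2,
# which mirrors the kind of quantization error bound the paper analyzes.
print(np.abs(inc - recon).max(), scale / 2)
```

Increasing `bits` halves the quantization step per extra bit at the cost of storage, which is the trade-off the abstract refers to when discussing how the number of bits should be chosen.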
Related papers
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present Moirai, a Masked Encoder-based Universal Time Series Forecasting Transformer.
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- State Sequences Prediction via Fourier Transform for Representation Learning [111.82376793413746]
We propose State Sequences Prediction via Fourier Transform (SPF), a novel method for learning expressive representations efficiently.
We theoretically analyze the existence of structural information in state sequences, which is closely related to policy performance and signal regularity.
Experiments demonstrate that the proposed method outperforms several state-of-the-art algorithms in terms of both sample efficiency and performance.
arXiv Detail & Related papers (2023-10-24T14:47:02Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- An efficient aggregation method for the symbolic representation of temporal data [0.0]
We present a new variant of the adaptive Brownian bridge-based aggregation (ABBA) method, called fABBA.
This variant utilizes a new aggregation approach tailored to the piecewise representation of time series.
In contrast to the original method, the new approach does not require the number of time series symbols to be specified in advance.
arXiv Detail & Related papers (2022-01-14T22:51:24Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains, such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
- MrSQM: Fast Time Series Classification with Symbolic Representations [11.853438514668207]
MrSQM uses multiple symbolic representations and efficient sequence mining to extract important time series features.
We study four feature selection approaches on symbolic sequences, ranging from fully supervised to unsupervised and hybrid methods.
Our experiments on 112 datasets of the UEA/UCR benchmark demonstrate that MrSQM can quickly extract useful features.
arXiv Detail & Related papers (2021-09-02T15:54:46Z)
- ABBA: Adaptive Brownian bridge-based symbolic aggregation of time series [0.0]
A new symbolic representation of time series, called ABBA, is introduced.
It is based on an adaptive polygonal chain approximation of the time series into a sequence of tuples (a minimal sketch of this compression step appears after this list).
We show that the reconstruction error of this representation can be modelled as a random walk with pinned start and end points.
arXiv Detail & Related papers (2020-03-27T15:30:32Z)
- Time Series Forecasting Using LSTM Networks: A Symbolic Approach [0.0]
A combination of a recurrent neural network with a dimension-reducing symbolic representation is proposed and applied to time series forecasting.
It is shown that the symbolic representation can help to alleviate some of the problems of forecasting on raw-valued series and, in addition, might allow for faster training without sacrificing forecast performance.
arXiv Detail & Related papers (2020-03-12T09:18:22Z)
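The ABBA entry above refers to a polygonal chain approximation; here is a minimal self-contained sketch of that compression step under simplifying assumptions (a greedy maximum-deviation criterion rather than the papers' exact tolerance test; all names are illustrative). The same compression step underlies the fABBA variant listed above.

```python
import numpy as np

def polygonal_compress(ts, tol=0.5):
    """Greedy polygonal chain compression in the spirit of ABBA (a sketch):
    extend each linear piece while the maximum deviation of the series
    from the straight line stays within tol; emit (length, increment)."""
    pieces, start, n = [], 0, len(ts)
    while start < n - 1:
        end = start + 1
        while end + 1 < n:
            cand = end + 1
            # Straight line from ts[start] to ts[cand] over the segment.
            line = np.linspace(ts[start], ts[cand], cand - start + 1)
            if np.max(np.abs(ts[start:cand + 1] - line)) > tol:
                break
            end = cand
        pieces.append((end - start, ts[end] - ts[start]))
        start = end
    return pieces

def reconstruct(pieces, first):
    """Stitch the linear pieces back together (exact at piece endpoints)."""
    out = [first]
    for length, inc in pieces:
        seg = np.linspace(out[-1], out[-1] + inc, length + 1)[1:]
        out.extend(seg.tolist())
    return np.array(out)

ts = np.cumsum(np.random.default_rng(1).normal(size=300))  # toy random walk
pieces = polygonal_compress(ts, tol=0.8)
approx = reconstruct(pieces, ts[0])
print(len(pieces), np.max(np.abs(ts - approx)))  # far fewer pieces; error <= tol
```

ABBA itself follows this compression with a clustering of the tuples into symbols; QABBA additionally quantizes the tuple values, as sketched after the abstract above.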