Tripletformer for Probabilistic Interpolation of Irregularly Sampled Time Series
- URL: http://arxiv.org/abs/2210.02091v2
- Date: Fri, 12 Jan 2024 13:43:18 GMT
- Title: Tripletformer for Probabilistic Interpolation of Irregularly Sampled Time Series
- Authors: Vijaya Krishna Yalavarthi, Johannes Burchert, Lars Schmidt-Thieme
- Abstract summary: We present a novel encoder-decoder architecture called "Tripletformer" for probabilistic interpolation of irregularly sampled time series with missing values.
This attention-based model operates on sets of observations, where each element is composed of a triple of time, channel, and value.
Results indicate an improvement in negative log-likelihood error by up to 32% on real-world datasets and 85% on synthetic datasets.
- Score: 6.579888565581481
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Irregularly sampled time series data with missing values is observed in many
fields like healthcare, astronomy, and climate science. Interpolation of these
types of time series is crucial for tasks such as root cause analysis and
medical diagnosis, as well as for smoothing out irregular or noisy data. To
address this challenge, we present a novel encoder-decoder architecture called
"Tripletformer" for probabilistic interpolation of irregularly sampled time
series with missing values. This attention-based model operates on sets of
observations, where each element is composed of a triple of time, channel, and
value. The encoder and decoder of the Tripletformer are designed with attention
layers and fully connected layers, enabling the model to effectively process
the presented set elements. We evaluate the Tripletformer against a range of
baselines on multiple real-world and synthetic datasets and show that it
produces more accurate and certain interpolations. Results indicate an
improvement in negative log-likelihood error by up to 32% on real-world datasets
and 85% on synthetic datasets when using the Tripletformer compared to the next
best model.
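As a concrete illustration of the set-of-triples representation the abstract describes, the sketch below flattens an irregularly sampled multivariate series into (time, channel, value) observation triples and builds the (time, channel) queries for which a decoder would predict the missing values. All names are hypothetical; this is an assumption-laden sketch, not the authors' implementation.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Observation:
    """One element of the observation set: a (time, channel, value) triple."""
    time: float
    channel: int
    value: float


def to_triplet_set(times, values):
    """Flatten an irregularly sampled multivariate series into a set of
    observation triples. `values[i][c]` is the reading of channel c at
    times[i], or None if that channel was not observed at that time."""
    triples = []
    for t, row in zip(times, values):
        for c, v in enumerate(row):
            if v is not None:
                triples.append(Observation(t, c, v))
    return triples


def interpolation_queries(target_times, channels):
    """Build (time, channel) query pairs at which the decoder would
    predict a distribution over the unobserved value."""
    return [(t, c) for t in target_times for c in channels]
```

Because observations are set elements rather than rows of a dense grid, missing values simply never appear in the set, which is what lets an attention-based encoder consume the data without imputation-style padding.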
Related papers
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- TS-Diffusion: Generating Highly Complex Time Series with Diffusion Models [12.646560434352478]
We consider a class of time series with three common bad properties, including sampling irregularities, missingness, and large feature-temporal dimensions.
We introduce a general model, TS-Diffusion, to process such complex time series.
We have conducted extensive experiments on multiple time-series datasets, demonstrating that TS-Diffusion achieves excellent results on both conventional and complex time series.
arXiv Detail & Related papers (2023-11-06T17:52:08Z)
- Compatible Transformer for Irregularly Sampled Multivariate Time Series [75.79309862085303]
We propose a transformer-based encoder to achieve comprehensive temporal-interaction feature learning for each individual sample.
We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods.
arXiv Detail & Related papers (2023-10-17T06:29:09Z)
- AnomalyBERT: Self-Supervised Transformer for Time Series Anomaly Detection using Data Degradation Scheme [0.7216399430290167]
Anomaly detection task for time series, especially for unlabeled data, has been a challenging problem.
We address it by applying a suitable data degradation scheme to self-supervised model training.
Inspired by the self-attention mechanism, we design a Transformer-based architecture to recognize the temporal context.
arXiv Detail & Related papers (2023-05-08T05:42:24Z)
- TSI-GAN: Unsupervised Time Series Anomaly Detection using Convolutional Cycle-Consistent Generative Adversarial Networks [2.4469484645516837]
Anomaly detection is widely used in network intrusion detection, autonomous driving, medical diagnosis, credit card frauds, etc.
This paper proposes TSI-GAN, an unsupervised anomaly detection model for time-series that can learn complex temporal patterns automatically.
We evaluate TSI-GAN using 250 well-curated and harder-than-usual datasets and compare with 8 state-of-the-art baseline methods.
arXiv Detail & Related papers (2023-03-22T23:24:47Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded in a short time period, which results in a big gap between the deep model and the limited and noisy time series.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- DynImp: Dynamic Imputation for Wearable Sensing Data Through Sensory and Temporal Relatedness [78.98998551326812]
We argue that traditional methods have rarely made use of both the time-series dynamics of the data and the relatedness of features from different sensors.
We propose a model, termed as DynImp, to handle different time point's missingness with nearest neighbors along feature axis.
We show that the method can exploit the multi-modality features from related sensors and also learn from history time-series dynamics to reconstruct the data under extreme missingness.
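A minimal sketch of nearest-neighbor imputation along the feature axis, in the spirit of what the DynImp summary describes. The names and distance choice are hypothetical, and the actual model also learns from historical temporal dynamics, which this sketch omits.

```python
def knn_impute(row, neighbor_rows, k=2):
    """Fill None entries of `row` with the mean of the k nearest neighbor
    rows, where distance is the mean squared difference over features
    that are observed in both rows."""
    def dist(a, b):
        shared = [(x, y) for x, y in zip(a, b)
                  if x is not None and y is not None]
        if not shared:
            return float("inf")  # no overlap: treat as maximally distant
        return sum((x - y) ** 2 for x, y in shared) / len(shared)

    ranked = sorted(neighbor_rows, key=lambda r: dist(row, r))[:k]
    filled = list(row)
    for i, v in enumerate(filled):
        if v is None:
            vals = [r[i] for r in ranked if r[i] is not None]
            filled[i] = sum(vals) / len(vals) if vals else None
    return filled
```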
arXiv Detail & Related papers (2022-09-26T21:59:14Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Imputing Missing Observations with Time Sliced Synthetic Minority Oversampling Technique [0.3973560285628012]
We present a simple yet novel time series imputation technique with the goal of constructing an irregular time series that is uniform across every sample in a data set.
We fix a grid defined by the midpoints of non-overlapping bins (dubbed "slices") of observation times and ensure that each sample has values for all of the features at that given time.
This allows one to both impute fully missing observations to allow uniform time series classification across the entire data and, in special cases, to impute individually missing features.
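The slicing scheme described above can be sketched as follows, assuming equal-width bins over a fixed time window; the paper may define the bins differently, and the names are hypothetical.

```python
def slice_midpoints(t_min, t_max, n_slices):
    """Midpoints of n_slices non-overlapping, equal-width bins ("slices")
    spanning [t_min, t_max]; the midpoints form the common time grid."""
    width = (t_max - t_min) / n_slices
    return [t_min + width * (i + 0.5) for i in range(n_slices)]


def assign_to_slice(t, t_min, t_max, n_slices):
    """Index of the slice whose bin contains observation time t
    (the right edge of the window falls into the last slice)."""
    width = (t_max - t_min) / n_slices
    return min(int((t - t_min) / width), n_slices - 1)
```

Once every observation is assigned to a slice, each sample can be given one value per feature per grid point, producing the uniform series that standard time-series classifiers expect.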
arXiv Detail & Related papers (2022-01-14T19:23:24Z)
- Stacking VAE with Graph Neural Networks for Effective and Interpretable Time Series Anomaly Detection [5.935707085640394]
We propose a stacking variational auto-encoder (VAE) model with graph neural networks for the effective and interpretable time-series anomaly detection.
We show that our proposed model outperforms the strong baselines on three public datasets with considerable improvements.
arXiv Detail & Related papers (2021-05-18T09:50:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.