DCSF: Deep Convolutional Set Functions for Classification of
Asynchronous Time Series
- URL: http://arxiv.org/abs/2208.11374v1
- Date: Wed, 24 Aug 2022 08:47:36 GMT
- Title: DCSF: Deep Convolutional Set Functions for Classification of
Asynchronous Time Series
- Authors: Vijaya Krishna Yalavarthi, Johannes Burchert, Lars Schmidt-Thieme
- Abstract summary: An asynchronous time series is a time series whose channels are observed asynchronously and independently.
This paper proposes a novel framework for the asynchronous time series classification task that is highly scalable and memory efficient.
We explore convolutional neural networks, which are well researched for the closely related problem of classifying regularly sampled and fully observed time series.
- Score: 5.339109578928972
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: An asynchronous time series (AsTS) is a multivariate time series whose
channels are observed asynchronously and independently, making the series
extremely sparse once the channels are aligned in time. We often observe this
effect in applications with complex observation processes, such as health care,
climate science, and astronomy, to name a few. Because of this asynchronous
nature, such series pose a significant challenge to deep learning architectures,
which presume that the time series presented to them are regularly sampled,
fully observed, and aligned with respect to time. This paper proposes a novel
framework, which we call Deep Convolutional Set Functions (DCSF), that is highly
scalable and memory efficient for the asynchronous time series classification
task. Building on recent advances in deep set learning architectures, we
introduce a model that is invariant to the order in which the channels of a
time series are presented to it. We explore convolutional neural networks,
which are well researched for the closely related problem of classifying
regularly sampled and fully observed time series, for encoding the set
elements. We evaluate DCSF on AsTS classification and on online (per time
point) AsTS classification. Our extensive experiments on multiple real-world
and synthetic datasets verify that the proposed model performs substantially
better than a range of state-of-the-art models in terms of both accuracy and
run time.
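To make the set-function view concrete, the sketch below treats each channel of an asynchronous time series as one set element: a shared 1D CNN encodes every padded channel from its observation times, values, and an observation mask, and a permutation-invariant pooling over the channel embeddings keeps the classifier independent of the order in which channels are presented. This is a minimal illustrative sketch in PyTorch, not the authors' exact DCSF architecture; the (time, value, mask) channel encoding, the layer sizes, and the mean pooling are assumptions made for illustration.

```python
# Minimal sketch of a set-function classifier for asynchronous time series.
# Assumption (not from the paper): each channel is padded to max_len observations
# and represented by three rows -- observation time, observed value, observation mask.
import torch
import torch.nn as nn


class ChannelEncoder(nn.Module):
    """Shared 1D CNN applied to one channel of shape (batch, 3, max_len)."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(3, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(x).max(dim=-1).values  # global max pool over observations


class SetClassifier(nn.Module):
    """Encodes every channel with the shared CNN, then pools permutation-invariantly."""

    def __init__(self, hidden: int = 64, num_classes: int = 2):
        super().__init__()
        self.encoder = ChannelEncoder(hidden)
        self.head = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, num_classes))

    def forward(self, channels: torch.Tensor) -> torch.Tensor:
        # channels: (batch, num_channels, 3, max_len); channels are padded
        # independently, so they may carry different numbers of real observations.
        b, c, f, t = channels.shape
        emb = self.encoder(channels.reshape(b * c, f, t)).reshape(b, c, -1)
        pooled = emb.mean(dim=1)  # mean over channels: order-invariant by construction
        return self.head(pooled)


# Toy usage: 8 series, 5 asynchronously observed channels, padded to 32 observations.
logits = SetClassifier(num_classes=3)(torch.randn(8, 5, 3, 32))
print(logits.shape)  # torch.Size([8, 3])
```

For the online (per time point) setting mentioned above, the same model could simply be re-applied to the growing prefix of observations available at each time point; in practice a channel-identity feature would also be added so that different channels remain distinguishable after pooling.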
Related papers
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, which uses a single input/output projection layer while delegating the modeling of diverse time series patterns to a sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z)
- TSCMamba: Mamba Meets Multi-View Learning for Time Series Classification [13.110156202816112]
We propose a novel multi-view approach integrating frequency-domain and time-domain features to provide complementary contexts for time series classification.
Our method fuses continuous wavelet transform spectral features with temporal convolutional or multilayer perceptron features.
Experiments on 10 standard benchmark datasets demonstrate our approach achieves an average 6.45% accuracy improvement over state-of-the-art TSC models.
arXiv Detail & Related papers (2024-06-06T18:05:10Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Learning Gaussian Mixture Representations for Tensor Time Series Forecasting [8.31607451942671]
We develop a novel TTS forecasting framework, which seeks to individually model each heterogeneity component implied in the time, the location, and the source variables.
Experiment results on two real-world TTS datasets verify the superiority of our approach compared with the state-of-the-art baselines.
arXiv Detail & Related papers (2023-06-01T06:50:47Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains, such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
- Learnable Dynamic Temporal Pooling for Time Series Classification [22.931314501371805]
We present a dynamic temporal pooling (DTP) technique that reduces the temporal size of hidden representations by aggregating the features at the segment-level.
For the partition of a whole series into multiple segments, we utilize dynamic time warping (DTW) to align each time point in a temporal order with the prototypical features of the segments.
The DTP layer combined with a fully-connected layer helps to extract further discriminative features considering their temporal position within an input time series.
arXiv Detail & Related papers (2021-04-02T08:58:44Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point processes models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- Multi-Faceted Representation Learning with Hybrid Architecture for Time Series Classification [16.64345034889185]
We propose a hybrid neural architecture called Self-Attentive Recurrent Convolutional Networks (SARCoN).
SARCoN is the synthesis of long short-term memory networks with self-attentive mechanisms and Fully Convolutional Networks.
Our work provides a novel angle that deepens the understanding of time series classification, qualifying our proposed model as an ideal choice for real-world applications.
arXiv Detail & Related papers (2020-12-21T16:42:07Z)
- A Deep Structural Model for Analyzing Correlated Multivariate Time Series [11.009809732645888]
We present a deep learning structural time series model which can handle correlated multivariate time series input.
The model explicitly learns/extracts the trend, seasonality, and event components.
We compare our model with several state-of-the-art methods through a comprehensive set of experiments on a variety of time series data sets.
arXiv Detail & Related papers (2020-01-02T18:48:29Z)