High-dimensional Multivariate Time Series Forecasting in IoT
Applications using Embedding Non-stationary Fuzzy Time Series
- URL: http://arxiv.org/abs/2107.09785v1
- Date: Tue, 20 Jul 2021 22:00:43 GMT
- Title: High-dimensional Multivariate Time Series Forecasting in IoT
Applications using Embedding Non-stationary Fuzzy Time Series
- Authors: Hugo Vinicius Bitencourt and Frederico Gadelha Guimarães
- Abstract summary: Fuzzy Time Series (FTS) models stand out as data-driven non-parametric models that are easy to implement and highly accurate.
We present a new approach to handle high-dimensional non-stationary time series by projecting the original high-dimensional data into a low-dimensional embedding space.
Our model explains 98% of the variance and achieves an RMSE of 11.52%, an MAE of 2.68%, and a MAPE of 2.91%.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In the Internet of Things (IoT), data is continuously recorded from
different sources, and devices can suffer faults in their embedded electronics,
leading to high-dimensional data sets and concept drift events. Methods capable
of handling high-dimensional non-stationary time series are therefore of great
value in IoT applications. Fuzzy Time Series (FTS) models stand out as
data-driven non-parametric models that are easy to implement and highly
accurate. Unfortunately, FTS encounters difficulties when dealing with data
sets of many variables and scenarios with concept drift. We present a new
approach to handle high-dimensional non-stationary time series by projecting
the original high-dimensional data into a low-dimensional embedding space and
applying an FTS approach in that space. Combining these techniques enables a
better representation of the complex content of non-stationary multivariate
time series and accurate forecasts. Our model explains 98% of the variance and
achieves an RMSE of 11.52%, an MAE of 2.68%, and a MAPE of 2.91%.
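The pipeline the abstract describes, embedding the high-dimensional series into a low-dimensional space and then applying an FTS model, can be sketched roughly as below. This is an illustrative toy only, not the authors' implementation: it uses PCA (via SVD) as the embedding and a simple first-order Chen-style fuzzy time series for forecasting, and every variable name is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30)).cumsum(axis=0)  # toy high-dimensional series

# 1) Embed into a low-dimensional space (first principal component via SVD).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
z = Xc @ Vt[0]

# 2) Partition the universe of discourse into k fuzzy sets (intervals).
k = 7
edges = np.linspace(z.min(), z.max(), k + 1)
mids = (edges[:-1] + edges[1:]) / 2
labels = np.clip(np.digitize(z, edges) - 1, 0, k - 1)

# 3) Learn first-order rules A_i -> {A_j} from consecutive fuzzy labels.
rules = {}
for a, b in zip(labels[:-1], labels[1:]):
    rules.setdefault(a, set()).add(b)

# 4) Forecast the next value as the mean midpoint of the consequent sets.
def forecast(label):
    consequents = rules.get(label, {label})
    return float(np.mean([mids[j] for j in consequents]))

pred = forecast(labels[-1])
```

The point of the reduction step is that the FTS model only ever sees one (or a few) embedded components, so the rule base stays small regardless of how many sensors the original IoT series contains.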
Related papers
- TimeSieve: Extracting Temporal Dynamics through Information Bottlenecks [31.10683149519954]
We propose an innovative time series forecasting model TimeSieve.
Our approach employs wavelet transforms to preprocess time series data, effectively capturing multi-scale features.
Our results validate the effectiveness of our approach in addressing the key challenges in time series forecasting.
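The multi-scale wavelet preprocessing the TimeSieve summary mentions can be illustrated with a minimal Haar decomposition; this sketch is not TimeSieve's code, just a plain NumPy example of splitting a series into coarse approximation and per-scale detail coefficients.

```python
import numpy as np

def haar_step(x):
    """One Haar decomposition level: pairwise averages and differences."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (high-pass)
    return a, d

def haar_decompose(x, levels):
    """Return the final coarse approximation and details at each scale."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(d)
    return a, details

x = np.sin(np.linspace(0, 8 * np.pi, 64))
approx, details = haar_decompose(x, levels=3)
```

Because the Haar transform is orthonormal, the total energy of the signal is preserved across the approximation and detail bands, which is what makes per-scale features a faithful representation of the original series.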
arXiv Detail & Related papers (2024-06-07T15:58:12Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a masked encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- EdgeConvFormer: Dynamic Graph CNN and Transformer based Anomaly Detection in Multivariate Time Series [7.514010315664322]
We propose a novel anomaly detection method, named EdgeConvFormer, which integrates stacked Time2vec embedding, dynamic graph CNN, and Transformer to extract global and local spatial-time information.
Experiments demonstrate that EdgeConvFormer can learn spatial-temporal patterns from multivariate time series data and achieve better anomaly detection performance than state-of-the-art approaches on many real-world datasets of different scales.
arXiv Detail & Related papers (2023-12-04T08:38:54Z)
- Multi-scale Transformer Pyramid Networks for Multivariate Time Series Forecasting [8.739572744117634]
We introduce a dimension invariant embedding technique that captures short-term temporal dependencies.
We present a novel Multi-scale Transformer Pyramid Network (MTPNet) specifically designed to capture temporal dependencies at multiple unconstrained scales.
arXiv Detail & Related papers (2023-08-23T06:40:05Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Combining Embeddings and Fuzzy Time Series for High-Dimensional Time Series Forecasting in Internet of Energy Applications [0.0]
Fuzzy Time Series (FTS) models stand out as data-driven non-parametric models of easy implementation and high accuracy.
We present a new methodology for handling high-dimensional time series by projecting the original high-dimensional data into a low-dimensional embedding space.
arXiv Detail & Related papers (2021-12-03T19:50:09Z)
- Dynamic Network-Assisted D2D-Aided Coded Distributed Learning [59.29409589861241]
We propose a novel device-to-device (D2D)-aided coded federated learning method (D2D-CFL) for load balancing across devices.
We derive an optimal compression rate for achieving minimum processing time and establish its connection with the convergence time.
Our proposed method is beneficial for real-time collaborative applications, where the users continuously generate training data.
arXiv Detail & Related papers (2021-11-26T18:44:59Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while utilizing substantially less trainable parameters when compared to comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
- Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.