Evaluating Short-Term Forecasting of Multiple Time Series in IoT
Environments
- URL: http://arxiv.org/abs/2206.07784v1
- Date: Wed, 15 Jun 2022 19:46:59 GMT
- Title: Evaluating Short-Term Forecasting of Multiple Time Series in IoT
Environments
- Authors: Christos Tzagkarakis, Pavlos Charalampidis, Stylianos Roubakis,
Alexandros Fragkiadakis, Sotiris Ioannidis
- Abstract summary: Internet of Things (IoT) environments are monitored via a large number of IoT-enabled sensing devices.
To alleviate resource constraints, sensors are often configured to operate at relatively low sampling frequencies.
This can dramatically hamper subsequent decision-making, such as forecasting.
- Score: 67.24598072875744
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern Internet of Things (IoT) environments are monitored via a large number
of IoT-enabled sensing devices, with the data acquisition and processing
infrastructure imposing restrictions in terms of computational power and energy
resources. To alleviate this issue, sensors are often configured to operate at
relatively low sampling frequencies, yielding a reduced set of observations.
Nevertheless, this can dramatically hamper subsequent decision-making, such as
forecasting. To address this problem, in this work we evaluate short-term
forecasting in highly underdetermined cases, i.e., where the number of sensor streams
is much higher than the number of observations. Several statistical, machine
learning, and neural network-based models are thoroughly examined with respect
to the resulting forecasting accuracy on five different real-world datasets.
The focus is on a unified experimental protocol especially designed for
short-term prediction of multiple time series at the IoT edge. The proposed
framework can be considered an important step towards establishing a solid
forecasting strategy in resource-constrained IoT applications.
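As a rough illustration of this evaluation setting, the sketch below compares two simple forecasters (naive last-value and simple exponential smoothing) across many short sensor streams. The data, models, and metric are hypothetical stand-ins, not the paper's actual protocol or datasets:

```python
import random
import statistics

def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing; returns the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def naive_forecast(series):
    """Use the last observed value as the forecast."""
    return series[-1]

def evaluate(streams, horizon=1):
    """Hold out the last `horizon` points of each stream; report MAE per model."""
    errors = {"naive": [], "ses": []}
    for s in streams:
        train, test = s[:-horizon], s[-horizon:]
        for name, fn in (("naive", naive_forecast), ("ses", ses_forecast)):
            pred = fn(train)
            errors[name].extend(abs(pred - y) for y in test)
    return {name: statistics.mean(errs) for name, errs in errors.items()}

# Underdetermined setting: many streams, few observations each.
random.seed(0)
streams = [
    [10 + 0.2 * t + random.gauss(0, 0.5) for t in range(20)]  # 20 obs/stream
    for _ in range(200)  # 200 sensor streams
]
print(evaluate(streams))
```

Applying the same hold-out split and error metric uniformly across all streams is the essence of a unified protocol; real baselines (ARIMA, gradient boosting, neural models) would slot into the same `evaluate` loop.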
Related papers
- SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z)
- Spatially-resolved hyperlocal weather prediction and anomaly detection using IoT sensor networks and machine learning techniques [0.0]
We propose a novel approach that combines hyperlocal weather prediction and anomaly detection using IoT sensor networks and machine learning techniques.
Our system is able to enhance the spatial resolution of predictions and effectively detect anomalies in real-time.
Our findings indicate that this system has the potential to enhance decision-making.
arXiv Detail & Related papers (2023-10-17T05:04:53Z)
- Multi-Dimensional Self Attention based Approach for Remaining Useful Life Estimation [0.17205106391379021]
Remaining Useful Life (RUL) estimation plays a critical role in Prognostics and Health Management (PHM).
This paper investigates a remaining useful life prediction model for multi-sensor devices in the IIoT scenario.
A data-driven approach for RUL estimation is proposed in this paper.
arXiv Detail & Related papers (2022-12-12T08:50:27Z)
- Inference Latency Prediction at the Edge [0.3974789827371669]
State-of-the-art neural architectures (NAs) are typically designed through Neural Architecture Search (NAS) to identify NAs with good tradeoffs between accuracy and efficiency.
Since measuring the latency of a huge set of candidate architectures during NAS is not scalable, approaches are needed for predicting end-to-end inference latency on mobile devices.
We propose a latency prediction framework which addresses these challenges by developing operation-wise latency predictors.
arXiv Detail & Related papers (2022-10-06T00:46:06Z) - Time-to-Green predictions for fully-actuated signal control systems with
supervised learning [56.66331540599836]
This paper proposes a time series prediction framework using aggregated traffic signal and loop detector data.
We utilize state-of-the-art machine learning models to predict future signal phases' duration.
Results based on an empirical data set from a fully-actuated signal control system in Zurich, Switzerland, show that machine learning models outperform conventional prediction methods.
arXiv Detail & Related papers (2022-08-24T07:50:43Z) - Probabilistic AutoRegressive Neural Networks for Accurate Long-range
Forecasting [6.295157260756792]
We introduce the Probabilistic AutoRegressive Neural Networks (PARNN)
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z) - Multi-head Temporal Attention-Augmented Bilinear Network for Financial
time series prediction [77.57991021445959]
We propose a neural layer based on the ideas of temporal attention and multi-head attention to extend the capability of the underlying neural network.
The effectiveness of our approach is validated using large-scale limit-order book market data.
arXiv Detail & Related papers (2022-01-14T14:02:19Z)
- Energy Aware Deep Reinforcement Learning Scheduling for Sensors Correlated in Time and Space [62.39318039798564]
We propose a scheduling mechanism capable of taking advantage of correlated information.
The proposed mechanism is capable of determining the frequency with which sensors should transmit their updates.
We show that our solution can significantly extend the sensors' lifetime.
arXiv Detail & Related papers (2020-11-19T09:53:27Z)
- Deep Anomaly Detection for Time-series Data in Industrial IoT: A Communication-Efficient On-device Federated Learning Approach [40.992167455141946]
This paper proposes a new communication-efficient on-device federated learning (FL)-based deep anomaly detection framework for sensing time-series data in IIoT.
We first introduce a FL framework to enable decentralized edge devices to collaboratively train an anomaly detection model, which can improve its generalization ability.
Second, we propose an Attention Mechanism-based Convolutional Neural Network-Long Short Term Memory (AMCNN-LSTM) model to accurately detect anomalies.
Third, to adapt the proposed framework to the timeliness of industrial anomaly detection, we propose a gradient compression mechanism based on Top-k selection.
arXiv Detail & Related papers (2020-07-19T16:47:26Z)
- Adaptive Anomaly Detection for IoT Data in Hierarchical Edge Computing [71.86955275376604]
We propose an adaptive anomaly detection approach for hierarchical edge computing (HEC) systems to solve this problem.
We design an adaptive scheme to select one of the models based on the contextual information extracted from input data, to perform anomaly detection.
We evaluate our proposed approach using a real IoT dataset, and demonstrate that it reduces detection delay by 84% while maintaining almost the same accuracy as compared to offloading detection tasks to the cloud.
arXiv Detail & Related papers (2020-01-10T05:29:17Z)
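The adaptive HEC scheme above routes each input to one of several detection models based on contextual information. A minimal sketch of that idea, with invented detectors and a variance-based routing feature (the actual paper's models and context extraction are not reproduced here):

```python
import random
import statistics

def tiny_detector(window, threshold=3.0):
    """Cheap on-device check: flag points far from the window mean."""
    mu = statistics.mean(window)
    sd = statistics.pstdev(window) or 1e-9
    return any(abs(x - mu) / sd > threshold for x in window)

def heavy_detector(window):
    """Stand-in for a larger model run at a higher edge tier (placeholder logic)."""
    return tiny_detector(window, threshold=2.5)

def route(window, complexity_cutoff=1.0):
    """Pick a detector tier from a simple contextual feature (here: variance)."""
    if statistics.pvariance(window) < complexity_cutoff:
        return "tier-0", tiny_detector(window)   # stay on-device
    return "tier-1", heavy_detector(window)      # escalate to a higher tier

random.seed(1)
calm = [random.gauss(0, 0.2) for _ in range(32)]
noisy = [random.gauss(0, 2.0) for _ in range(32)]
print(route(calm), route(noisy))
```

Keeping easy inputs on the smallest model is what yields the latency savings the entry reports; only contextually hard windows pay for the heavier model or a cloud round-trip.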
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.