QBSD: Quartile-Based Seasonality Decomposition for Cost-Effective Time Series Forecasting
- URL: http://arxiv.org/abs/2306.05989v2
- Date: Wed, 16 Aug 2023 14:47:10 GMT
- Title: QBSD: Quartile-Based Seasonality Decomposition for Cost-Effective Time Series Forecasting
- Authors: Ebenezer RHP Isaac and Bulbul Singh
- Abstract summary: We introduce QBSD, a live forecasting approach tailored to optimize the trade-off between accuracy and computational complexity.
We have evaluated the performance of QBSD against state-of-the-art forecasting approaches on publicly available datasets.
- Score: 0.21756081703275998
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In the telecom domain, precise forecasting of time series patterns, such as
cell key performance indicators (KPIs), plays a pivotal role in enhancing
service quality and operational efficiency. State-of-the-art forecasting
approaches prioritize forecasting accuracy at the expense of computational
performance, rendering them less suitable for data-intensive applications
encompassing systems with a multitude of time series variables. To address this
issue, we introduce QBSD, a live forecasting approach tailored to optimize the
trade-off between accuracy and computational complexity. We have evaluated the
performance of QBSD against state-of-the-art forecasting approaches on publicly
available datasets. We have also extended this investigation to our curated
network KPI dataset, now publicly accessible, to showcase the effect of dynamic
operating ranges that vary with time. The results demonstrate that the
proposed method excels in runtime efficiency compared to the leading algorithms
available while maintaining competitive forecast accuracy.
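As a rough illustration of the idea in the title (not the paper's exact algorithm), a quartile-based seasonal forecaster can be sketched as follows: group recent observations by their position in the seasonal cycle, forecast each future step with the per-group median, and use the interquartile range as an operating band; restricting the history to a sliding window lets those quartiles follow a dynamic operating range. Everything in the sketch below (hourly KPI, daily seasonality, 7-day window, function name) is an assumption for illustration.

```python
import numpy as np
import pandas as pd

def quartile_seasonal_forecast(kpi: pd.Series, horizon: int = 24,
                               window: str = "7D") -> pd.DataFrame:
    """Illustrative quartile-based seasonal forecast for an hourly KPI.

    Recent observations are grouped by hour of day; each future hour is
    forecast with the median of its group, and Q1/Q3 give a rough operating
    band. Restricting the history to a sliding window lets the quartiles
    track an operating range that drifts over time.
    """
    # Keep only the most recent window of history (dynamic operating range).
    recent = kpi[kpi.index > kpi.index[-1] - pd.Timedelta(window)]
    # Seasonal grouping: hour-of-day quartiles over the recent window.
    q = recent.groupby(recent.index.hour).quantile([0.25, 0.5, 0.75]).unstack()
    # Look up each future hour in the quartile table.
    future_index = pd.date_range(kpi.index[-1] + pd.Timedelta(hours=1),
                                 periods=horizon, freq=pd.Timedelta(hours=1))
    hours = future_index.hour
    return pd.DataFrame({
        "forecast": q[0.50].reindex(hours).to_numpy(),
        "lower":    q[0.25].reindex(hours).to_numpy(),
        "upper":    q[0.75].reindex(hours).to_numpy(),
    }, index=future_index)

# Example on a synthetic hourly KPI with daily seasonality:
# idx = pd.date_range("2023-01-01", periods=24 * 30, freq=pd.Timedelta(hours=1))
# rng = np.random.default_rng(0)
# kpi = pd.Series(100 + 20 * np.sin(2 * np.pi * idx.hour / 24)
#                 + rng.normal(0, 3, len(idx)), index=idx)
# print(quartile_seasonal_forecast(kpi))
```

The per-step cost of such a scheme is a handful of quantile lookups rather than a model fit, which is the accuracy/complexity trade-off the abstract refers to.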
Related papers
- Accurate and Efficient Multivariate Time Series Forecasting via Offline Clustering [22.545533166145706]
We introduce the Forecaster with Offline Clustering Using Segments (FOCUS). FOCUS is a novel approach to MTS forecasting that simplifies long-range dependency modeling. It achieves state-of-the-art accuracy while significantly reducing computational costs.
arXiv Detail & Related papers (2025-05-09T02:34:06Z)
- Federated Dynamic Modeling and Learning for Spatiotemporal Data Forecasting [0.8568432695376288]
This paper presents an advanced Federated Learning (FL) framework for forecasting complex spatiotemporal data, improving upon recent state-of-the-art models.
The resulting architecture significantly improves the model's capacity to handle complex temporal patterns in diverse forecasting applications.
The efficiency of our approach is demonstrated through extensive experiments on real-world applications, including public datasets for multimodal transport demand forecasting and private datasets for Origin-Destination (OD) matrix forecasting in urban areas.
arXiv Detail & Related papers (2025-03-06T15:16:57Z)
- Value-Based Deep RL Scales Predictably [100.21834069400023]
We show that value-based off-policy RL methods are predictable despite community lore regarding their pathological behavior.
We validate our approach using three algorithms: SAC, BRO, and PQL on DeepMind Control, OpenAI Gym, and IsaacGym.
arXiv Detail & Related papers (2025-02-06T18:59:47Z)
- Neural Conformal Control for Time Series Forecasting [54.96087475179419]
We introduce a neural network conformal prediction method for time series that enhances adaptivity in non-stationary environments.
Our approach acts as a neural controller designed to achieve desired target coverage, leveraging auxiliary multi-view data with neural network encoders.
We empirically demonstrate significant improvements in coverage and probabilistic accuracy, and find that our method is the only one that combines good calibration with consistency in prediction intervals; the split-conformal interval that such adaptive methods build on is sketched below for context.
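As background (standard split conformal prediction, not this paper's method), the non-adaptive interval is built from calibration residuals as follows; conformal control approaches then adjust the interval online so empirical coverage tracks the target under non-stationarity.

```latex
% Split conformal prediction: calibration residuals R_i = |Y_i - \hat{y}(X_i)|,
% i = 1..n, and target miscoverage level \alpha.
\[
\hat{q} \;=\; \text{the } \lceil (n+1)(1-\alpha) \rceil\text{-th smallest } R_i ,
\qquad
C(X_{n+1}) \;=\; \bigl[\,\hat{y}(X_{n+1}) - \hat{q},\; \hat{y}(X_{n+1}) + \hat{q}\,\bigr].
\]
```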
arXiv Detail & Related papers (2024-12-24T03:56:25Z)
- Optimal starting point for time series forecasting [1.9937737230710553]
We introduce a novel approach called Optimal Starting Point Time Series Forecast (OSP-TSP).
By adjusting the sequence length with XGBoost and LightGBM models, the proposed approach can determine the optimal starting point (OSP) of the time series.
Empirical results indicate that predictions based on the OSP-TSP approach consistently outperform those using the complete dataset.
arXiv Detail & Related papers (2024-09-25T11:51:00Z)
- Enhancing Microgrid Performance Prediction with Attention-based Deep Learning Models [0.0]
This research aims to address microgrid systems' operational challenges, characterized by power oscillations that contribute to grid instability.
An integrated strategy is proposed, leveraging the strengths of convolutional and Gated Recurrent Unit (GRU) layers.
The framework is anchored by a Multi-Layer Perceptron (MLP) model, which is tasked with comprehensive load forecasting.
arXiv Detail & Related papers (2024-07-20T21:24:11Z)
- Stratified Prediction-Powered Inference for Hybrid Language Model Evaluation [62.2436697657307]
Prediction-powered inference (PPI) is a method that improves statistical estimates based on limited human-labeled data.
We propose a method called Stratified Prediction-Powered Inference (StratPPI).
We show that the basic PPI estimates can be considerably improved by employing simple data stratification strategies; the basic (unstratified) PPI mean estimator is sketched below for reference.
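For reference, the basic prediction-powered estimate of a mean combines model predictions on a large unlabeled set with a bias correction computed on the small labeled set; StratPPI applies estimates of this kind within strata. The notation below is the standard PPI formulation, not text from this paper.

```latex
% Basic prediction-powered estimate of a mean: f is the predictor,
% (X_i, Y_i), i = 1..n, are the labeled data and \tilde{X}_i, i = 1..N,
% the unlabeled data.
\[
\hat{\theta}^{\mathrm{PP}}
  \;=\; \frac{1}{N}\sum_{i=1}^{N} f(\tilde{X}_i)
  \;-\; \frac{1}{n}\sum_{i=1}^{n}\bigl(f(X_i) - Y_i\bigr).
\]
```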
arXiv Detail & Related papers (2024-06-06T17:37:39Z)
- Share Your Secrets for Privacy! Confidential Forecasting with Vertical Federated Learning [5.584904689846748]
Key challenges to address in manufacturing include data privacy and over-fitting on small and noisy datasets.
We propose 'Secret-shared Time Series Forecasting with VFL' (STV), a novel framework with several key features.
Our results demonstrate that STV's forecasting accuracy is comparable to those of centralized approaches.
arXiv Detail & Related papers (2024-05-31T12:27:38Z)
- Loss Shaping Constraints for Long-Term Time Series Forecasting [79.3533114027664]
We present a Constrained Learning approach for long-term time series forecasting that respects a user-defined upper bound on the loss at each time-step.
We propose a practical Primal-Dual algorithm to tackle it and demonstrate that it exhibits competitive average performance on time series benchmarks while shaping the errors across the predicted window; a generic primal-dual formulation of such a constrained problem is sketched below.
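A generic form of such a constrained problem (the paper's exact formulation may differ): minimize the average forecast loss subject to a per-step upper bound, and solve it by alternating descent on the model parameters and ascent on the multipliers.

```latex
% Constrained long-horizon forecasting with a per-step loss bound \epsilon:
\[
\min_{\theta}\; \frac{1}{T}\sum_{t=1}^{T} \mathbb{E}\!\left[\ell_t(\theta)\right]
\quad \text{s.t.} \quad \mathbb{E}\!\left[\ell_t(\theta)\right] \le \epsilon
\;\; \text{for } t = 1,\dots,T .
\]
% Primal-dual template: gradient descent on \theta over the Lagrangian
%   L(\theta, \lambda) = (1/T)\sum_t E[\ell_t] + \sum_t \lambda_t (E[\ell_t] - \epsilon),
% alternated with projected gradient ascent on \lambda_t \ge 0.
```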
arXiv Detail & Related papers (2024-02-14T18:20:44Z)
- A Meta-Learning Approach to Predicting Performance and Data Requirements [163.4412093478316]
We propose an approach to estimate the number of samples required for a model to reach a target performance.
We find that the power law, the de facto principle to estimate model performance, leads to large error when using a small dataset.
We introduce a novel piecewise power law (PPL) that handles the two data regimes differently; a generic power-law form is sketched below for context.
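For context, a single saturating power law is the common way to extrapolate error against dataset size n; a piecewise variant fits separate parameters on either side of a breakpoint n_0 so that the small-data and large-data regimes are handled differently (the paper's exact parameterization may differ).

```latex
% Single power-law fit:  e(n) \approx \alpha n^{-\beta} + \gamma .
% A piecewise variant with breakpoint n_0:
\[
e(n) \;\approx\;
\begin{cases}
  \alpha_1\, n^{-\beta_1} + \gamma_1, & n \le n_0,\\[2pt]
  \alpha_2\, n^{-\beta_2} + \gamma_2, & n > n_0.
\end{cases}
\]
```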
arXiv Detail & Related papers (2023-03-02T21:48:22Z)
- Low Complexity Adaptive Machine Learning Approaches for End-to-End Latency Prediction [0.0]
This work presents the design of efficient, low-cost adaptive algorithms for estimation, monitoring, and prediction.
We focus on end-to-end latency prediction, for which we illustrate our approaches and results on data obtained from a public generator provided after the recent international challenge on GNN.
arXiv Detail & Related papers (2023-01-31T10:29:11Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock markets involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA).
Our proposed model exhibited reduced computational complexity and performance comparable to or better than that of existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- Optimizing for the Future in Non-Stationary MDPs [52.373873622008944]
We present a policy gradient algorithm that maximizes a forecast of future performance.
We show that our algorithm, called Prognosticator, is more robust to non-stationarity than two online adaptation techniques.
arXiv Detail & Related papers (2020-05-17T03:41:19Z)
- Deep Echo State Networks for Short-Term Traffic Forecasting: Performance Comparison and Statistical Assessment [8.586891288891263]
In short-term traffic forecasting, the goal is to accurately predict future values of a traffic parameter of interest.
Deep Echo State Networks achieve more accurate traffic forecasts than the rest of the considered modeling counterparts.
arXiv Detail & Related papers (2020-04-17T11:07:25Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information; the classical Hawkes intensity that THP generalizes is sketched below for context.
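For context, the classical Hawkes process models the event intensity as a base rate plus exponentially decaying self-excitation from past events; THP replaces this fixed parametric kernel with hidden states produced by self-attention over the event history. The formula below is the textbook Hawkes intensity, not THP's parameterization.

```latex
% Classical exponential-kernel Hawkes intensity over past event times t_j:
\[
\lambda(t) \;=\; \mu \;+\; \sum_{t_j < t} \alpha\, e^{-\delta\,(t - t_j)},
\qquad \mu, \alpha, \delta > 0 .
\]
```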
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.