WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting
- URL: http://arxiv.org/abs/2210.14303v2
- Date: Sun, 28 May 2023 09:55:32 GMT
- Title: WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting
- Authors: Youngin Cho, Daejin Kim, Dongmin Kim, Mohammad Azam Khan, Jaegul Choo
- Abstract summary: Time series forecasting has become a critical task due to its high practicality in real-world applications.
Recent deep-learning-based approaches have shown remarkable success in time series forecasting.
Deep networks still suffer from unstable training and overfitting.
- Score: 30.692056599222926
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series forecasting has become a critical task due to its high
practicality in real-world applications such as traffic, energy consumption,
economics and finance, and disease analysis. Recent deep-learning-based
approaches have shown remarkable success in time series forecasting.
Nonetheless, due to the dynamics of time series data, deep networks still
suffer from unstable training and overfitting. Inconsistent patterns appearing
in real-world data lead the model to be biased to a particular pattern, thus
limiting generalization. In this work, we introduce dynamic error bounds on the
training loss to address the overfitting issue in time series forecasting.
Specifically, we propose a regularization method called WaveBound, which
estimates an adequate error bound on the training loss for each time step and
feature at each iteration. By allowing the model to focus less on unpredictable
data, WaveBound stabilizes the training process and thus significantly improves
generalization. Through extensive experiments, we show that WaveBound
consistently improves upon existing models, including the state-of-the-art
model, by large margins.
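The abstract does not spell out how the bounds are estimated, but the core idea can be sketched compactly. Below is a minimal, hypothetical sketch assuming a flooding-style bound applied pointwise per time step and feature, with the bound derived from a slowly updated (e.g. exponential-moving-average) copy of the network; the function name, the `eps` slack, and the EMA choice are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def wavebound_loss(pred, target_pred, y, eps=1e-2):
    """Flooding-style loss with a per-(time step, feature) bound.

    pred:        source network forecast, shape (batch, T, D)
    target_pred: forecast from a slowly updated (e.g. EMA) copy of the
                 network, used to estimate how low the error can go
    y:           ground truth, shape (batch, T, D)
    eps:         illustrative slack below the estimated bound
    """
    err = (pred - y) ** 2                             # pointwise training error
    bound = ((target_pred - y) ** 2).detach() - eps   # estimated error bound
    # Flooding: once the error drops below its bound, the gradient flips sign,
    # so the model stops chasing that (possibly unpredictable) point.
    return (torch.abs(err - bound) + bound).mean()
```

In such a setup, the target copy would be refreshed after each optimizer step as an exponential moving average of the source weights.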
Related papers
- ODEStream: A Buffer-Free Online Learning Framework with ODE-based Adaptor for Streaming Time Series Forecasting [11.261457967759688]
ODEStream is a buffer-free continual learning framework that incorporates a temporal isolation layer to integrate temporal dependencies within the data.
Our approach focuses on learning how the dynamics and distribution of historical data change with time, facilitating the direct processing of streaming sequences.
Evaluations on benchmark real-world datasets demonstrate that ODEStream outperforms the state-of-the-art online learning and streaming analysis baselines.
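As a rough illustration only (not ODEStream's actual architecture), an ODE-based adaptor can be approximated by a learned derivative network integrated with a fixed-step Euler solver; everything below, including the module name, is hypothetical.

```python
import torch
import torch.nn as nn

class ODEAdaptor(nn.Module):
    """Hypothetical sketch: evolve a hidden state h over a short horizon
    by Euler-integrating a learned derivative dh/dt = f(h)."""

    def __init__(self, dim, steps=4):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))
        self.steps = steps

    def forward(self, h, dt=0.25):
        for _ in range(self.steps):   # fixed-step Euler integration
            h = h + dt * self.f(h)
        return h
```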
arXiv Detail & Related papers (2024-11-11T22:36:33Z)
- TimeSieve: Extracting Temporal Dynamics through Information Bottlenecks [31.10683149519954]
We propose an innovative time series forecasting model, TimeSieve.
Our approach employs wavelet transforms to preprocess time series data, effectively capturing multi-scale features.
Our results validate the effectiveness of our approach in addressing the key challenges in time series forecasting.
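To make the wavelet preprocessing step concrete (the exact TimeSieve pipeline may differ), a multi-level discrete wavelet transform with PyWavelets splits a series into one coarse approximation plus detail coefficients at several scales; the synthetic signal, wavelet choice, and threshold below are illustrative.

```python
import numpy as np
import pywt  # PyWavelets

x = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.1 * np.random.randn(512)

# 3-level discrete wavelet decomposition: one coarse approximation
# plus detail coefficients at three scales.
coeffs = pywt.wavedec(x, wavelet="db4", level=3)
approx, details = coeffs[0], coeffs[1:]

# A model can consume these multi-scale features directly, or denoise
# by soft-thresholding the detail coefficients and reconstructing.
x_denoised = pywt.waverec(
    [approx] + [pywt.threshold(d, value=0.2, mode="soft") for d in details],
    wavelet="db4",
)
```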
arXiv Detail & Related papers (2024-06-07T15:58:12Z)
- Loss Shaping Constraints for Long-Term Time Series Forecasting [79.3533114027664]
We present a Constrained Learning approach for long-term time series forecasting that respects a user-defined upper bound on the loss at each time-step.
We propose a practical primal-dual algorithm to tackle it and demonstrate that it exhibits competitive average performance on time series benchmarks while shaping the errors across the predicted window.
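A minimal sketch of the constrained-learning idea, assuming per-step constraints loss_t <= eps enforced via dual variables updated by gradient ascent; the variable names and learning rates are illustrative, not the paper's implementation.

```python
import torch

def primal_dual_step(model, opt, x, y, lmbda, eps=0.1, dual_lr=0.01):
    """One primal-dual update for per-time-step loss constraints.

    lmbda: non-negative dual variables, shape (T,), one per forecast step.
    eps:   user-defined upper bound on the loss at each step.
    """
    pred = model(x)                                   # (batch, T)
    step_loss = ((pred - y) ** 2).mean(dim=0)         # loss per time step
    lagrangian = step_loss.mean() + (lmbda * (step_loss - eps)).sum()

    opt.zero_grad()
    lagrangian.backward()                             # primal descent
    opt.step()

    with torch.no_grad():                             # dual ascent + projection
        lmbda += dual_lr * (step_loss - eps)
        lmbda.clamp_(min=0.0)
    return lmbda
```

Steps that violate their bound accumulate larger dual weights, so the training loss is progressively reshaped toward satisfying the per-step constraints.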
arXiv Detail & Related papers (2024-02-14T18:20:44Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
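Independently of the competition details, temporal frame interpolation as a data augmentation can be as simple as linearly blending consecutive frames; the sketch below is a hedged illustration and may differ from the paper's scheme.

```python
import torch

def interpolate_frames(frames, alpha=None):
    """Synthesize in-between frames by linearly blending neighbors.

    frames: (T, C, H, W) precipitation/radar frames.
    Returns (T - 1, C, H, W) frames at fractional time steps.
    """
    if alpha is None:
        alpha = torch.rand(1).item()   # random blend ratio per call
    return alpha * frames[:-1] + (1.0 - alpha) * frames[1:]
```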
arXiv Detail & Related papers (2023-11-30T08:22:08Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over a short period, leaving a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Time Series Forecasting Models Copy the Past: How to Mitigate [24.397660153755997]
In the presence of noise and uncertainty, neural network models tend to replicate the last observed value of the time series.
We propose a regularization term penalizing the replication of previously seen values.
Our results indicate that the regularization term mitigates this problem to some extent and gives rise to more robust models.
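The summary does not give the regularizer's exact form; one plausible sketch adds a penalty that grows as the forecast approaches the last observed input value. The `alpha` weight and the inverse-distance form below are assumptions, not the paper's formulation.

```python
import torch

def anti_copy_loss(pred, y, last_obs, alpha=0.1, eps=1e-6):
    """MSE plus a penalty for replicating the last observed value.

    pred, y:  (batch, T, D) forecasts and targets
    last_obs: (batch, 1, D) final value of the input window
    """
    mse = ((pred - y) ** 2).mean()
    # Penalty is large when pred ~= last_obs, i.e. the model copies the past.
    copy_penalty = 1.0 / (((pred - last_obs) ** 2).mean() + eps)
    return mse + alpha * copy_penalty
```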
arXiv Detail & Related papers (2022-07-27T10:39:00Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
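As a generic illustration of the low-rank idea (not the paper's specific tensor decomposition of the bilinear network), replacing a dense weight matrix with a product of two thin factors cuts the parameter count from m*n to roughly r*(m+n).

```python
import torch.nn as nn

class LowRankLinear(nn.Module):
    """Factorized linear layer: W ~= U @ V with rank r."""

    def __init__(self, in_features, out_features, rank):
        super().__init__()
        self.V = nn.Linear(in_features, rank, bias=False)   # n -> r
        self.U = nn.Linear(rank, out_features)              # r -> m

    def forward(self, x):
        return self.U(self.V(x))

# A 256x256 dense layer has 65,536 weights; a rank-16 factorization
# has 2 * 256 * 16 = 8,192 (plus the output bias).
layer = LowRankLinear(256, 256, rank=16)
```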
arXiv Detail & Related papers (2021-07-05T10:15:23Z)
- Adjusting for Autocorrelated Errors in Neural Networks for Time Series Regression and Forecasting [10.659189276058948]
We learn the autocorrelation coefficient jointly with the model parameters in order to adjust for autocorrelated errors.
For time series regression, large-scale experiments indicate that our method outperforms the Prais-Winsten method.
Results across a wide range of real-world datasets show that our method enhances performance in almost all cases.
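A minimal sketch of the adjustment, assuming a first-order autoregressive error model e_t = rho * e_{t-1} + noise with rho learned jointly and the loss computed on the quasi-differenced series (Cochrane-Orcutt style); the class and variable names are illustrative.

```python
import torch
import torch.nn as nn

class AutocorrAdjustedLoss(nn.Module):
    """Learn an AR(1) error coefficient rho jointly with the model and
    train on the quasi-differenced series."""

    def __init__(self):
        super().__init__()
        self.rho = nn.Parameter(torch.zeros(1))   # autocorrelation coefficient

    def forward(self, pred, y):
        # pred, y: (batch, T); quasi-difference both series with rho.
        rho = torch.tanh(self.rho)                # keep rho in (-1, 1)
        y_adj = y[:, 1:] - rho * y[:, :-1]
        p_adj = pred[:, 1:] - rho * pred[:, :-1]
        return ((p_adj - y_adj) ** 2).mean()
```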
arXiv Detail & Related papers (2021-01-28T04:25:51Z)
- Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this list (or any of the information on the site) and is not responsible for any consequences of its use.