WEITS: A Wavelet-enhanced residual framework for interpretable time series forecasting
- URL: http://arxiv.org/abs/2405.10877v1
- Date: Fri, 17 May 2024 16:09:51 GMT
- Title: WEITS: A Wavelet-enhanced residual framework for interpretable time series forecasting
- Authors: Ziyou Guo, Yan Sun, Tieru Wu
- Abstract summary: WEITS is a frequency-aware deep learning framework that is highly interpretable and computationally efficient.
- Score: 3.1551278097133895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series (TS) forecasting has become an unprecedentedly popular problem in recent years, with ubiquitous applications in both scientific and business fields. Various approaches have been introduced for time series analysis, including both statistical methods and deep neural networks. Although neural network approaches have demonstrated stronger representational ability than statistical methods, they struggle to provide sufficient interpretability and can be too complicated to optimize. In this paper, we present WEITS, a frequency-aware deep learning framework that is highly interpretable and computationally efficient. Through multi-level wavelet decomposition, WEITS infuses frequency analysis into a deep learning framework in a novel way. Combined with a forward-backward residual architecture, it enjoys both high representational capability and statistical interpretability. Extensive experiments on real-world datasets demonstrate the competitive performance of our model, along with its additional advantage of high computational efficiency. Furthermore, WEITS provides a general framework that can always seamlessly integrate with state-of-the-art approaches for time series forecasting.
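The key technical ingredient named in the abstract is multi-level wavelet decomposition. As a rough illustration of that idea (a sketch, not the authors' implementation), the snippet below uses PyWavelets to split a series into one coarse approximation band and several detail bands; in a WEITS-style pipeline each band would be modelled separately and the per-band forecasts recombined. The wavelet choice ('db4'), the decomposition level, and the toy series are assumptions for illustration only.

```python
# Minimal sketch of multi-level wavelet decomposition with PyWavelets.
# 'db4' and level=3 are illustrative assumptions, not the paper's configuration.
import numpy as np
import pywt

def decompose(series: np.ndarray, wavelet: str = "db4", level: int = 3):
    """Split a 1-D series into one approximation band and `level` detail bands."""
    return pywt.wavedec(series, wavelet, level=level)

def reconstruct(coeffs, wavelet: str = "db4") -> np.ndarray:
    """Invert the decomposition; useful for recombining per-band forecasts."""
    return pywt.waverec(coeffs, wavelet)

if __name__ == "__main__":
    t = np.arange(512)
    # Toy series: slow trend + fast oscillation + noise.
    y = 0.01 * t + np.sin(2 * np.pi * t / 16) + 0.1 * np.random.randn(t.size)

    bands = decompose(y)                    # [cA3, cD3, cD2, cD1], coarse to fine
    print([b.shape for b in bands])

    # In a WEITS-style pipeline each band would be forecast by its own sub-model
    # (statistical or neural) and the predictions recombined; here we only check
    # that the inverse transform recovers the original series.
    y_hat = reconstruct(bands)
    print(np.allclose(y_hat[: y.size], y))  # True up to boundary handling
```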
Related papers
- Deep End-to-End Survival Analysis with Temporal Consistency [49.77103348208835]
We present a novel Survival Analysis algorithm designed to efficiently handle large-scale longitudinal data.
A central idea in our method is temporal consistency, a hypothesis that past and future outcomes in the data evolve smoothly over time.
Our framework uniquely incorporates temporal consistency into training on large datasets, providing a stable training signal.
arXiv Detail & Related papers (2024-10-09T11:37:09Z) - TSI: A Multi-View Representation Learning Approach for Time Series Forecasting [29.05140751690699]
This study introduces a novel multi-view approach for time series forecasting.
It integrates trend and seasonal representations with an Independent Component Analysis (ICA)-based representation.
This approach offers a holistic understanding of time series data, going beyond traditional models that often miss nuanced, nonlinear relationships.
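As a toy illustration of the multi-view idea summarized above (a sketch, not the TSI authors' code), the snippet below builds a trend/seasonal view with a classical additive decomposition and an ICA-based view from delay-embedded windows; the period, window length, and number of components are assumptions for illustration.

```python
# Illustrative multi-view features for a univariate series: a classical
# trend/seasonal view plus an ICA-based view. All hyperparameters are assumed.
import numpy as np
from sklearn.decomposition import FastICA
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
t = np.arange(600)
y = 0.02 * t + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)

# View 1: trend and seasonal components (additive decomposition, period assumed 24).
dec = seasonal_decompose(y, period=24, model="additive")
trend, seasonal = dec.trend, dec.seasonal

# View 2: ICA-based representation of delay-embedded windows.
window = 48
X = np.lib.stride_tricks.sliding_window_view(y, window)   # (n_windows, 48)
sources = FastICA(n_components=4, random_state=0).fit_transform(X)  # (n_windows, 4)

print(trend.shape, seasonal.shape, sources.shape)
```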
arXiv Detail & Related papers (2024-09-30T02:11:57Z) - An Efficient and Generalizable Symbolic Regression Method for Time Series Analysis [13.530431636528519]
We propose Neural-Enhanced Monte-Carlo Tree Search (NEMoTS) for time series.
NEMoTS significantly reduces the search space in symbolic regression and improves expression quality.
Experiments with three real-world datasets demonstrate NEMoTS's significant superiority in performance, efficiency, reliability, and interpretability.
arXiv Detail & Related papers (2024-09-06T02:20:13Z) - TSI-Bench: Benchmarking Time Series Imputation [52.27004336123575]
TSI-Bench is a comprehensive benchmark suite for time series imputation utilizing deep learning techniques.
The TSI-Bench pipeline standardizes experimental settings to enable fair evaluation of imputation algorithms.
TSI-Bench innovatively provides a systematic paradigm to tailor time series forecasting algorithms for imputation purposes.
arXiv Detail & Related papers (2024-06-18T16:07:33Z) - MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training, with an attention-mechanism-based predictor and a conditional discriminator, to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - Scalable Spatiotemporal Prediction with Bayesian Neural Fields [3.3299088915999295]
BayesNF is a novel deep neural network architecture for high-capacity function estimation.
We evaluate BayesNF against statistical and machine-learning baselines on prediction problems from climate and public health datasets.
arXiv Detail & Related papers (2024-03-12T13:47:50Z) - Interpretable Short-Term Load Forecasting via Multi-Scale Temporal Decomposition [3.080999981940039]
This paper proposes an interpretable deep learning method, which learns a linear combination of neural networks that each attends to an input time feature.
Case studies were carried out on the Belgian central grid load dataset, and the proposed model demonstrated better accuracy than the frequently applied baseline model.
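A minimal sketch of the architecture described above, under the assumption that "a linear combination of neural networks that each attends to an input time feature" means one small network per input feature whose outputs are mixed by learnable weights; layer sizes, the softmax weighting, and all names are illustrative, not the paper's exact design.

```python
# One small network per input time feature; forecasts are a learned linear
# combination of their outputs, so the mixing weights are directly interpretable.
import torch
import torch.nn as nn

class PerFeatureCombination(nn.Module):
    def __init__(self, n_features: int, lookback: int, horizon: int, hidden: int = 32):
        super().__init__()
        # One small MLP per input feature (e.g. load, temperature, hour-of-day).
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(lookback, hidden), nn.ReLU(), nn.Linear(hidden, horizon))
            for _ in range(n_features)
        ])
        # Learnable mixing logits; softmax keeps the weights non-negative and summing to 1.
        self.logits = nn.Parameter(torch.zeros(n_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features, lookback)
        per_feature = torch.stack(
            [net(x[:, i, :]) for i, net in enumerate(self.experts)], dim=1
        )                                            # (batch, n_features, horizon)
        weights = torch.softmax(self.logits, dim=0)  # interpretable feature weights
        return (weights[None, :, None] * per_feature).sum(dim=1)

model = PerFeatureCombination(n_features=3, lookback=168, horizon=24)
print(model(torch.randn(8, 3, 168)).shape)           # torch.Size([8, 24])
```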
arXiv Detail & Related papers (2024-02-18T17:55:59Z) - A Survey on Deep Learning based Time Series Analysis with Frequency Transformation [74.3919960186696]
Frequency transformation (FT) has been increasingly incorporated into deep learning models to enhance state-of-the-art accuracy and efficiency in time series analysis.
Despite the growing attention and the proliferation of research in this emerging field, there is currently a lack of a systematic review and in-depth analysis of deep learning-based time series models with FT.
We present a comprehensive review that systematically investigates and summarizes the recent research advancements in deep learning-based time series analysis with FT.
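As a minimal example of the kind of frequency transformation the survey covers, the snippet below applies a real FFT to a toy series and reads off its dominant frequencies, the sort of spectral features that deep time series models often consume; the sampling rate and signal are assumptions for illustration.

```python
# Extract dominant frequencies of a toy series via the real FFT.
import numpy as np

fs = 100.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
y = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

spectrum = np.fft.rfft(y)
freqs = np.fft.rfftfreq(y.size, d=1 / fs)
amplitude = np.abs(spectrum) / y.size * 2    # approximate per-component amplitude

top = np.argsort(amplitude)[-2:][::-1]       # two strongest frequency bins
print(list(zip(freqs[top], amplitude[top].round(2))))   # ~[(5.0, 1.0), (12.0, 0.5)]
```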
arXiv Detail & Related papers (2023-02-04T14:33:07Z) - Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce Probabilistic AutoRegressive Neural Networks (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.