Temporal-Spatial dependencies ENhanced deep learning model (TSEN) for
household leverage series forecasting
- URL: http://arxiv.org/abs/2210.08668v1
- Date: Mon, 17 Oct 2022 00:10:25 GMT
- Title: Temporal-Spatial dependencies ENhanced deep learning model (TSEN) for
household leverage series forecasting
- Authors: Hu Yang, Yi Huang, Haijun Wang, Yu Chen
- Abstract summary: Capturing both temporal and spatial patterns in an accurate forecasting model for financial time series is a challenge.
Inspired by the successful applications of deep learning, we propose a new model to resolve the issues of forecasting household leverage in China.
Results show that the new approach captures the temporal-spatial dynamics of household leverage well and yields more accurate and robust predictions.
- Score: 12.727583657383073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Building an accurate forecasting model for financial time series
requires analyzing both temporal and spatial patterns, which is challenging due
to the complex nature of temporal-spatial dynamics: time series from different
locations often have distinct patterns, and for the same time series, patterns
may vary over time. Inspired by the successful applications of deep learning, we
propose a new model to resolve the issues of forecasting household leverage in China. Our
solution consists of multiple RNN-based layers and an attention layer: each
RNN-based layer automatically learns the temporal pattern of a specific series
with multivariate exogenous series, and then the attention layer learns the
spatial correlative weight and obtains the global representations
simultaneously. The results show that the new approach captures the
temporal-spatial dynamics of household leverage well and yields more accurate
and robust predictions. Moreover, the simulation studies also show that
clustering and choosing correlated series are necessary to obtain accurate
forecasting results.
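The architecture described above can be sketched in a few lines of numpy: one recurrent encoder per series learns its temporal pattern, and an attention layer over the resulting hidden states produces the spatial correlative weights and a global representation. This is a minimal illustrative sketch, not the paper's implementation; all dimensions, the shared RNN weights, and the attention query are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_encode(x, Wx, Wh, b):
    """Run a vanilla RNN over one series of shape (T, d_in); return the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for t in range(x.shape[0]):
        h = np.tanh(Wx @ x[t] + Wh @ h + b)
    return h

def attention_pool(H, q):
    """Softmax attention over per-series hidden states H (n_series, d_h)."""
    scores = H @ q
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w, w @ H  # spatial weights and the global representation

n_series, T, d_in, d_h = 4, 12, 3, 8
# Each series: one region's leverage plus multivariate exogenous variables (synthetic here).
series = rng.standard_normal((n_series, T, d_in))
Wx = 0.1 * rng.standard_normal((d_h, d_in))
Wh = 0.1 * rng.standard_normal((d_h, d_h))
b = np.zeros(d_h)
q = rng.standard_normal(d_h)  # attention query; learned in practice, random here

H = np.stack([rnn_encode(s, Wx, Wh, b) for s in series])  # temporal pattern per series
weights, global_repr = attention_pool(H, q)
print(weights.round(3), global_repr.shape)
```

In the actual model the encoders and the attention parameters would be trained jointly by backpropagation; the sketch only shows the forward pass that combines per-series temporal encodings into one global representation.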
Related papers
- A Comprehensive Survey of Time Series Forecasting: Architectural Diversity and Open Challenges [37.20655606514617]
Time series forecasting is a critical task that provides key information for decision-making across various fields.
Deep learning architectures such as MLPs, CNNs, RNNs, and GNNs have been developed and applied to solve time series forecasting problems.
Transformer models, which excel at handling long-term dependencies, have become significant architectural components for time series forecasting.
arXiv Detail & Related papers (2024-10-24T07:43:55Z) - TSI: A Multi-View Representation Learning Approach for Time Series Forecasting [29.05140751690699]
This study introduces a novel multi-view approach for time series forecasting.
It integrates trend and seasonal representations with an Independent Component Analysis (ICA)-based representation.
This approach offers a holistic understanding of time series data, going beyond traditional models that often miss nuanced, nonlinear relationships.
arXiv Detail & Related papers (2024-09-30T02:11:57Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - RPMixer: Shaking Up Time Series Forecasting with Random Projections for Large Spatial-Temporal Data [33.0546525587517]
We propose an all-Multi-Layer Perceptron (all-MLP) time series forecasting architecture called RPMixer.
Our method capitalizes on the ensemble-like behavior of deep neural networks, where each individual block behaves like a base learner in an ensemble model.
arXiv Detail & Related papers (2024-02-16T07:28:59Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Time Series Forecasting with Ensembled Stochastic Differential Equations
Driven by Lévy Noise [2.3076895420652965]
We use a collection of SDEs equipped with neural networks to predict the long-term trend of noisy time series.
Our first contribution is to use the phase space reconstruction method to extract the intrinsic dimension of the time series data.
Second, we explore SDEs driven by α-stable Lévy motion to model the time series data and solve the problem through neural network approximation.
arXiv Detail & Related papers (2021-11-25T16:49:01Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove the time series' noise.
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
arXiv Detail & Related papers (2021-02-23T05:23:35Z) - Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point processes models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z) - Global Models for Time Series Forecasting: A Simulation Study [2.580765958706854]
We simulate time series from simple data generating processes (DGP), such as Auto Regressive (AR) and Seasonal AR, to complex DGPs, such as Chaotic Logistic Map, Self-Exciting Threshold Auto-Regressive, and Mackey-Glass equations.
The lengths and the number of series in the dataset are varied in different scenarios.
We perform experiments on these datasets using global forecasting models including Recurrent Neural Networks (RNN), Feed-Forward Neural Networks, Pooled Regression (PR) models, and Light Gradient Boosting Models (LGBM).
arXiv Detail & Related papers (2020-12-23T04:45:52Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
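The last entry's idea of modeling observed dynamics as a forced linear system can be sketched with a least-squares fit of x_{k+1} ≈ A x_k + B u_k (a DMD-with-control-style estimate). This is a hedged illustration on synthetic data, not that paper's ensemble method; the dynamics matrix, forcing signal, and noise level are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic near-periodic "load" driven by a known forcing signal.
T = 200
u = np.sin(2 * np.pi * np.arange(T) / 24)[None, :]   # daily forcing, shape (1, T)
A_true = np.array([[0.9, -0.2], [0.2, 0.9]])         # stable, rotation-like dynamics
B_true = np.array([[0.5], [0.1]])
X = np.zeros((2, T))
for k in range(T - 1):
    X[:, k + 1] = A_true @ X[:, k] + B_true @ u[:, k] + 0.01 * rng.standard_normal(2)

# Fit x_{k+1} ≈ A x_k + B u_k by least squares over the stacked state/input matrix.
Z = np.vstack([X[:, :-1], u[:, :-1]])                # shape (3, T-1)
G = X[:, 1:] @ np.linalg.pinv(Z)                     # estimate of [A B]
A_hat, B_hat = G[:, :2], G[:, 2:]

# One-step-ahead forecast from the last observed state and forcing value.
x_next = A_hat @ X[:, -1] + B_hat @ u[:, -1]
print(np.round(A_hat, 2))
```

The appeal noted in the abstract, interpretability and parsimony, comes from the fact that the fitted A and B are small, inspectable matrices rather than opaque network weights.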
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.