Predicting Inflation with Recurrent Neural Networks
- URL: http://arxiv.org/abs/2104.03757v2
- Date: Mon, 2 Oct 2023 16:56:27 GMT
- Title: Predicting Inflation with Recurrent Neural Networks
- Authors: Livia Paranhos
- Abstract summary: This paper applies a recurrent neural network, the LSTM, to forecast inflation.
Results from an exercise with US data indicate that the estimated neural nets present competitive, but not outstanding, performance against common benchmarks.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper applies a recurrent neural network, the LSTM, to forecast
inflation. This is an appealing model for time series as it processes each time
step sequentially and explicitly learns dynamic dependencies. The paper also
explores the dimension reduction capability of the model to uncover
economically-meaningful factors that can explain the inflation process. Results
from an exercise with US data indicate that the estimated neural nets present
competitive, but not outstanding, performance against common benchmarks
(including other machine learning models). The LSTM in particular is found to
perform well at long horizons and during periods of heightened macroeconomic
uncertainty. Interestingly, LSTM-implied factors present high correlation with
business cycle indicators, informing on the usefulness of such signals as
inflation predictors. The paper also sheds light on the impact of network
initialization and architecture on forecast performance.
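To make the setup concrete, here is a minimal sketch, assuming a Keras/TensorFlow implementation, of the kind of architecture the abstract describes: an LSTM over a window of macroeconomic predictors feeding a small dense bottleneck whose activations can be read out as "factors" and compared with business cycle indicators. The window length, layer sizes, number of factors, and the random toy data are illustrative assumptions, not the paper's exact specification.
```python
# Hypothetical LSTM inflation forecaster with a low-dimensional "factor" layer.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

WINDOW, N_PREDICTORS, N_FACTORS = 12, 30, 4   # illustrative sizes, not the paper's

inputs = layers.Input(shape=(WINDOW, N_PREDICTORS))     # past 12 months of predictors
hidden = layers.LSTM(32)(inputs)                        # recurrent state summarizes the history
factors = layers.Dense(N_FACTORS, activation="tanh",
                       name="factors")(hidden)          # low-dimensional "factor" bottleneck
output = layers.Dense(1)(factors)                       # h-step-ahead inflation forecast
model = Model(inputs, output)
model.compile(optimizer="adam", loss="mse")

# Toy stand-in for a real macro panel (e.g. a FRED-MD-style dataset); replace with actual series.
X = np.random.randn(500, WINDOW, N_PREDICTORS).astype("float32")
y = np.random.randn(500, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

# Extract the learned factors for inspection / correlation with business cycle indicators.
factor_model = Model(inputs, model.get_layer("factors").output)
estimated_factors = factor_model.predict(X, verbose=0)
print(estimated_factors.shape)   # (500, 4)
```
Reading the bottleneck activations back out is one simple way to inspect whether the network's internal summary of the predictors tracks the business cycle, in the spirit of the LSTM-implied factors the abstract reports.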
Related papers
- Inferring Dynamic Networks from Marginals with Iterative Proportional Fitting [57.487936697747024]
A common network inference problem, arising from real-world data constraints, is how to infer a dynamic network from its time-aggregated adjacency matrix.
We introduce a principled algorithm that guarantees IPF converges under minimal changes to the network structure.
arXiv Detail & Related papers (2024-02-28T20:24:56Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Financial Time-Series Forecasting: Towards Synergizing Performance And Interpretability Within a Hybrid Machine Learning Approach [2.0213537170294793]
This paper proposes a comparative study of hybrid machine learning algorithms and focuses on enhancing model interpretability.
For interpretability, we carry out a systematic overview of time-series preprocessing techniques, including decomposition, the auto-correlation function, and triple exponential smoothing, which aim to uncover latent relations and complex patterns in financial time-series forecasting.
arXiv Detail & Related papers (2023-12-31T16:38:32Z)
- A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error for overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by, e.g., the combination of model and parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z)
- Inside the black box: Neural network-based real-time prediction of US recessions [0.0]
Long short-term memory (LSTM) and gated recurrent unit (GRU) are used to model US recessions from 1967 to 2021.
The SHAP method delivers key recession indicators, such as the S&P 500 index, for short-term forecasting up to 3 months.
arXiv Detail & Related papers (2023-10-26T16:58:16Z)
- A comparative assessment of deep learning models for day-ahead load forecasting: Investigating key accuracy drivers [2.572906392867547]
Short-term load forecasting (STLF) is vital for the effective and economic operation of power grids and energy markets.
Several deep learning models have been proposed in the literature for STLF, reporting promising results.
arXiv Detail & Related papers (2023-02-23T17:11:04Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Forecasting The JSE Top 40 Using Long Short-Term Memory Networks [1.6114012813668934]
This paper uses a long short-term memory network to perform financial time series forecasting on the return data of the JSE Top 40 index.
The paper concludes that the long short-term memory network outperforms the seasonal autoregressive integrated moving average model.
arXiv Detail & Related papers (2021-04-20T09:39:38Z)
- HiPPO: Recurrent Memory with Optimal Polynomial Projections [93.3537706398653]
We introduce a general framework (HiPPO) for the online compression of continuous signals and discrete time series by projection onto polynomial bases.
Given a measure that specifies the importance of each time step in the past, HiPPO produces an optimal solution to a natural online function approximation problem.
This formal framework yields a new memory update mechanism (HiPPO-LegS) that scales through time to remember all history, avoiding priors on the timescale.
arXiv Detail & Related papers (2020-08-17T23:39:33Z)
- Industrial Forecasting with Exponentially Smoothed Recurrent Neural Networks [0.0]
We present a class of exponentially smoothed recurrent neural networks (RNNs) which are well suited to modeling non-stationary dynamical systems arising in industrial applications.
Application of exponentially smoothed RNNs to forecasting electricity load, weather data, and stock prices highlights the efficacy of exponential smoothing of the hidden state for multi-step time series forecasting (a minimal sketch of this smoothing step follows after this list).
arXiv Detail & Related papers (2020-04-09T17:53:49Z)
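The last entry above centers on exponential smoothing of the RNN hidden state; the sketch below illustrates that single idea with a plain Elman-style recurrence in NumPy. The cell, the smoothing weight alpha, and all dimensions are illustrative assumptions rather than the authors' architecture.
```python
# Hypothetical RNN update with an exponentially smoothed hidden state.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, T, alpha = 3, 8, 50, 0.2    # alpha: weight on the new (raw) state

W_x = rng.normal(scale=0.3, size=(n_hidden, n_in))
W_h = rng.normal(scale=0.3, size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

x_seq = rng.normal(size=(T, n_in))          # toy multivariate time series
h_smooth = np.zeros(n_hidden)               # exponentially smoothed hidden state

for x_t in x_seq:
    h_raw = np.tanh(W_x @ x_t + W_h @ h_smooth + b)       # ordinary Elman RNN update
    h_smooth = alpha * h_raw + (1 - alpha) * h_smooth     # exponential smoothing of the state

print(h_smooth[:4])   # the smoothed state retains a longer memory of past inputs
```
A small alpha keeps more of the past state, which is the intuition behind using the smoothed state for multi-step forecasts of slowly varying, non-stationary series.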
This list is automatically generated from the titles and abstracts of the papers on this site.