Inside the black box: Neural network-based real-time prediction of US recessions
- URL: http://arxiv.org/abs/2310.17571v3
- Date: Thu, 23 May 2024 15:51:59 GMT
- Title: Inside the black box: Neural network-based real-time prediction of US recessions
- Authors: Seulki Chung
- Abstract summary: Long short-term memory (LSTM) and gated recurrent unit (GRU) are used to model US recessions from 1967 to 2021.
The SHAP method delivers key recession indicators, such as the S&P 500 index for short-term forecasting up to 3 months and the term spread for longer-term forecasting up to 12 months.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Long short-term memory (LSTM) and gated recurrent unit (GRU) are used to model US recessions from 1967 to 2021. Their predictive performances are compared to those of the traditional linear models. The out-of-sample performance suggests the application of LSTM and GRU in recession forecasting, especially for longer-term forecasts. The Shapley additive explanations (SHAP) method is applied to both groups of models. The SHAP-based different weight assignments imply the capability of these types of neural networks to capture the business cycle asymmetries and nonlinearities. The SHAP method delivers key recession indicators, such as the S&P 500 index for short-term forecasting up to 3 months and the term spread for longer-term forecasting up to 12 months. These findings are robust against other interpretation methods, such as the local interpretable model-agnostic explanations (LIME) and the marginal effects.
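The paper's interpretation step rests on Shapley attribution. As a toy illustration of that idea only (not the authors' pipeline: the "recession score" below is a hand-written linear model with made-up coefficients and hypothetical indicator names), exact Shapley values can be computed by enumerating all feature coalitions:

```python
from itertools import combinations
from math import factorial

# Illustrative linear "recession score" over three indicators.
# Coefficients and baselines are invented for the sketch.
BASELINE = {"sp500_return": 0.0, "term_spread": 0.0, "unemployment": 0.0}
WEIGHTS  = {"sp500_return": -0.8, "term_spread": -0.5, "unemployment": 0.6}

def model(x):
    return sum(WEIGHTS[k] * v for k, v in x.items())

def shapley_values(x, baseline):
    """Exact Shapley values by enumerating every feature coalition."""
    features = list(x)
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for r in range(n):
            for coal in combinations(others, r):
                # Evaluate the model with the coalition present,
                # with and without feature f; absent features are
                # replaced by their baseline values.
                with_f = {k: (x[k] if k in coal or k == f else baseline[k])
                          for k in features}
                without_f = {k: (x[k] if k in coal else baseline[k])
                             for k in features}
                weight = factorial(r) * factorial(n - r - 1) / factorial(n)
                total += weight * (model(with_f) - model(without_f))
        phi[f] = total
    return phi

x = {"sp500_return": -0.10, "term_spread": -0.3, "unemployment": 1.2}
phi = shapley_values(x, BASELINE)
# For a linear model, each Shapley value reduces to weight * (x - baseline),
# and the values sum to model(x) - model(baseline) (the efficiency property).
```

For the neural models in the paper this exact enumeration is intractable, which is why approximations such as the SHAP package are used; the mechanics of the attribution are the same.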
Related papers
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empower a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z) - Counterfactual Explanations for Time Series Forecasting [14.03870816983583]
We formulate the novel problem of counterfactual generation for time series forecasting, and propose an algorithm, called ForecastCF.
ForecastCF solves the problem by applying gradient-based perturbations to the original time series.
Our results show that ForecastCF outperforms the baseline in terms of counterfactual validity and data manifold closeness.
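The gradient-based perturbation idea behind ForecastCF can be sketched in miniature. The toy below is an assumption-laden stand-in, not the paper's algorithm: it uses a hypothetical 3-point moving-average forecaster, a finite-difference gradient, and invented constants, and it nudges the input series until the one-step forecast reaches a desired lower bound:

```python
def forecast(series):
    # Hypothetical stand-in forecaster: 3-point moving average.
    return sum(series[-3:]) / 3.0

def counterfactual(series, target_lo, lr=0.5, steps=200, eps=1e-4):
    """Perturb the input series until forecast(series) >= target_lo."""
    x = list(series)
    for _ in range(steps):
        if forecast(x) >= target_lo:
            break
        for i in range(len(x)):
            # Finite-difference gradient of the forecast w.r.t. point i.
            x_hi = x.copy()
            x_hi[i] += eps
            grad = (forecast(x_hi) - forecast(x)) / eps
            x[i] += lr * grad  # gradient ascent toward the target band
    return x

series = [1.0, 1.1, 0.9, 1.0, 0.8]
cf = counterfactual(series, target_lo=1.5)
# Only the last three points (those the forecaster depends on) move,
# which keeps the counterfactual close to the original series.
```

The real method differs in the details (autodiff gradients, a trained forecaster, a target trajectory band rather than a scalar bound), but the perturb-until-the-forecast-complies loop is the shared core.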
arXiv Detail & Related papers (2023-10-12T08:51:59Z) - Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z) - Forecasting inflation using disaggregates and machine learning [0.0]
We consider different disaggregation levels for inflation and employ a range of traditional time series techniques as well as linear and nonlinear machine learning (ML) models to deal with a larger number of predictors.
For many forecast horizons, the aggregation of disaggregated forecasts performs just as well as survey-based expectations and as models that generate forecasts using the aggregate directly.
Our results reinforce the benefits of using models in a data-rich environment for inflation forecasting, including aggregating disaggregated forecasts from ML techniques.
arXiv Detail & Related papers (2023-08-22T04:01:40Z) - When in Doubt: Neural Non-Parametric Uncertainty Quantification for Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z) - Learning representations with end-to-end models for improved remaining useful life prognostics [64.80885001058572]
The Remaining Useful Life (RUL) of equipment is defined as the duration between the current time and its failure.
We propose an end-to-end deep learning model based on multi-layer perceptron and long short-term memory layers (LSTM) to predict the RUL.
We will discuss how the proposed end-to-end model is able to achieve such good results and compare it to other deep learning and state-of-the-art methods.
arXiv Detail & Related papers (2021-04-11T16:45:18Z) - Predicting Inflation with Recurrent Neural Networks [0.0]
This paper applies a recurrent neural network, the LSTM, to forecast inflation.
Results from an exercise with US data indicate that the estimated neural nets present competitive, but not outstanding, performance against common benchmarks.
arXiv Detail & Related papers (2021-04-08T13:19:26Z) - Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
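The state-space structure that entry describes can be sketched with a toy example. Everything below is illustrative: the transition and emission functions are hand-written nonlinearities standing in for the paper's neural networks, and the probabilistic forecast is produced by plain Monte Carlo rollouts:

```python
import math
import random
import statistics

def transition(z):
    # Latent state update: a contracting nonlinearity plus process noise.
    # (In a deep state space model this would be a learned network.)
    return 0.9 * math.tanh(z) + random.gauss(0.0, 0.1)

def emission(z):
    # Observation model: latent state plus observation noise.
    return z + random.gauss(0.0, 0.05)

def forecast_distribution(z0, horizon=5, n_samples=1000):
    """Monte Carlo forecast: roll the latent state forward and emit."""
    samples = []
    for _ in range(n_samples):
        z = z0
        for _ in range(horizon):
            z = transition(z)
        samples.append(emission(z))
    return samples

random.seed(0)
samples = forecast_distribution(z0=1.0, horizon=5)
mean = statistics.mean(samples)
spread = statistics.stdev(samples)
```

The sample mean and spread together give the "accurate and sharp" probabilistic forecast the entry refers to: the mean tracks the point prediction while the spread quantifies uncertainty.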
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Forecasting Commodity Prices Using Long Short-Term Memory Neural Networks [0.0]
This paper applies a recurrent neural network (RNN) method to forecast cotton and oil prices.
We show that machine learning methods fit the data reasonably well but do not systematically outperform classical methods.
arXiv Detail & Related papers (2021-01-08T16:28:19Z) - Deep Stock Predictions [58.720142291102135]
We consider the design of a trading strategy that performs portfolio optimization using Long Short Term Memory (LSTM) neural networks.
We then customize the loss function used to train the LSTM to increase the profit earned.
We find the LSTM model with the customized loss function to have improved performance in the trading strategy over a regression baseline such as ARIMA.
arXiv Detail & Related papers (2020-06-08T23:37:47Z) - Spatiotemporal Adaptive Neural Network for Long-term Forecasting of Financial Time Series [0.2793095554369281]
We investigate whether deep neural networks (DNNs) can be used to forecast multiple time series (TS) conjointly.
We make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model.
With ACTM, it is possible to vary the autoregressive order of a TS model over time and model a larger set of probability distributions.
arXiv Detail & Related papers (2020-03-27T00:53:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.