Advancing GDP Forecasting: The Potential of Machine Learning Techniques in Economic Predictions
- URL: http://arxiv.org/abs/2502.19807v1
- Date: Thu, 27 Feb 2025 06:28:13 GMT
- Title: Advancing GDP Forecasting: The Potential of Machine Learning Techniques in Economic Predictions
- Authors: Bogdan Oancea
- Abstract summary: This paper investigates the efficacy of Recurrent Neural Networks, specifically LSTM networks, in forecasting GDP. We employ the quarterly Romanian GDP dataset from 1995 to 2023 and build an LSTM network to forecast the next 4 values in the series. Our findings suggest that machine learning models consistently outperform traditional econometric models in terms of predictive accuracy and flexibility.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The quest for accurate economic forecasting has traditionally been dominated by econometric models, which most of the time rely on the assumptions of linear relationships and stationarity of the data. However, the complex and often nonlinear nature of global economies necessitates the exploration of alternative approaches. Machine learning methods offer promising advantages over traditional econometric techniques for Gross Domestic Product forecasting, given their ability to model complex, nonlinear interactions and patterns without explicit specification of the underlying relationships. This paper investigates the efficacy of Recurrent Neural Networks, specifically LSTM networks, in forecasting GDP. These models are compared against a traditional econometric method, SARIMA. We employ the quarterly Romanian GDP dataset from 1995 to 2023 and build an LSTM network to forecast the next 4 values in the series. Our findings suggest that machine learning models consistently outperform traditional econometric models in terms of predictive accuracy and flexibility.
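To make the setup concrete, here is a minimal sketch of the kind of pipeline the abstract describes: an LSTM trained on a scaled quarterly series and used recursively to forecast the next 4 values. The synthetic stand-in data, window length, and layer sizes are illustrative assumptions, not the paper's configuration.

```python
# A minimal sketch of the paper's setup, not the authors' code: an LSTM
# forecasting the next 4 quarters of a scaled series. Synthetic data
# stands in for the 1995-2023 quarterly Romanian GDP series.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(0)

# ~116 quarterly observations: trend + seasonal cycle + noise (placeholder).
t = np.arange(116, dtype="float32")
series = 0.5 * t + 10 * np.sin(2 * np.pi * t / 4) + rng.normal(0, 1, 116)

# Min-max scale; neural networks train more stably on normalized inputs.
lo, hi = series.min(), series.max()
scaled = (series - lo) / (hi - lo)

# Supervised windows: 8 past quarters -> next quarter (window is an assumption).
window = 8
X = np.array([scaled[i:i + window] for i in range(len(scaled) - window)])
y = scaled[window:]
X = X[..., None]  # shape (samples, timesteps, features)

model = Sequential([LSTM(32, input_shape=(window, 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=100, batch_size=16, verbose=0)

# Recursive 4-step forecast: feed each prediction back as the newest input.
history = scaled[-window:].tolist()
forecast = []
for _ in range(4):
    x = np.array(history[-window:], dtype="float32")[None, :, None]
    yhat = float(model.predict(x, verbose=0)[0, 0])
    forecast.append(yhat)
    history.append(yhat)

print([round(f * (hi - lo) + lo, 2) for f in forecast])  # back to original scale
```

A SARIMA baseline for the comparison could be fit with statsmodels, e.g. `SARIMAX(series, order=(1, 1, 1), seasonal_order=(1, 1, 1, 4))`; the paper does not state its exact orders here, so these are placeholders.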
Related papers
- A Theoretical Perspective: How to Prevent Model Collapse in Self-consuming Training Loops [55.07063067759609]
High-quality data is essential for training large generative models, yet the vast reservoir of real data available online has become nearly depleted. Models increasingly generate their own data for further training, forming Self-consuming Training Loops (STLs). Some models degrade or even collapse, while others successfully avoid these failures, leaving a significant gap in theoretical understanding.
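As a toy illustration of such a loop (my construction, not the paper's theoretical setup): each generation fits a Gaussian to samples drawn from the previous generation's fit, and with few samples the fitted scale tends to drift toward collapse.

```python
# Toy self-consuming training loop (illustrative only): generation t fits
# a Gaussian to n samples produced by generation t-1's fitted model.
# With small n, the estimated scale tends to shrink across generations.
import numpy as np

rng = np.random.default_rng(0)
n = 25
data = rng.normal(0.0, 1.0, n)  # generation 0 trains on real data

for gen in range(1, 201):
    mu, sigma = data.mean(), data.std()  # "train" this generation's model
    data = rng.normal(mu, sigma, n)      # its samples become the next training set
    if gen % 50 == 0:
        print(f"generation {gen}: sigma = {sigma:.4f}")
```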
arXiv Detail & Related papers (2025-02-26T06:18:13Z)
- Energy Price Modelling: A Comparative Evaluation of four Generations of Forecasting Methods [45.30624270004584]
Energy price forecasting plays an important role in supporting decision-making at various levels.
Given the evolving landscape of forecasting techniques, the literature lacks a thorough empirical comparison.
This paper provides an in-depth review of the evolution of forecasting modeling frameworks.
arXiv Detail & Related papers (2024-11-05T11:45:00Z)
- Machine Learning for Economic Forecasting: An Application to China's GDP Growth [2.899333881379661]
This paper employs various machine learning models to predict the quarterly real GDP growth of China.
It analyzes the factors contributing to the performance differences among these models.
arXiv Detail & Related papers (2024-07-04T03:04:55Z)
- Relational Learning in Pre-Trained Models: A Theory from Hypergraph Recovery Perspective [60.64922606733441]
We introduce a mathematical model that formalizes relational learning as hypergraph recovery to study the pre-training of Foundation Models (FMs).
In our framework, the world is represented as a hypergraph, with data abstracted as random samples from hyperedges. We theoretically examine the feasibility of a Pre-Trained Model (PTM) to recover this hypergraph and analyze the data efficiency in a minimax near-optimal style.
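A toy rendering of that abstraction (a sketch under my own simplifying assumptions, not the paper's minimax analysis): the world is a small hypergraph, the data are random draws of hyperedges, and recovery succeeds once every hyperedge has been observed.

```python
# Toy hypergraph-recovery setting (illustrative assumption): data are
# i.i.d. samples of hyperedges from a fixed "world" hypergraph, and the
# learner recovers the hypergraph by collecting the distinct edges seen.
import random

random.seed(0)
world = [frozenset({"x1", "x2"}),
         frozenset({"x2", "x3", "x4"}),
         frozenset({"x3", "x5"})]  # the true hyperedges

samples = [random.choice(world) for _ in range(100)]  # observed data
recovered = set(samples)

print(recovered == set(world))  # True once every hyperedge has appeared
```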
arXiv Detail & Related papers (2024-06-17T06:20:39Z)
- End-to-End Reinforcement Learning of Koopman Models for Economic Nonlinear Model Predictive Control [45.84205238554709]
We present a method for reinforcement learning of Koopman surrogate models for optimal performance as part of (e)NMPC.
We show that the end-to-end trained models outperform those trained using system identification in (e)NMPC.
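For intuition, here is a minimal Koopman surrogate in the EDMD style (an illustrative baseline fit by least squares; the paper instead trains its surrogates end-to-end with reinforcement learning):

```python
# Minimal Koopman surrogate via EDMD (illustrative, not the paper's
# end-to-end RL training): lift the state with a fixed dictionary and
# fit linear dynamics in the lifted space.
import numpy as np

def lift(x):
    # Dictionary of observables: [x, x^2, 1] (an illustrative choice).
    return np.array([x, x ** 2, 1.0])

# Trajectory data from a toy nonlinear system x' = 0.9x - 0.1x^2.
xs = [0.5]
for _ in range(200):
    xs.append(0.9 * xs[-1] - 0.1 * xs[-1] ** 2)

Phi = np.stack([lift(x) for x in xs[:-1]])          # lifted states
Phi_next = np.stack([lift(x) for x in xs[1:]])      # lifted successors
K, *_ = np.linalg.lstsq(Phi, Phi_next, rcond=None)  # Koopman matrix

# Predict one step through the surrogate: lift, apply K, read out x.
print((lift(0.5) @ K)[0])  # approximates 0.9*0.5 - 0.1*0.25 = 0.425
```

Inside an (e)NMPC loop, it is the linearity of the lifted dynamics that keeps the resulting optimal control problem tractable.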
arXiv Detail & Related papers (2023-08-03T10:21:53Z)
- Dynamic Term Structure Models with Nonlinearities using Gaussian Processes [0.0]
We propose a generalized modelling setup for Gaussian DTSMs which allows for unspanned nonlinear associations between yields and macroeconomic variables.
We construct a custom sequential Monte Carlo estimation and forecasting scheme where we introduce Gaussian Process priors to model nonlinearities.
Unlike for real economic activity, in the case of core inflation we find that, compared to linear models, applying nonlinear models leads to statistically significant gains in economic value across the considered maturities.
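As a generic illustration of that ingredient (not the authors' sequential Monte Carlo scheme), a Gaussian Process prior can capture an unknown nonlinear association that a linear specification would miss:

```python
# Generic GP regression sketch (illustrative; the paper embeds GP priors
# inside a custom sequential Monte Carlo scheme): learn an unknown
# nonlinear link between a factor and a response, with uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, (50, 1))
y = np.sin(2 * x[:, 0]) + rng.normal(0, 0.1, 50)  # nonlinear link + noise

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.01)
gp.fit(x, y)

mean, std = gp.predict(np.array([[1.0]]), return_std=True)
print(mean[0], std[0])  # posterior mean and uncertainty at a new point
```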
arXiv Detail & Related papers (2023-05-18T14:24:17Z)
- Bayesian Neural Networks for Macroeconomic Analysis [0.0]
We develop Bayesian neural networks (BNNs) that are well-suited for handling datasets commonly used for macroeconomic analysis in policy institutions.
Our approach avoids extensive specification searches through a novel mixture specification for the activation function.
We show that our BNNs produce precise density forecasts, typically better than those from other machine learning methods.
arXiv Detail & Related papers (2022-11-09T09:10:57Z)
- Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce the Probabilistic AutoRegressive Neural Network (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z)
- Bayesian Bilinear Neural Network for Predicting the Mid-price Dynamics in Limit-Order Book Markets [84.90242084523565]
Traditional time-series econometric methods often appear incapable of capturing the true complexity of the multi-level interactions driving the price dynamics.
By adopting a state-of-the-art second-order optimization algorithm, we train a Bayesian bilinear neural network with temporal attention.
Using predictive distributions to analyze errors and uncertainties associated with the estimated parameters and model forecasts, we thoroughly compare our Bayesian model with traditional ML alternatives.
arXiv Detail & Related papers (2022-03-07T18:59:54Z)
- A Free Lunch with Influence Functions? Improving Neural Network Estimates with Concepts from Semiparametric Statistics [41.99023989695363]
We explore how semiparametric theory can be used to improve neural networks and machine learning algorithms.
We propose a new neural network method MultiNet, which seeks the flexibility and diversity of an ensemble using a single architecture.
arXiv Detail & Related papers (2022-02-18T09:35:51Z)
- Economic Recession Prediction Using Deep Neural Network [26.504845007567972]
We identify the deep learning methodology of Bi-LSTM with Autoencoder as the most accurate model to forecast the beginning and end of economic recessions in the U.S.
We adopt commonly-available macro and market-condition features to compare the ability of different machine learning models to generate good predictions both in-sample and out-of-sample.
arXiv Detail & Related papers (2021-07-21T22:55:14Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
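The core idea can be illustrated with a toy sampler (my sketch; the paper trains the model with generative temporal-difference learning rather than explicit rollouts): a $\gamma$-model's samples are distributed like states reached after a geometrically distributed number of steps, so a single model covers an infinite probabilistic horizon.

```python
# Toy illustration of what a gamma-model represents (not how it is
# trained): its samples match states reached after Geometric(1 - gamma)
# steps of the true dynamics, here a simple random walk.
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.9

def step(s):
    return s + rng.normal(0.0, 1.0)  # toy one-step environment dynamics

def gamma_model_sample(s):
    horizon = rng.geometric(1.0 - gamma)  # random horizon, mean 1/(1-gamma)
    for _ in range(horizon):
        s = step(s)
    return s

print([round(gamma_model_sample(0.0), 2) for _ in range(5)])
```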
arXiv Detail & Related papers (2020-10-27T17:54:12Z)