Macroeconomic forecasting with LSTM and mixed frequency time series data
- URL: http://arxiv.org/abs/2109.13777v1
- Date: Tue, 28 Sep 2021 14:56:37 GMT
- Title: Macroeconomic forecasting with LSTM and mixed frequency time series data
- Authors: Sarun Kamolthip
- Abstract summary: We first present how the conventional LSTM model can be adapted to time series observed at mixed frequencies.
We then adopt the unrestricted MIxed DAta Sampling (U-MIDAS) scheme into the LSTM architecture.
Our proposed model could be very helpful for short-term forecasting in periods of large economic downturns.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper demonstrates the potential of the long short-term memory (LSTM)
network when applied to macroeconomic time series data sampled at different
frequencies. We first present how the conventional LSTM model can be adapted to
time series observed at mixed frequencies when the same mismatch ratio is
applied for all pairs of low-frequency output and higher-frequency variables.
To generalize the LSTM to the case of multiple mismatch ratios, we adopt the
unrestricted MIxed DAta Sampling (U-MIDAS) scheme (Foroni et al., 2015) into the
LSTM architecture. We assess out-of-sample predictive performance via both
Monte Carlo simulations and an empirical application. Our proposed models
outperform the restricted MIDAS model even in a setup favorable to the
MIDAS estimator. For a real-world application, we study forecasting the quarterly
growth rate of Thai real GDP using a vast array of both quarterly and monthly
macroeconomic indicators. Our LSTM with the U-MIDAS scheme easily beats the simple
benchmark AR(1) model at all horizons, but outperforms the strong benchmark
univariate LSTM only at one and six months ahead. Nonetheless, we find that our
proposed model could be very helpful for short-term forecasting in periods of
large economic downturns. Simulation and empirical results seem to support the
use of our proposed LSTM with the U-MIDAS scheme for nowcasting applications.
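The core of the U-MIDAS idea described above is frequency alignment: each higher-frequency series is reshaped so that every low-frequency period carries its within-period observations as separate, unrestricted lag columns, which an LSTM can then consume as ordinary features. The sketch below is a minimal illustration of that data-arrangement step only, not the paper's actual implementation; the function names, the mismatch ratio of 3 (monthly-to-quarterly), and the window length are illustrative assumptions.

```python
import numpy as np

def umidas_align(high_freq, ratio=3):
    """Align a higher-frequency series to the low frequency: each
    low-frequency period gets `ratio` unrestricted lag columns
    (the U-MIDAS idea: no lag-polynomial restriction is imposed)."""
    n_low = len(high_freq) // ratio
    return high_freq[:n_low * ratio].reshape(n_low, ratio)

def lstm_windows(features, target, seq_len=4):
    """Build (sample, time, feature) windows so an LSTM can predict
    the low-frequency target one step ahead from past features."""
    X, y = [], []
    for t in range(seq_len, len(target)):
        X.append(features[t - seq_len:t])
        y.append(target[t])
    return np.stack(X), np.array(y)

# Toy data: 24 months of one monthly indicator, 8 quarters of GDP growth.
monthly = np.arange(24, dtype=float)
gdp = np.random.default_rng(0).normal(size=8)

aligned = umidas_align(monthly)            # shape (8, 3): 3 lag columns per quarter
X, y = lstm_windows(aligned, gdp, seq_len=4)
print(X.shape, y.shape)                    # (4, 4, 3) (4,)
```

The resulting `X` tensor has the (samples, time steps, features) layout that standard LSTM layers expect; series with different mismatch ratios would simply contribute different numbers of lag columns per quarter.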
Related papers
- Margin Matching Preference Optimization: Enhanced Model Alignment with Granular Feedback [64.67540769692074]
Large language models (LLMs) fine-tuned with alignment techniques, such as reinforcement learning from human feedback, have been instrumental in developing some of the most capable AI systems to date.
We introduce an approach called Margin Matching Preference Optimization (MMPO), which incorporates relative quality margins into optimization, leading to improved LLM policies and reward models.
Experiments with both human and AI feedback data demonstrate that MMPO consistently outperforms baseline methods, often by a substantial margin, on popular benchmarks including MT-bench and RewardBench.
arXiv Detail & Related papers (2024-10-04T04:56:11Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Simultaneous Machine Translation with Large Language Models [51.470478122113356]
We investigate the possibility of applying Large Language Models to SimulMT tasks.
We conducted experiments using the Llama2-7b-chat model on nine different languages from the MuST-C dataset.
The results show that the LLM outperforms dedicated MT models in terms of BLEU and LAAL metrics.
arXiv Detail & Related papers (2023-09-13T04:06:47Z) - Score-based Generative Modeling in Latent Space [93.8985523558869]
Score-based generative models (SGMs) have recently demonstrated impressive results in terms of both sample quality and distribution coverage.
Here, we propose the Latent Score-based Generative Model (LSGM), a novel approach that trains SGMs in a latent space.
Moving from data to latent space allows us to train more expressive generative models, apply SGMs to non-continuous data, and learn smoother SGMs in a smaller space.
arXiv Detail & Related papers (2021-06-10T17:26:35Z) - Learning representations with end-to-end models for improved remaining
useful life prognostics [64.80885001058572]
The Remaining Useful Life (RUL) of equipment is defined as the duration between the current time and its failure.
We propose an end-to-end deep learning model based on multi-layer perceptron and long short-term memory layers (LSTM) to predict the RUL.
We will discuss how the proposed end-to-end model is able to achieve such good results and compare it to other deep learning and state-of-the-art methods.
arXiv Detail & Related papers (2021-04-11T16:45:18Z) - Prediction of financial time series using LSTM and data denoising
methods [0.29923891863939933]
This paper proposes an ensemble method based on data denoising methods, including the wavelet transform (WT) and singular spectrum analysis (SSA).
As WT and SSA can extract useful information from the original sequence and avoid overfitting, the hybrid model can better grasp the sequence pattern of the closing price of the DJIA.
arXiv Detail & Related papers (2021-03-05T07:32:36Z) - Time Series Analysis and Forecasting of COVID-19 Cases Using LSTM and
ARIMA Models [4.56877715768796]
Coronavirus disease 2019 (COVID-19) is a global public health crisis that has been declared a pandemic by the World Health Organization.
This study explores the performance of several Long Short-Term Memory (LSTM) models and Auto-Regressive Integrated Moving Average (ARIMA) model in forecasting the number of confirmed COVID-19 cases.
arXiv Detail & Related papers (2020-06-05T20:07:48Z) - Long short-term memory networks and laglasso for bond yield forecasting:
Peeping inside the black box [10.412912723760172]
We conduct the first study of bond yield forecasting using long short-term memory (LSTM) networks.
We calculate the LSTM signals through time, at selected locations in the memory cell, using sequence-to-sequence architectures.
arXiv Detail & Related papers (2020-05-05T14:23:00Z) - A Hybrid Residual Dilated LSTM and Exponential Smoothing Model for
Mid-Term Electric Load Forecasting [1.1602089225841632]
The model combines exponential smoothing (ETS), advanced Long Short-Term Memory (LSTM) and ensembling.
A simulation study performed on the monthly electricity demand time series for 35 European countries confirmed the high performance of the proposed model.
arXiv Detail & Related papers (2020-03-29T10:53:50Z) - High-Accuracy and Low-Latency Speech Recognition with Two-Head
Contextual Layer Trajectory LSTM Model [46.34788932277904]
We improve conventional hybrid LSTM acoustic models for high-accuracy and low-latency automatic speech recognition.
To achieve high accuracy, we use a contextual layer trajectory LSTM (cltLSTM), which decouples the temporal modeling and target classification tasks.
We further improve the training strategy with sequence-level teacher-student learning.
arXiv Detail & Related papers (2020-03-17T00:52:11Z) - A Bayesian Long Short-Term Memory Model for Value at Risk and Expected
Shortfall Joint Forecasting [26.834110647177965]
Value-at-Risk (VaR) and Expected Shortfall (ES) are widely used in the financial sector to measure the market risk and manage the extreme market movement.
Recent link between the quantile score function and the Asymmetric Laplace density has led to a flexible likelihood-based framework for joint modelling of VaR and ES.
We develop a hybrid model that is based on the Asymmetric Laplace quasi-likelihood and employs the Long Short-Term Memory (LSTM) time series modelling technique from Machine Learning to capture efficiently the underlying dynamics of VaR and ES.
arXiv Detail & Related papers (2020-01-23T05:13:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.