Approximate Bayesian inference and forecasting in huge-dimensional
multi-country VARs
- URL: http://arxiv.org/abs/2103.04944v1
- Date: Mon, 8 Mar 2021 18:02:59 GMT
- Title: Approximate Bayesian inference and forecasting in huge-dimensional
multi-country VARs
- Authors: Martin Feldkircher, Florian Huber, Gary Koop, Michael Pfarrhofer
- Abstract summary: The Panel Vector Autoregressive model is a popular tool for macroeconomic forecasting and structural analysis.
It allows for spillovers between countries in a very flexible fashion.
The number of parameters to be estimated can be enormous, leading to over-parameterization concerns.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Panel Vector Autoregressive (PVAR) model is a popular tool for
macroeconomic forecasting and structural analysis in multi-country applications
since it allows for spillovers between countries in a very flexible fashion.
However, this flexibility means that the number of parameters to be estimated
can be enormous, leading to over-parameterization concerns. Bayesian
global-local shrinkage priors, such as the Horseshoe prior used in this paper,
can overcome these concerns, but they require the use of Markov Chain Monte
Carlo (MCMC) methods, rendering them computationally infeasible in high
dimensions. In this paper, we develop computationally efficient Bayesian
methods for estimating PVARs using an integrated rotated Gaussian approximation
(IRGA). This exploits the fact that whereas own country information is often
important in PVARs, information on other countries is often unimportant. Using
an IRGA, we split the posterior into two parts: one involving own country
coefficients, the other involving other country coefficients. Fast methods such
as approximate message passing or variational Bayes can be used on the latter
and, conditional on these, the former are estimated with precision using MCMC
methods. In a forecasting exercise involving PVARs with up to $18$ variables
for each of $38$ countries, we demonstrate that our methods produce good
forecasts quickly.
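As a rough illustration of the splitting idea described in the abstract, the sketch below applies the rotation to a single simulated regression equation rather than a full PVAR. It uses simplifying assumptions throughout: Gaussian ridge-type priors instead of the Horseshoe, a known error variance, and closed-form conjugate Gaussian updates in place of both the message-passing/variational step and the MCMC step. All variable names and prior variances (X_own, X_other, tau2_other, tau2_own) are hypothetical choices for the example, not taken from the paper.
```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single equation with a small own-country block and a much larger
# other-country block of regressors (sizes and priors are illustrative only).
T, k_own, k_other = 200, 4, 40
X_own = rng.standard_normal((T, k_own))
X_other = rng.standard_normal((T, k_other))
beta_own_true = rng.standard_normal(k_own)
beta_other_true = 0.1 * rng.standard_normal(k_other)   # weak "other country" effects
sigma = 1.0
y = X_own @ beta_own_true + X_other @ beta_other_true + sigma * rng.standard_normal(T)

# Step 1: rotate with the orthogonal Q from a complete QR decomposition of X_own.
# The last T - k_own rotated rows are orthogonal to X_own, so they involve only
# the other-country coefficients.
Q, _ = np.linalg.qr(X_own, mode="complete")
Q1, Q2 = Q[:, :k_own], Q[:, k_own:]

# Step 2: fast Gaussian (ridge-type) approximation for the other-country block
# from the rotated lower equations; the paper uses message passing or VB here.
A, b = Q2.T @ X_other, Q2.T @ y
tau2_other = 0.1                        # tight shrinkage on other-country coefficients
V_other = np.linalg.inv(A.T @ A / sigma**2 + np.eye(k_other) / tau2_other)
m_other = V_other @ (A.T @ b) / sigma**2

# Step 3: conditional on that approximation, update the own-country block from the
# upper rotated equations; the paper runs MCMC under the Horseshoe at this stage.
B, c = Q1.T @ X_own, Q1.T @ (y - X_other @ m_other)
tau2_own = 10.0                         # loose prior on own-country coefficients
V_own = np.linalg.inv(B.T @ B / sigma**2 + np.eye(k_own) / tau2_own)
m_own = V_own @ (B.T @ c) / sigma**2

print("approximate own-country posterior mean:", np.round(m_own, 2))
print("true own-country coefficients:         ", np.round(beta_own_true, 2))
```
Because Q2 is orthogonal to X_own, the lower block of rotated equations depends only on the other-country coefficients, which is what allows the fast approximation in Step 2 to proceed without touching the own-country block.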
Related papers
- Sample-efficient Learning of Infinite-horizon Average-reward MDPs with General Function Approximation [53.17668583030862]
We study infinite-horizon average-reward Markov decision processes (AMDPs) in the context of general function approximation.
We propose a novel algorithmic framework named Local-fitted Optimization with OPtimism (LOOP).
We show that LOOP achieves a sublinear $\tilde{\mathcal{O}}(\mathrm{poly}(d, \mathrm{sp}(V^*))\sqrt{T\beta})$ regret, where $d$ and $\beta$ correspond to the AGEC and the log-covering number of the hypothesis class, respectively.
arXiv Detail & Related papers (2024-04-19T06:24:22Z) - Nowcasting with Mixed Frequency Data Using Gaussian Processes [0.0]
We develop machine learning methods for mixed data sampling (MIDAS) regressions.
We use Gaussian processes (GPs) and compress the input space with structured and unstructured MIDAS variants.
arXiv Detail & Related papers (2024-02-16T11:03:07Z) - Calibrated One Round Federated Learning with Bayesian Inference in the
Predictive Space [27.259110269667826]
Federated Learning (FL) involves training a model over a dataset distributed among clients.
Small and noisy datasets are common, highlighting the need for well-calibrated models.
We propose $\beta$-Predictive Bayes, a Bayesian FL algorithm that interpolates between a mixture and a product of the predictive posteriors.
arXiv Detail & Related papers (2023-12-15T14:17:16Z) - MuyGPs: Scalable Gaussian Process Hyperparameter Estimation Using Local
Cross-Validation [1.2233362977312945]
We present MuyGPs, a novel, efficient GP hyperparameter estimation method.
MuyGPs builds upon prior methods that take advantage of the nearest neighbors structure of the data.
We show that our method outperforms all known competitors both in terms of time-to-solution and the root mean squared error of the predictions.
arXiv Detail & Related papers (2021-04-29T18:10:21Z) - SLOE: A Faster Method for Statistical Inference in High-Dimensional
Logistic Regression [68.66245730450915]
We develop an improved method for debiasing predictions and estimating frequentist uncertainty for practical datasets.
Our main contribution is SLOE, an estimator of the signal strength with convergence guarantees that reduces the computation time of estimation and inference by orders of magnitude.
arXiv Detail & Related papers (2021-03-23T17:48:56Z) - A Bayesian Federated Learning Framework with Online Laplace
Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
arXiv Detail & Related papers (2021-02-03T08:36:58Z) - Efficient semidefinite-programming-based inference for binary and
multi-class MRFs [83.09715052229782]
We propose an efficient method for computing the partition function or MAP estimate in a pairwise MRF.
We extend semidefinite relaxations from the typical binary MRF to the full multi-class setting, and develop a compact semidefinite relaxation that can again be solved efficiently with the same solver.
arXiv Detail & Related papers (2020-12-04T15:36:29Z) - Fast Bayesian Estimation of Spatial Count Data Models [0.0]
We introduce Variational Bayes (VB) as an optimisation problem instead of a simulation problem.
A VB method is derived for posterior inference in negative binomial models with unobserved parameter heterogeneity and spatial dependence.
The VB approach is around 45 to 50 times faster than MCMC on a regular eight-core processor in a simulation and an empirical study.
arXiv Detail & Related papers (2020-07-07T10:24:45Z) - High-dimensional macroeconomic forecasting using message passing
algorithms [0.0]
Inference in this specification proceeds using Bayesian hierarchical priors that shrink the high-dimensional vector of coefficients.
A Generalized Approximate Message Passing (GAMP) algorithm is derived that has low algorithmic complexity and is trivially parallelizable.
arXiv Detail & Related papers (2020-04-23T23:10:04Z) - Fast and Robust Comparison of Probability Measures in Heterogeneous
Spaces [62.35667646858558]
We introduce the Anchor Energy (AE) and Anchor Wasserstein (AW) distances, which are respectively the energy and Wasserstein distances instantiated on such representations.
Our main contribution is to propose a sweep line algorithm to compute AE exactly in log-quadratic time, where a naive implementation would be cubic.
We show that AE and AW perform well in various experimental settings at a fraction of the computational cost of popular GW approximations.
arXiv Detail & Related papers (2020-02-05T03:09:23Z) - Parameter Space Factorization for Zero-Shot Learning across Tasks and
Languages [112.65994041398481]
We propose a Bayesian generative model for the space of neural parameters.
We infer the posteriors over such latent variables based on data from seen task-language combinations.
Our model yields comparable or better results than state-of-the-art, zero-shot cross-lingual transfer methods.
arXiv Detail & Related papers (2020-01-30T16:58:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.