Variational Inference for GARCH-family Models
- URL: http://arxiv.org/abs/2310.03435v1
- Date: Thu, 5 Oct 2023 10:21:31 GMT
- Title: Variational Inference for GARCH-family Models
- Authors: Martin Magris, Alexandros Iosifidis
- Abstract summary: Variational Inference is a robust approach for Bayesian inference in machine learning models.
We show that Variational Inference is an attractive, remarkably well-calibrated, and competitive method for Bayesian learning.
- Score: 84.84082555964086
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Bayesian estimation of GARCH-family models has been typically addressed
through Monte Carlo sampling. Variational Inference is gaining popularity and
attention as a robust approach for Bayesian inference in complex machine
learning models; however, its adoption in econometrics and finance is limited.
This paper discusses the extent to which Variational Inference constitutes a
reliable and feasible alternative to Monte Carlo sampling for Bayesian
inference in GARCH-like models. Through a large-scale experiment involving the
constituents of the S&P 500 index, several Variational Inference optimizers, a
variety of volatility models, and a case study, we show that Variational
Inference is an attractive, remarkably well-calibrated, and competitive method
for Bayesian learning.
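To make the paper's setting concrete, here is a minimal sketch of black-box Variational Inference for a GARCH(1,1) model: a mean-field Gaussian over the log-transformed parameters, trained with the score-function ELBO gradient. The simulated data, the N(0,1) priors on the log-parameters, and all tuning constants are assumptions for the example; the paper compares several VI optimizers rather than this plain one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate toy returns from a GARCH(1,1) so the example is self-contained.
T = 1000
omega_t, alpha_t, beta_t = 0.1, 0.1, 0.8
r = np.zeros(T)
s2 = omega_t / (1.0 - alpha_t - beta_t)          # stationary variance
for t in range(T):
    r[t] = rng.normal(0.0, np.sqrt(s2))
    s2 = omega_t + alpha_t * r[t] ** 2 + beta_t * s2

def log_joint(z):
    """GARCH(1,1) log-likelihood plus N(0,1) priors on the log-parameters
    z = (log omega, log alpha, log beta); the priors are an assumption."""
    omega, alpha, beta = np.exp(z)
    s2 = np.var(r)                                # initial conditional variance
    ll = 0.0
    for t in range(T):
        ll += -0.5 * (np.log(2.0 * np.pi * s2) + r[t] ** 2 / s2)
        s2 = omega + alpha * r[t] ** 2 + beta * s2
    return ll - 0.5 * np.sum(z ** 2)

# Mean-field Gaussian q(z), trained with the score-function ELBO gradient.
mu = np.log(np.array([0.1, 0.1, 0.5]))           # start inside the stationary region
log_sd = np.full(3, -2.0)
lr, S = 1e-3, 32
for it in range(300):
    sd = np.exp(log_sd)
    zs = mu + sd * rng.normal(size=(S, 3))
    f = np.array([log_joint(z) for z in zs])
    logq = np.sum(-0.5 * ((zs - mu) / sd) ** 2 - log_sd, axis=1)
    w = f - logq
    w -= w.mean()                                 # baseline for variance reduction
    g_mu = np.mean((zs - mu) / sd ** 2 * w[:, None], axis=0)
    g_ls = np.mean((((zs - mu) / sd) ** 2 - 1.0) * w[:, None], axis=0)
    mu += lr * g_mu
    log_sd += lr * g_ls

print("approx. posterior medians (omega, alpha, beta):", np.exp(mu))
```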
Related papers
- Robust Inference of Dynamic Covariance Using Wishart Processes and Sequential Monte Carlo [2.6347238599620115]
We introduce a Sequential Monte Carlo (SMC) sampler for the Wishart process.
We show that SMC sampling results in the most robust estimates and out-of-sample predictions of dynamic covariance.
We demonstrate the practical applicability of our proposed approach on a dataset of clinical depression.
arXiv Detail & Related papers (2024-06-07T09:48:11Z)
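The entry above names Sequential Monte Carlo as the sampling engine. The toy below shows the SMC mechanic (propagate, weight, resample) as a bootstrap particle filter on a one-dimensional stochastic-volatility-style model; it is not the paper's Wishart-process sampler, and the model and settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model standing in for a dynamic-variance model: latent
# log-variance h_t follows an AR(1); observations y_t ~ N(0, exp(h_t)).
T, phi, q_sd = 200, 0.95, 0.2
h = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    h[t] = phi * h[t - 1] + q_sd * rng.normal()
for t in range(T):
    y[t] = np.exp(0.5 * h[t]) * rng.normal()

N = 1000
particles = rng.normal(0.0, 1.0, size=N)
est = np.zeros(T)
for t in range(T):
    # Propagate through the transition, weight by the observation density.
    particles = phi * particles + q_sd * rng.normal(size=N)
    logw = -0.5 * (np.log(2 * np.pi) + particles + y[t] ** 2 * np.exp(-particles))
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * np.exp(particles))        # filtered variance estimate
    idx = rng.choice(N, size=N, p=w)              # multinomial resampling
    particles = particles[idx]

print("mean filtered variance:", est.mean())
```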
- Bayesian learning of Causal Structure and Mechanisms with GFlowNets and Variational Bayes [51.84122462615402]
We introduce Variational Bayes-DAG-GFlowNet, a novel method for learning the structure and mechanisms of a causal model.
We extend the method of Bayesian causal structure learning using GFlowNets to learn the parameters of a linear-Gaussian model.
arXiv Detail & Related papers (2022-11-04T21:57:39Z)
- Quasi Black-Box Variational Inference with Natural Gradients for Bayesian Learning [84.90242084523565]
We develop an optimization algorithm suitable for Bayesian learning in complex models.
Our approach relies on natural gradient updates within a general black-box framework for efficient training with limited model-specific derivations.
arXiv Detail & Related papers (2022-05-23T18:54:27Z)
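To make the natural-gradient idea in the entry above concrete, here is a one-dimensional sketch: a Gaussian variational family fit to a known Gaussian target by score-function ELBO gradients, preconditioned by the inverse Fisher matrix. The target and step sizes are assumptions; the paper's algorithm works in a general black-box framework rather than this toy.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: an unnormalized Gaussian posterior with a known answer, so the
# fit can be checked. All names and values are illustrative.
post_mean, post_sd = 2.0, 0.5
def log_target(z):
    return -0.5 * ((z - post_mean) / post_sd) ** 2

# Variational family q = N(mu, sd^2) with parameters (mu, log sd).
mu, log_sd = 0.0, 0.0
lr, S = 0.1, 64
for it in range(300):
    sd = np.exp(log_sd)
    z = mu + sd * rng.normal(size=S)
    w = log_target(z) - (-0.5 * ((z - mu) / sd) ** 2 - log_sd)
    w -= w.mean()                                 # baseline
    g_mu = np.mean((z - mu) / sd ** 2 * w)        # Euclidean ELBO gradients
    g_ls = np.mean((((z - mu) / sd) ** 2 - 1.0) * w)
    # Natural-gradient step: the Fisher information of N(mu, sd^2) in
    # (mu, log sd) coordinates is diag(1/sd^2, 2); multiply by its inverse.
    mu += lr * sd ** 2 * g_mu
    log_sd += lr * 0.5 * g_ls

print(f"q mean {mu:.3f} (target {post_mean}), q sd {np.exp(log_sd):.3f} (target {post_sd})")
```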
- A Variational Approach to Bayesian Phylogenetic Inference [7.251627034538359]
We present a variational framework for Bayesian phylogenetic analysis.
We train the variational approximation via stochastic gradient ascent and adopt estimators for continuous and discrete variational parameters.
Experiments on a benchmark of challenging real data phylogenetic inference problems demonstrate the effectiveness and efficiency of our methods.
arXiv Detail & Related papers (2022-04-16T08:23:48Z)
- Recursive Monte Carlo and Variational Inference with Auxiliary Variables [64.25762042361839]
Recursive auxiliary-variable inference (RAVI) is a new framework for exploiting flexible proposals.
RAVI generalizes and unifies several existing methods for inference with expressive families.
We illustrate RAVI's design framework and theorems by using them to analyze and improve upon Salimans et al.'s Markov Chain Variational Inference.
arXiv Detail & Related papers (2022-03-05T23:52:40Z)
- Surrogate Likelihoods for Variational Annealed Importance Sampling [11.144915453864854]
We introduce a surrogate likelihood that can be learned jointly with other variational parameters.
We show that our method performs well in practice and that it is well-suited for black-box inference in probabilistic programming frameworks.
arXiv Detail & Related papers (2021-12-22T19:49:45Z)
- Approximate Bayesian inference from noisy likelihoods with Gaussian process emulated MCMC [0.24275655667345403]
We model the log-likelihood function using a Gaussian process (GP).
The main methodological innovation is to apply this model to emulate the progression that an exact Metropolis-Hastings (MH) sampler would take.
The resulting approximate sampler is conceptually simple and sample-efficient.
arXiv Detail & Related papers (2021-04-08T17:38:02Z)
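A minimal sketch of the emulation idea in the entry above: fit a GP to a handful of noisy log-likelihood evaluations, then run random-walk Metropolis-Hastings against the GP posterior mean. The Gaussian location model, fixed kernel hyperparameters, and flat prior are assumptions; the paper's method additionally emulates the trajectory an exact MH sampler would take and refines the GP adaptively, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in "expensive, noisy" log-likelihood: a Gaussian location model
# with artificial evaluation noise. Model and settings are illustrative.
data = rng.normal(1.5, 1.0, size=50)
def noisy_loglik(theta):
    return -0.5 * np.sum((data - theta) ** 2) + rng.normal(0.0, 0.5)

# 1) Evaluate on a small design and fit a GP emulator (fixed RBF kernel).
X = np.linspace(-2.0, 5.0, 25)
y = np.array([noisy_loglik(x) for x in X])
y_mean = y.mean()
yc = y - y_mean

def kern(a, b, ls=0.8):
    return yc.var() * np.exp(-0.5 * (np.subtract.outer(a, b) / ls) ** 2)

K = kern(X, X) + 0.5 ** 2 * np.eye(len(X))    # known evaluation-noise variance
weights = np.linalg.solve(K, yc)

def emulated_loglik(theta):
    return (kern(np.atleast_1d(theta), X) @ weights).item() + y_mean

# 2) Random-walk Metropolis-Hastings on the emulated log-posterior
#    (flat prior), never calling the expensive likelihood again.
theta = 0.0
lp = emulated_loglik(theta)
chain = []
for it in range(5000):
    prop = theta + 0.3 * rng.normal()
    lp_prop = emulated_loglik(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

print("posterior mean under the emulator:", np.mean(chain[1000:]))
```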
- Scalable Control Variates for Monte Carlo Methods via Stochastic Optimization [62.47170258504037]
This paper presents a framework that encompasses and generalizes existing approaches that use control variates, kernels, and neural networks.
Novel theoretical results are presented to provide insight into the variance reduction that can be achieved, and an empirical assessment, including applications to Bayesian inference, is provided in support.
arXiv Detail & Related papers (2020-06-12T22:03:25Z)
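The entry above concerns variance reduction with control variates. Below is the simplest linear instance on a toy integrand: subtract a zero-mean control g with an estimated variance-minimizing coefficient. The integrand and control are illustrative; the paper's framework learns the control from kernels or neural networks via stochastic optimization.

```python
import numpy as np

rng = np.random.default_rng(4)

# Goal: estimate E[f(X)] for X ~ N(0, 1) with f(x) = exp(x); the truth is
# exp(1/2). The control g(x) = x has known mean 0. Choices are illustrative.
n = 10_000
x = rng.normal(size=n)
f = np.exp(x)
g = x                                         # E[g] = 0 known in closed form

beta = np.cov(f, g)[0, 1] / np.var(g)         # variance-minimizing coefficient
plain = f.mean()
cv = (f - beta * (g - 0.0)).mean()            # subtract control, add back E[g]

print(f"plain MC: {plain:.4f}")
print(f"with CV:  {cv:.4f}  (truth {np.exp(0.5):.4f})")
print(f"variance reduction factor: {np.var(f) / np.var(f - beta * g):.2f}")
```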
- Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z)
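To illustrate the claim in the entry above that decisions should not be read off the variational distribution directly, the toy below compares a plug-in expectation under a deliberately mismatched q against a self-normalized importance-sampling estimate that reweights q's samples toward the true posterior. The densities and the functional f are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy posterior p(z) = N(2, 1) and a mismatched variational fit
# q(z) = N(1.6, 0.8^2). Decision quantity: E_p[f(z)] with f(z) = exp(z).
def logpdf(z, m, s):
    return -0.5 * (((z - m) / s) ** 2 + np.log(2 * np.pi * s ** 2))

S = 100_000
z = 1.6 + 0.8 * rng.normal(size=S)            # samples from q
f = np.exp(z)

plug_in = f.mean()                            # decision straight from q
logw = logpdf(z, 2.0, 1.0) - logpdf(z, 1.6, 0.8)
w = np.exp(logw - logw.max())
w /= w.sum()
snis = np.sum(w * f)                          # reweighted toward p

print(f"truth   : {np.exp(2.0 + 0.5):.3f}")   # E_p[exp(z)] = exp(mu + s^2/2)
print(f"plug-in : {plug_in:.3f}")
print(f"SNIS    : {snis:.3f}")
```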