Constraining the Reionization History using Bayesian Normalizing Flows
- URL: http://arxiv.org/abs/2005.07694v1
- Date: Thu, 14 May 2020 23:00:55 GMT
- Title: Constraining the Reionization History using Bayesian Normalizing Flows
- Authors: Héctor J. Hortúa, Luigi Malagò, Riccardo Volpi
- Abstract summary: We present the use of Bayesian Neural Networks (BNNs) to predict the posterior distribution for four astrophysical and cosmological parameters.
Besides achieving state-of-the-art prediction performance, the proposed methods provide accurate estimates of parameter uncertainties and infer the correlations among them.
- Score: 10.28074017512078
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The next generation of 21 cm surveys will open a new window onto the early stages of
cosmic structure formation and provide new insights into the Epoch of Reionization (EoR).
However, the non-Gaussian nature of the 21 cm signal, along with the huge amount of data
generated by these surveys, will require more advanced techniques capable of efficiently
extracting the information needed to constrain the Reionization History of the Universe.
In this paper we present the use of Bayesian Neural Networks (BNNs) to predict the posterior
distribution for four astrophysical and cosmological parameters. Besides achieving
state-of-the-art prediction performance, the proposed methods provide accurate estimates of
parameter uncertainties and infer the correlations among them. Additionally, we demonstrate
the advantages of combining Normalizing Flows (NF) with BNNs: the resulting models can
represent more complex output distributions and thus capture key information, such as
non-Gaussianities, in the conditional parameter densities for astrophysical and cosmological
datasets. Finally, we propose novel calibration methods that employ Normalizing Flows after
training to produce reliable predictions, and we demonstrate the advantages of this approach
in terms of both computational cost and prediction performance.
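To make the BNN+NF idea concrete, below is a minimal sketch (not the authors' released code) of a conditional normalizing-flow head on top of a neural summary network, using TensorFlow Probability. The input dimension, layer sizes, and the use of Monte Carlo dropout as a simplified stand-in for full Bayesian layers are assumptions for illustration only.

```python
# Sketch: conditional density p(theta | x) over four parameters via a
# Masked Autoregressive Flow conditioned on features from a summary network.
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

N_PARAMS = 4      # four astrophysical/cosmological parameters (from the abstract)
INPUT_DIM = 128   # assumed size of a pre-computed 21 cm summary vector
CONTEXT_DIM = 64  # assumed size of the learned feature vector

# Feature extractor; keeping dropout active at prediction time is a common,
# simplified surrogate for the epistemic uncertainty a BNN would provide.
summary_net = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu", input_shape=(INPUT_DIM,)),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(CONTEXT_DIM, activation="relu"),
    tf.keras.layers.Dropout(0.2),
])

# Autoregressive network conditioned on the extracted features, so the
# predicted density over the 4 parameters can be non-Gaussian and correlated.
made = tfb.AutoregressiveNetwork(
    params=2, hidden_units=[64, 64], event_shape=[N_PARAMS],
    conditional=True, conditional_event_shape=[CONTEXT_DIM])

flow = tfd.TransformedDistribution(
    distribution=tfd.Sample(tfd.Normal(loc=0.0, scale=1.0),
                            sample_shape=[N_PARAMS]),
    bijector=tfb.MaskedAutoregressiveFlow(made))

def nll(x, theta):
    """Negative log-likelihood of parameters `theta` given summaries `x`."""
    context = summary_net(x, training=True)  # dropout stays on for MC sampling
    return -tf.reduce_mean(
        flow.log_prob(theta, bijector_kwargs={"conditional_input": context}))
```

Training would minimize `nll` over simulated (summary, parameter) pairs; at test time, repeating the forward pass with different dropout masks and sampling from the flow yields an approximate posterior per observation. The paper's calibration step, which applies an additional Normalizing Flow after training, would act on these predicted densities.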
Related papers
- A Likelihood Based Approach to Distribution Regression Using Conditional Deep Generative Models [6.647819824559201]
We study the large-sample properties of a likelihood-based approach for estimating conditional deep generative models.
Our results lead to the convergence rate of a sieve maximum likelihood estimator for estimating the conditional distribution.
arXiv Detail & Related papers (2024-10-02T20:46:21Z) - Flow-Based Generative Emulation of Grids of Stellar Evolutionary Models [4.713280433864737]
We present a flow-based generative approach to emulate grids of stellar evolutionary models.
We demonstrate their ability to emulate a variety of evolutionary tracks and isochrones across a continuous range of input parameters.
arXiv Detail & Related papers (2024-07-12T16:54:17Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z) - A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error for overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by, e.g., the combination of model and parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z) - Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose in this paper to approximate the joint posterior over the structure and parameters of a Bayesian Network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z) - Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z) - Probabilistic forecasting for geosteering in fluvial successions using a generative adversarial network [0.0]
Fast updates based on real-time data are essential when drilling in complex reservoirs with high uncertainties in pre-drill models.
We propose a generative adversarial deep neural network (GAN) trained to reproduce geologically consistent 2D sections of fluvial successions.
In our example, the method reduces uncertainty and correctly predicts most major geological features up to 500 meters ahead of the drill-bit.
arXiv Detail & Related papers (2022-07-04T12:52:38Z) - Data Assimilation Predictive GAN (DA-PredGAN): applied to determine the spread of COVID-19 [0.0]
We propose the novel use of a generative adversarial network (GAN) to make predictions in time (PredGAN) and to assimilate measurements (DA-PredGAN).
GANs have received much attention recently, after achieving excellent results for their generation of realistic-looking images.
arXiv Detail & Related papers (2021-05-17T10:56:53Z) - The Bayesian Method of Tensor Networks [1.7894377200944511]
We study the Bayesian framework of the Tensor Network from two perspectives.
We study the Bayesian properties of the Network by visualizing the parameters of the model and the decision boundaries in the two dimensional synthetic data set.
arXiv Detail & Related papers (2021-01-01T14:59:15Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.