Probabilistic Mass Mapping with Neural Score Estimation
- URL: http://arxiv.org/abs/2201.05561v2
- Date: Mon, 17 Jan 2022 13:47:17 GMT
- Title: Probabilistic Mass Mapping with Neural Score Estimation
- Authors: Benjamin Remy, Francois Lanusse, Niall Jeffrey, Jia Liu, Jean-Luc
Starck, Ken Osato, Tim Schrabback
- Abstract summary: We introduce a novel methodology for efficient sampling of the high-dimensional Bayesian posterior of the weak lensing mass-mapping problem.
We aim to demonstrate the accuracy of the method on simulations, and then proceed to apply it to the mass reconstruction of the HST/ACS COSMOS field.
- Score: 4.079848600120986
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Weak lensing mass-mapping is a useful tool to access the full distribution of
dark matter on the sky, but because of intrinsic galaxy ellipticities and finite
fields/missing data, the recovery of dark matter maps constitutes a challenging
ill-posed inverse problem. We introduce a novel methodology for efficient
sampling of the high-dimensional Bayesian posterior of the weak lensing
mass-mapping problem that relies on simulations to define a fully non-Gaussian
prior. We demonstrate the accuracy of the method on simulations and then apply
it to the mass reconstruction of the HST/ACS COSMOS field. The proposed
methodology combines elements of Bayesian
statistics, analytic theory, and a recent class of Deep Generative Models based
on Neural Score Matching. This approach allows us to do the following: 1) Make
full use of analytic cosmological theory to constrain the 2pt statistics of the
solution. 2) Learn from cosmological simulations any differences between this
analytic prior and full simulations. 3) Obtain samples from the full Bayesian
posterior of the problem for robust Uncertainty Quantification. We demonstrate
the method on the $\kappa$TNG simulations and find that the posterior mean
significantly outperforms previous methods (Kaiser-Squires, Wiener filter,
sparsity priors) in both root-mean-square error and Pearson
correlation. We further illustrate the interpretability of the recovered
posterior by establishing a close correlation between posterior convergence
values and SNR of clusters artificially introduced into a field. Finally, we
apply the method to the reconstruction of the HST/ACS COSMOS field, yielding
the highest-quality convergence map of this field to date.
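The sampling strategy described in the abstract (an analytic likelihood score combined with a prior score, iterated with Langevin dynamics) can be sketched on a toy problem. Everything below is a hypothetical simplification: a 1-D Gaussian prior stands in for the learned Neural Score Matching prior, and the data model is plain additive Gaussian noise, so this is an illustration of the sampling idea, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (hypothetical): a 1-D "convergence map" observed with Gaussian
# noise of std sigma_n; the learned score network is replaced by the analytic
# score of a Gaussian prior with std sigma_p, purely for illustration.
n, sigma_n, sigma_p = 64, 0.3, 1.0
kappa_true = rng.normal(0.0, sigma_p, size=n)
data = kappa_true + rng.normal(0.0, sigma_n, size=n)

def score_prior(x):
    # Gaussian prior score: grad log p(x) = -x / sigma_p^2
    return -x / sigma_p**2

def score_likelihood(x):
    # Gaussian likelihood score: grad log p(d|x) = (d - x) / sigma_n^2
    return (data - x) / sigma_n**2

def langevin_sample(steps=2000, eps=1e-3):
    # Unadjusted Langevin dynamics on the posterior score.
    x = rng.normal(size=n)
    for _ in range(steps):
        s = score_prior(x) + score_likelihood(x)   # posterior score
        x = x + 0.5 * eps * s + np.sqrt(eps) * rng.normal(size=n)
    return x

# Independent chains give posterior samples; their mean is the point estimate.
samples = np.stack([langevin_sample() for _ in range(8)])
post_mean = samples.mean(axis=0)
```

For this conjugate toy case the exact posterior mean is the Wiener-filter shrinkage `data * sigma_p**2 / (sigma_p**2 + sigma_n**2)`, which the Langevin chain mean approaches; the point of the paper's method is that the learned prior score removes the Gaussianity restriction.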
Related papers
- A sparse PAC-Bayesian approach for high-dimensional quantile prediction [0.0]
This paper presents a novel probabilistic machine learning approach for high-dimensional quantile prediction.
It uses a pseudo-Bayesian framework with a scaled Student-t prior and Langevin Monte Carlo for efficient computation.
Its effectiveness is validated through simulations and real-world data, where it performs competitively against established frequentist and Bayesian techniques.
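The objective underlying quantile prediction is the pinball (check) loss, whose minimiser is the target quantile. A minimal sketch of that objective, assuming nothing from the paper beyond it (no scaled Student-t prior, no Langevin Monte Carlo):

```python
import numpy as np

rng = np.random.default_rng(3)

# Pinball loss for quantile level tau: its population minimiser over constants
# is the tau-quantile of the data distribution.
tau = 0.9
y = rng.exponential(1.0, size=50_000)   # Exp(1): true 0.9-quantile is ln(10)

def pinball(q, y, tau):
    r = y - q
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

# Simple grid minimisation recovers the empirical 0.9-quantile.
grid = np.linspace(0.0, 6.0, 2001)
q_hat = grid[np.argmin([pinball(q, y, tau) for q in grid])]
```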
arXiv Detail & Related papers (2024-09-03T08:01:01Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
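For intuition on circle-valued modelling, the maximum-likelihood mean direction of von Mises data is the angle of the resultant of the unit vectors on the circle. A minimal circular-statistics sketch, unrelated to the paper's quasi-process construction:

```python
import numpy as np

rng = np.random.default_rng(4)

# Draw angles from a von Mises distribution with mean direction 1.0 rad.
mu_true = 1.0
theta = rng.vonmises(mu_true, kappa=4.0, size=20_000)

# MLE of the mean direction: angle of the mean unit vector.
C, S = np.cos(theta).mean(), np.sin(theta).mean()
mu_hat = np.arctan2(S, C)
R = np.hypot(C, S)   # mean resultant length; grows with concentration kappa
```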
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Mixed Matrix Completion in Complex Survey Sampling under Heterogeneous Missingness [6.278498348219109]
We propose a fast and scalable estimation algorithm that achieves sublinear convergence.
The proposed method is applied to analyze the National Health and Nutrition Examination Survey data.
arXiv Detail & Related papers (2024-02-06T12:26:58Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a sample denoised by Tweedie's formula forms a Krylov subspace, then conjugate gradient (CG) initialized with the denoised data keeps the data-consistency update within that tangent space.
Our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
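The data-consistency step can be illustrated with a plain conjugate-gradient solve of a toy linear inverse problem. The "denoised sample" that Tweedie's formula would provide from a diffusion model is mocked here with a noisy guess, and none of the paper's diffusion machinery is reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear inverse problem y = A x + noise (hypothetical stand-in for the
# paper's large-scale setting).
m, n = 40, 20
A = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
y = A @ x_true + 0.01 * rng.normal(size=m)

def cg(matvec, b, x0, iters=50, tol=1e-10):
    # Plain conjugate gradient on the normal equations A^T A x = A^T y,
    # started from the (mock) denoised estimate.
    x, r = x0.copy(), b - matvec(x0)
    p, rs = r.copy(), r @ r
    for _ in range(iters):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

denoised = x_true + 0.5 * rng.normal(size=n)   # mock Tweedie estimate
x_dc = cg(lambda v: A.T @ (A @ v), A.T @ y, denoised)
```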
arXiv Detail & Related papers (2023-03-10T07:42:49Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
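The core of a Laplace approximation, which VLAEs build on, is a Gaussian centred at the posterior mode with covariance from the local curvature. A one-dimensional sketch on a hypothetical log-posterior, unrelated to any autoencoder:

```python
import numpy as np

# Hypothetical unnormalised negative log-posterior: Gaussian-ish with a
# quartic term, so the exact posterior is non-Gaussian.
def neg_log_post(z):
    return 0.5 * (z - 1.0) ** 2 + 0.1 * z ** 4

def grad(z, h=1e-5):
    # Central-difference first derivative.
    return (neg_log_post(z + h) - neg_log_post(z - h)) / (2 * h)

def hess(z, h=1e-4):
    # Central-difference second derivative.
    return (neg_log_post(z + h) - 2 * neg_log_post(z) + neg_log_post(z - h)) / h**2

# Newton iterations locate the mode.
z = 0.0
for _ in range(50):
    z -= grad(z) / hess(z)

# Laplace approximation: N(mu, var) with variance the inverse curvature.
mu, var = z, 1.0 / hess(z)
```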
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
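The likelihood-free inference in the paper is neural; the simplest scheme in the same family, rejection ABC, shows the shared idea (inference through a simulator only, never an explicit likelihood) on a toy Gaussian model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical forward model: data are N(theta, 1) draws, summarised by
# their mean; the likelihood is never written down.
def simulator(theta, n=100):
    return rng.normal(theta, 1.0, size=n).mean()

theta_true = 1.5
obs = simulator(theta_true)

# Rejection ABC: draw from the prior, simulate, keep parameters whose
# simulated summary lands close to the observed one.
prior_draws = rng.uniform(-5, 5, size=20_000)
summaries = np.array([simulator(t) for t in prior_draws])
accepted = prior_draws[np.abs(summaries - obs) < 0.05]
posterior_mean = accepted.mean()
```

The accepted draws approximate the posterior; neural LFI methods replace the wasteful rejection step with a learned likelihood-ratio or posterior estimator.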
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- A survey on Bayesian inference for Gaussian mixture model [3.109306676759862]
The aim of this survey is to give a self-contained introduction to concepts and mathematical tools in Bayesian inference for finite and infinite Gaussian mixture models.
Other than this modest background, the development is self-contained, with rigorous proofs provided throughout.
arXiv Detail & Related papers (2021-08-20T13:23:17Z)
- Posterior-Aided Regularization for Likelihood-Free Inference [23.708122045184698]
Posterior-Aided Regularization (PAR) is applicable to learning the density estimator, regardless of the model structure.
We provide a unified estimation method for PAR that estimates both the reverse KL term and the mutual information term with a single neural network.
arXiv Detail & Related papers (2021-02-15T16:59:30Z)
- Towards constraining warm dark matter with stellar streams through neural simulation-based inference [7.608718235345664]
We introduce a likelihood-free Bayesian inference pipeline based on Amortised Approximate Likelihood Ratios (AALR).
We apply the method to the simplified case where stellar streams are only perturbed by dark matter subhaloes.
arXiv Detail & Related papers (2020-11-30T15:53:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.