Learning from Topology: Cosmological Parameter Estimation from the Large-scale Structure
- URL: http://arxiv.org/abs/2308.02636v2
- Date: Wed, 04 Jun 2025 23:52:35 GMT
- Title: Learning from Topology: Cosmological Parameter Estimation from the Large-scale Structure
- Authors: Jacky H. T. Yip, Adam Rouhiainen, Gary Shiu
- Abstract summary: We propose a neural network model to map persistence images to cosmological parameters. Our model makes accurate and precise estimates, considerably outperforming conventional Bayesian inference approaches.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The topology of the large-scale structure of the universe contains valuable information on the underlying cosmological parameters. While persistent homology can extract this topological information, the optimal method for parameter estimation from the tool remains an open question. To address this, we propose a neural network model to map persistence images to cosmological parameters. Through a parameter recovery test, we demonstrate that our model makes accurate and precise estimates, considerably outperforming conventional Bayesian inference approaches.
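The pipeline described in the abstract feeds persistence images, i.e. fixed-size rasterizations of persistence diagrams, into a neural network. A minimal sketch of how a persistence diagram becomes such an image is below; the function name, grid parameters, and Gaussian weighting scheme are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def persistence_image(diagram, resolution=8, sigma=0.1, bounds=(0.0, 1.0)):
    """Rasterize a persistence diagram of (birth, death) pairs into a
    persistence image: each point becomes a Gaussian bump in
    (birth, persistence) coordinates, weighted by its persistence."""
    lo, hi = bounds
    grid = np.linspace(lo, hi, resolution)
    bx, py = np.meshgrid(grid, grid)  # birth axis, persistence axis
    img = np.zeros((resolution, resolution))
    for birth, death in diagram:
        pers = death - birth  # lifetime of the topological feature
        img += pers * np.exp(
            -((bx - birth) ** 2 + (py - pers) ** 2) / (2 * sigma ** 2)
        )
    return img

# Toy diagram: two topological features with different lifetimes
diagram = [(0.1, 0.4), (0.2, 0.9)]
img = persistence_image(diagram)

# The flattened image is the fixed-size input a network would regress
# cosmological parameters from
features = img.flatten()
```

The persistence weighting suppresses short-lived (noise-like) features, which is one common convention; real analyses would tune the resolution, kernel width, and weighting to the data.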
Related papers
- Topological Sensing in the Dynamics of Quantum Walks with Defects [4.109098801756255]
We propose a sensing protocol that exploits the dynamics of topological quantum walks incorporating localized defects. By utilizing topologically nontrivial properties of the quantum walks, the sensing precision can approach the Heisenberg limit. Our results show that this approach maintains high precision over a broad range of parameters and exhibits strong robustness against disorder.
arXiv Detail & Related papers (2026-01-07T11:30:07Z) - Bayesian Inference of Primordial Magnetic Field Parameters from CMB with Spherical Graph Neural Networks [0.0]
This paper implements a novel Bayesian graph deep learning framework for estimating key cosmological parameters in a primordial magnetic field (PMF) cosmology directly from simulated Cosmic Microwave Background (CMB) maps. Our methodology utilizes DeepSphere, a spherical convolutional neural network architecture specifically designed to respect the spherical geometry of CMB data through HEALPix pixelization.
arXiv Detail & Related papers (2025-10-23T17:56:04Z) - Topological finite size effect in one-dimensional chiral symmetric systems [0.0]
We propose a new criterion for characterizing finite topological systems based on the bulk conductivity of topological edge modes.
We show that our approach offers practical insights for topology determination in contemporary intermediate scale experimental applications.
arXiv Detail & Related papers (2024-11-26T19:02:39Z) - (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z) - Estimation of spatio-temporal extremes via generative neural networks [0.0]
We provide a unified approach for analyzing spatial extremes with little available data.
By employing recent developments in generative neural networks we predict a full sample-based distribution.
We validate our method by fitting several simulated max-stable processes, showing a high accuracy of the approach.
arXiv Detail & Related papers (2024-07-11T16:57:17Z) - Cosmological Field Emulation and Parameter Inference with Diffusion Models [2.3020018305241337]
We leverage diffusion generative models to address two tasks of importance to cosmology.
We show that the model is able to generate fields with power spectra consistent with those of the simulated target distribution.
We additionally explore their utility as parameter inference models and find that we can obtain tight constraints on cosmological parameters.
arXiv Detail & Related papers (2023-12-12T18:58:42Z) - Should We Learn Most Likely Functions or Parameters? [51.133793272222874]
We investigate the benefits and drawbacks of directly estimating the most likely function implied by the model and the data.
We find that function-space MAP estimation can lead to flatter minima, better generalization, and improved robustness to overfitting.
arXiv Detail & Related papers (2023-11-27T16:39:55Z) - Inferring Structural Parameters of Low-Surface-Brightness-Galaxies with Uncertainty Quantification using Bayesian Neural Networks [70.80563014913676]
We show that a Bayesian Neural Network (BNN) can be used for the inference, with uncertainty, of such parameters from simulated low-surface-brightness galaxy images.
Compared to traditional profile-fitting methods, we show that the uncertainties obtained using BNNs are comparable in magnitude, well-calibrated, and the point estimates of the parameters are closer to the true values.
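The BNN workflow above reduces to a simple recipe at prediction time: draw many samples from the posterior predictive, then report their mean as the point estimate and their spread as the uncertainty. A minimal sketch with synthetic posterior samples (the array shapes, noise level, and calibration check are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior predictive samples: S network draws of a
# structural parameter for each of N simulated galaxy images
S, N = 200, 5
true_params = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
samples = true_params + 0.1 * rng.standard_normal((S, N))

point_estimate = samples.mean(axis=0)  # posterior predictive mean
uncertainty = samples.std(axis=0)      # posterior predictive std

# Coverage of the 1-sigma interval; for a well-calibrated posterior on
# real data this should track the nominal level (this toy posterior is
# centered on the truth, so coverage here is near 100%)
inside = np.abs(point_estimate - true_params) <= uncertainty
coverage = inside.mean()
```

Comparing such empirical coverage against the nominal confidence level is one standard way to check whether BNN uncertainties are well-calibrated.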
arXiv Detail & Related papers (2022-07-07T17:55:26Z) - Constraining cosmological parameters from N-body simulations with Bayesian Neural Networks [0.0]
We use the Quijote simulations to extract the cosmological parameters through Bayesian Neural Networks.
This kind of model has a remarkable ability to estimate the associated uncertainty, which is one of the ultimate goals in the precision cosmology era.
arXiv Detail & Related papers (2021-12-22T13:22:30Z) - Arbitrary Marginal Neural Ratio Estimation for Simulation-based Inference [7.888755225607877]
We present a novel method that enables amortized inference over arbitrary subsets of the parameters, without resorting to numerical integration.
We demonstrate the applicability of the method on parameter inference of binary black hole systems from gravitational wave observations.
arXiv Detail & Related papers (2021-10-01T14:35:46Z) - Post-mortem on a deep learning contest: a Simpson's paradox and the complementary roles of scale metrics versus shape metrics [61.49826776409194]
We analyze a corpus of models made publicly available for a contest to predict the generalization accuracy of neural network (NN) models.
We identify what amounts to a Simpson's paradox, where "scale" metrics perform well overall but poorly on sub-partitions of the data.
We present two novel shape metrics, one data-independent, and the other data-dependent, which can predict trends in the test accuracy of a series of NNs.
arXiv Detail & Related papers (2021-06-01T19:19:49Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - On the Sparsity of Neural Machine Translation Models [65.49762428553345]
We investigate whether redundant parameters can be reused to achieve better performance.
Experiments and analyses are systematically conducted on different datasets and NMT architectures.
arXiv Detail & Related papers (2020-10-06T11:47:20Z) - A Geometric Modeling of Occam's Razor in Deep Learning [8.007631014276896]
Deep neural networks (DNNs) benefit from very high-dimensional parameter spaces.
The contrast between their huge parameter complexity and their stunning practical performance is all the more intriguing and remains unexplained.
We propose a geometrically flavored information-theoretic approach to study this phenomenon.
arXiv Detail & Related papers (2019-05-27T07:57:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.