Addressing prior dependence in hierarchical Bayesian modeling for PTA data analysis I: Methodology and implementation
- URL: http://arxiv.org/abs/2511.03667v1
- Date: Wed, 05 Nov 2025 17:33:44 GMT
- Title: Addressing prior dependence in hierarchical Bayesian modeling for PTA data analysis I: Methodology and implementation
- Authors: Luigi D'amico, Eleonora Villa, Fatima Modica Bittordo, Aldo Barca, Francesco Alì, Massimo Meneghetti, Luca Naso,
- Abstract summary: Complex inference tasks, such as those encountered in Pulsar Timing Array (PTA) data analysis, rely on Bayesian frameworks. The high-dimensional parameter space and the strong interdependencies among astrophysical, pulsar noise, and nuisance parameters introduce significant challenges for efficient learning and robust inference. We address these issues in the framework of hierarchical Bayesian modeling by introducing a reparameterization strategy.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Complex inference tasks, such as those encountered in Pulsar Timing Array (PTA) data analysis, rely on Bayesian frameworks. The high-dimensional parameter space and the strong interdependencies among astrophysical, pulsar noise, and nuisance parameters introduce significant challenges for efficient learning and robust inference. These challenges are emblematic of broader issues in decision science, where model over-parameterization and prior sensitivity can compromise both computational tractability and the reliability of the results. We address these issues in the framework of hierarchical Bayesian modeling by introducing a reparameterization strategy. Our approach employs Normalizing Flows (NFs) to decorrelate the parameters governing hierarchical priors from those of astrophysical interest. The use of NF-based mappings provides both the flexibility to realize the reparametrization and the tractability to preserve proper probability densities. We further adopt i-nessai, a flow-guided nested sampler, to accelerate exploration of complex posteriors. This unified use of NFs improves statistical robustness and computational efficiency, providing a principled methodology for addressing hierarchical Bayesian inference in PTA analysis.
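The core idea of the reparameterization, preserving a proper probability density under an invertible map via the change-of-variables formula, can be illustrated with a minimal sketch. The elementwise affine map below is a hypothetical stand-in for a trained normalizing flow; the function names, the standard-normal base density, and the parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def affine_flow(z, scale, shift):
    """Invertible map x = scale * z + shift, a stand-in for a learned NF layer."""
    x = scale * z + shift
    # log|det J| of an elementwise affine map is sum(log|scale_i|)
    log_det_jac = np.sum(np.log(np.abs(scale)))
    return x, log_det_jac

def log_prob_reparam(z, base_log_prob, scale, shift):
    """Density of x = f(z) under the change of variables:
       log p_x(f(z)) = log p_z(z) - log|det J_f(z)|."""
    _, log_det = affine_flow(z, scale, shift)
    return base_log_prob(z) - log_det

# Standard-normal base density over the decorrelated coordinates
def std_normal(z):
    return -0.5 * np.sum(z**2) - 0.5 * len(z) * np.log(2 * np.pi)

z = np.zeros(3)
lp = log_prob_reparam(z, std_normal,
                      scale=np.array([2.0, 1.0, 0.5]), shift=np.zeros(3))
```

A learned flow would replace `affine_flow` with a composition of invertible layers, but the bookkeeping, subtracting the log-determinant of the Jacobian so the transformed density stays properly normalized, is the same.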
Related papers
- Towards A Unified PAC-Bayesian Framework for Norm-based Generalization Bounds [63.47271262149291]
We propose a unified framework for PAC-Bayesian norm-based generalization. The key to our approach is a sensitivity matrix that quantifies the sensitivity of the network outputs with respect to structured weight perturbations. We derive a family of generalization bounds that recover several existing PAC-Bayesian results as special cases.
arXiv Detail & Related papers (2026-01-13T00:42:22Z)
- Addressing prior dependence in hierarchical Bayesian modeling for PTA data analysis II: Noise and SGWB inference through parameter decorrelation [0.0]
PTA analyses assign fixed uniform noise priors to each pulsar, an approach that can introduce systematic biases when combining the array. We adopt a hierarchical Bayesian modeling strategy in which noise priors are parametrized by higher-level hyperparameters. We show that the hierarchical treatment constrains the noise parameters more tightly and partially alleviates the red-noise-SGWB degeneracy.
arXiv Detail & Related papers (2025-11-03T17:54:55Z)
- Deep operator network for surrogate modeling of poroelasticity with random permeability fields [3.7214007898390196]
Poroelasticity, the coupling of fluid flow and elastic deformation in porous media, often involves spatially variable permeability. In this study, we propose a surrogate modeling framework based on the deep operator network (DeepONet), a neural architecture designed to learn mappings between infinite-dimensional function spaces. To enhance predictive accuracy and stability, we integrate three strategies: nondimensionalization of the governing equations, input dimensionality reduction via Karhunen-Loève expansion, and a two-step training procedure that decouples the optimization of branch and trunk networks.
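At its core, the DeepONet architecture mentioned above reduces to an inner product between a branch network, which encodes the input function sampled at sensor points, and a trunk network, which encodes the query coordinate. The sketch below is a hypothetical, untrained NumPy illustration; the layer sizes, sensor count, and random initialization are assumptions for demonstration only, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights):
    """Tiny MLP with tanh hidden layers; weights is a list of (W, b) pairs."""
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def init_mlp(sizes):
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

# Branch net encodes the input function u sampled at m sensor points;
# trunk net encodes the query coordinate y. Both output p latent features.
m, p = 50, 16
branch = init_mlp([m, 64, p])
trunk = init_mlp([1, 64, p])

def deeponet(u_sensors, y):
    """G(u)(y) ~= <branch(u), trunk(y)> -- the DeepONet inner product."""
    b = mlp(u_sensors, branch)        # shape (p,)
    t = mlp(np.atleast_1d(y), trunk)  # shape (p,)
    return float(b @ t)

u = np.sin(np.linspace(0, 1, m))  # example input function on the sensor grid
out = deeponet(u, 0.3)
```

Training would fit both networks jointly (or in the two-step fashion the abstract describes) against operator input-output pairs; the forward pass itself stays this simple.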
arXiv Detail & Related papers (2025-09-15T14:18:49Z)
- Contrastive Normalizing Flows for Uncertainty-Aware Parameter Estimation [0.0]
Estimating physical parameters from data is a crucial application of machine learning (ML) in the physical sciences. We introduce a novel approach based on Contrastive Normalizing Flows (CNFs), which achieves top performance on the HiggsML Uncertainty Challenge dataset.
arXiv Detail & Related papers (2025-05-13T16:14:34Z)
- Partial Transportability for Domain Generalization [56.37032680901525]
Building on the theory of partial identification and transportability, this paper introduces new results for bounding the value of a functional of the target distribution. Our contribution is to provide the first general estimation technique for transportability problems. We propose a gradient-based optimization scheme for making scalable inferences in practice.
arXiv Detail & Related papers (2025-03-30T22:06:37Z)
- Decentralized Inference for Spatial Data Using Low-Rank Models [4.168323530566095]
This paper presents a decentralized framework tailored for parameter inference in spatial low-rank models. A key obstacle arises from the spatial dependence among observations, which prevents the log-likelihood from being expressed as a summation. Our approach employs a block descent method integrated with multi-consensus and dynamic consensus averaging for effective parameter optimization.
arXiv Detail & Related papers (2025-02-01T04:17:01Z)
- Ensemble Kalman Filtering Meets Gaussian Process SSM for Non-Mean-Field and Online Inference [47.460898983429374]
We introduce an ensemble Kalman filter (EnKF) into the non-mean-field (NMF) variational inference framework to approximate the posterior distribution of the latent states.
This novel marriage between EnKF and GPSSM not only eliminates the need for extensive parameterization in learning variational distributions, but also enables an interpretable, closed-form approximation of the evidence lower bound (ELBO).
We demonstrate that the resulting EnKF-aided online algorithm embodies a principled objective function by ensuring data-fitting accuracy while incorporating model regularizations to mitigate overfitting.
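The EnKF analysis step underpinning this line of work can be sketched in a few lines. The following is a generic stochastic-EnKF update, not the paper's non-mean-field variational scheme; the toy state dimension, observation operator, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(ensemble, H, y, R):
    """Stochastic EnKF analysis step.
    ensemble: (N, d) forecast members; H: (k, d) observation operator;
    y: (k,) observation; R: (k, k) observation-noise covariance."""
    N = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)   # state anomalies
    Y = X @ H.T                            # observed-space anomalies
    P_yy = Y.T @ Y / (N - 1) + R           # innovation covariance
    P_xy = X.T @ Y / (N - 1)               # state-observation cross covariance
    K = P_xy @ np.linalg.inv(P_yy)         # Kalman gain
    # Perturb observations so the analysis ensemble keeps the correct spread
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    innovations = y_pert - ensemble @ H.T
    return ensemble + innovations @ K.T

# Toy example: estimate a 2-d state from a noisy scalar observation
ens = rng.standard_normal((100, 2))
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
analysis = enkf_update(ens, H, np.array([0.5]), R)
```

After the update, the ensemble mean of the observed component moves toward the observation, while the unobserved component is adjusted only through the cross-covariance `P_xy`, which is what makes the filter usable without an explicit parametric posterior.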
arXiv Detail & Related papers (2023-12-10T15:22:30Z)
- A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error for overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by e.g. the combination of model, parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z)
- Validation Diagnostics for SBI algorithms based on Normalizing Flows [55.41644538483948]
This work proposes easy to interpret validation diagnostics for multi-dimensional conditional (posterior) density estimators based on NF.
It also offers theoretical guarantees based on results of local consistency.
This work should help the design of better specified models or drive the development of novel SBI-algorithms.
arXiv Detail & Related papers (2022-11-17T15:48:06Z)
- Gaussian Process-based Min-norm Stabilizing Controller for Control-Affine Systems with Uncertain Input Effects and Dynamics [90.81186513537777]
We propose a novel compound kernel that captures the control-affine nature of the problem.
We show that this resulting optimization problem is convex, and we call it the Gaussian Process-based Control Lyapunov Function Second-Order Cone Program (GP-CLF-SOCP).
arXiv Detail & Related papers (2020-11-14T01:27:32Z)
- An Asymptotically Optimal Multi-Armed Bandit Algorithm and Hyperparameter Optimization [48.5614138038673]
We propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) in the scenario of hyperparameter search evaluation.
We also develop a novel hyperparameter optimization algorithm called BOSS.
Empirical studies validate our theoretical arguments of SS and demonstrate the superior performance of BOSS on a number of applications.
arXiv Detail & Related papers (2020-07-11T03:15:21Z)
- Misspecification-robust likelihood-free inference in high dimensions [13.934999364767918]
We introduce an extension of the popular Bayesian optimisation-based approach to approximate discrepancy functions in a probabilistic manner. Our approach achieves computational scalability for higher-dimensional parameter spaces by using separate acquisition functions and discrepancies for each parameter. The method successfully performs computationally efficient inference in a 100-dimensional space on canonical examples and compares favourably to existing modularised ABC methods.
arXiv Detail & Related papers (2020-02-21T16:06:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and accepts no responsibility for any consequences of its use.