Addressing prior dependence in hierarchical Bayesian modeling for PTA data analysis II: Noise and SGWB inference through parameter decorrelation
- URL: http://arxiv.org/abs/2511.01959v1
- Date: Mon, 03 Nov 2025 17:54:55 GMT
- Title: Addressing prior dependence in hierarchical Bayesian modeling for PTA data analysis II: Noise and SGWB inference through parameter decorrelation
- Authors: Eleonora Villa, Luigi D'Amico, Aldo Barca, Fatima Modica Bittordo, Francesco Alì, Massimo Meneghetti, Luca Naso,
- Abstract summary: Standard PTA analyses assign fixed uniform noise priors to each pulsar, an approach that can introduce systematic biases when combining the array. We adopt a hierarchical Bayesian modeling strategy in which noise priors are parametrized by higher-level hyperparameters. We show that the hierarchical treatment constrains the noise parameters more tightly and partially alleviates the red-noise-SGWB degeneracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Pulsar Timing Arrays provide a powerful framework to measure low-frequency gravitational waves, but accuracy and robustness of the results are challenged by complex noise processes that must be accurately modeled. Standard PTA analyses assign fixed uniform noise priors to each pulsar, an approach that can introduce systematic biases when combining the array. To overcome this limitation, we adopt a hierarchical Bayesian modeling strategy in which noise priors are parametrized by higher-level hyperparameters. We further address the challenge posed by the correlations between hyperparameters and physical noise parameters, focusing on those describing red noise and dispersion measure variations. To decorrelate these quantities, we introduce an orthogonal reparametrization of the hierarchical model implemented with Normalizing Flows. We also employ i-nessai, a flow-guided nested sampler, to efficiently explore the resulting higher-dimensional parameter space. We apply our method to a minimal 3-pulsar case study, performing a simultaneous inference of noise and SGWB parameters. Despite the limited dataset, the results consistently show that the hierarchical treatment constrains the noise parameters more tightly and partially alleviates the red-noise-SGWB degeneracy, while the orthogonal reparametrization further enhances parameter independence without affecting the correlations intrinsic to the power-law modeling of the physical processes involved.
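The hierarchical prior structure described in the abstract can be sketched in a few lines. This is a toy illustration with hypothetical parameter names and ranges, not the authors' implementation: each pulsar's log red-noise amplitude is drawn from a population-level Gaussian whose mean and width are hyperparameters with their own hyperprior, instead of assigning each pulsar an independent fixed uniform prior.

```python
import numpy as np

# Hypothetical hierarchical model: per-pulsar log10 red-noise amplitudes
# are drawn from a population-level Gaussian N(mu, sigma^2) whose
# hyperparameters (mu, sigma) carry their own hyperprior, rather than
# fixing one uniform prior per pulsar.
def log_hyperprior(mu, sigma):
    # Illustrative hyperprior: uniform mu in [-18, -12], sigma in (0, 2]
    if -18.0 <= mu <= -12.0 and 0.0 < sigma <= 2.0:
        return 0.0
    return -np.inf

def log_population_prior(log10_A, mu, sigma):
    # Gaussian population prior replacing the per-pulsar uniform prior
    return np.sum(
        -0.5 * ((log10_A - mu) / sigma) ** 2
        - np.log(sigma * np.sqrt(2.0 * np.pi))
    )

# Three pulsars, mirroring the paper's minimal case study
log10_A = np.array([-14.2, -14.6, -13.9])

# Joint log-prior term that a sampler would add to the PTA likelihood
lp = log_hyperprior(-14.0, 0.5) + log_population_prior(log10_A, -14.0, 0.5)
```

In a full analysis this prior term would be added to the timing-residual likelihood and explored jointly over (mu, sigma) and the per-pulsar amplitudes; the paper additionally decorrelates these levels with a Normalizing-Flow reparametrization.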
Related papers
- Addressing prior dependence in hierarchical Bayesian modeling for PTA data analysis I: Methodology and implementation [0.0]
Complex inference tasks, such as those encountered in Pulsar Timing Array (PTA) data analysis, rely on Bayesian frameworks. The high-dimensional parameter space and the strong interdependencies among astrophysical, pulsar noise, and nuisance parameters introduce significant challenges for efficient learning and robust inference. We address these issues in the framework of hierarchical Bayesian modeling by introducing a reparameterization strategy.
arXiv Detail & Related papers (2025-11-05T17:33:44Z) - Mitigating the Noise Shift for Denoising Generative Models via Noise Awareness Guidance [54.88271057438763]
Noise Awareness Guidance (NAG) is a correction method that explicitly steers sampling trajectories to remain consistent with the pre-defined noise schedule. NAG consistently mitigates noise shift and substantially improves the generation quality of mainstream diffusion models.
arXiv Detail & Related papers (2025-10-14T13:31:34Z) - Diffusion Models for Solving Inverse Problems via Posterior Sampling with Piecewise Guidance [52.705112811734566]
A novel diffusion-based framework is introduced for solving inverse problems using a piecewise guidance scheme. The proposed method is problem-agnostic and readily adaptable to a variety of inverse problems. The framework achieves a reduction in inference time of 25% for inpainting with both random and center masks, and of 23% and 24% for 4× and 8× super-resolution tasks.
arXiv Detail & Related papers (2025-07-22T19:35:14Z) - ROPO: Robust Preference Optimization for Large Language Models [59.10763211091664]
We propose an iterative alignment approach that integrates noise-tolerance and filtering of noisy samples without the aid of external models.
Experiments on three widely-used datasets with Mistral-7B and Llama-2-7B demonstrate that ROPO significantly outperforms existing preference alignment methods.
arXiv Detail & Related papers (2024-04-05T13:58:51Z) - Improve Noise Tolerance of Robust Loss via Noise-Awareness [60.34670515595074]
We propose a meta-learning method capable of adaptively learning a hyperparameter prediction function, called Noise-Aware-Robust-Loss-Adjuster (NARL-Adjuster for brevity).
We integrate four SOTA robust loss functions with our algorithm, and comprehensive experiments substantiate the general applicability and effectiveness of the proposed method in terms of both noise tolerance and performance.
arXiv Detail & Related papers (2023-01-18T04:54:58Z) - Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms [8.669461942767098]
We study momentum-based first-order optimization algorithms in which the iterations are subject to an additive white noise.
For strongly convex quadratic problems, we use the steady-state variance of the error in the optimization variable to quantify noise amplification.
We introduce two parameterized families of algorithms that strike a balance between noise amplification and settling time.
arXiv Detail & Related papers (2022-09-24T04:26:30Z) - Hierarchical model reduction driven by machine learning for parametric advection-diffusion-reaction problems in the presence of noisy data [0.0]
We propose a new approach to generate a reliable reduced model for a parametric elliptic problem in the presence of noisy data.
We show that directional HiPOD loses accuracy when problem data are affected by noise.
We instead use Machine Learning fitting models, which better discriminate relevant physical features in the data from irrelevant noise.
arXiv Detail & Related papers (2022-04-01T16:02:05Z) - Bayesian Optimisation for Robust Model Predictive Control under Model Parameter Uncertainty [26.052368583196426]
We propose an adaptive optimisation approach for tuning model predictive control (MPC) hyperparameters.
We develop a Bayesian optimisation (BO) algorithm with a heteroscedastic noise model to deal with varying noise.
Experimental results demonstrate that our approach leads to higher cumulative rewards and more stable controllers.
arXiv Detail & Related papers (2022-03-01T15:33:21Z) - Noise Estimation for Generative Diffusion Models [91.22679787578438]
In this work, we present a simple and versatile learning scheme that can adjust the noise parameters for any given number of steps.
Our approach comes at a negligible computation cost.
arXiv Detail & Related papers (2021-04-06T15:46:16Z) - Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training over-parameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
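The distinction the Shape Matters entry draws between parameter-dependent mini-batch noise and spherical Gaussian noise can be seen in a toy over-parameterized regression. This is an illustrative sketch, not the paper's analysis: the covariance of mini-batch gradient noise varies with the parameters and vanishes at an interpolating minimum, whereas injected spherical Gaussian noise would have the same covariance everywhere.

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny over-parameterized linear regression: more parameters than samples,
# so interpolating minima (zero residual on every sample) exist.
X = rng.standard_normal((4, 8))  # 4 samples, 8 parameters
y = rng.standard_normal(4)

def minibatch_grad(w, idx):
    # Gradient of 0.5 * (x_i @ w - y_i)^2 averaged over the mini-batch idx
    r = X[idx] @ w - y[idx]
    return X[idx].T @ r / len(idx)

def grad_noise_cov_trace(w, batch=1, trials=5_000):
    """Trace of the mini-batch gradient-noise covariance at w.
    It depends on w (parameter-dependent noise), unlike injected
    spherical Gaussian noise, whose covariance is constant in w."""
    full = minibatch_grad(w, np.arange(len(y)))
    devs = []
    for _ in range(trials):
        idx = rng.choice(len(y), size=batch, replace=False)
        devs.append(minibatch_grad(w, idx) - full)
    D = np.array(devs)
    return np.trace(D.T @ D / trials)

w0 = np.zeros(8)
w_star, *_ = np.linalg.lstsq(X, y, rcond=None)  # an interpolating minimum

t0 = grad_noise_cov_trace(w0)        # noise present away from the minimum
t_star = grad_noise_cov_trace(w_star)  # vanishes where all residuals are zero
```

At the interpolating minimum every per-sample gradient is (numerically) zero, so the mini-batch noise disappears there; this parameter dependence is what the paper's analysis links to an implicit bias toward minima with smaller noise variance.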
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.