Data-driven informative priors for Bayesian inference with quasi-periodic data
- URL: http://arxiv.org/abs/2511.22296v1
- Date: Thu, 27 Nov 2025 10:21:52 GMT
- Title: Data-driven informative priors for Bayesian inference with quasi-periodic data
- Authors: Javier Lopez-Santiago, Luca Martino, Joaquin Miguez, Gonzalo Vazquez-Vilar,
- Abstract summary: We show that it is possible to construct a prior distribution from the data by fitting a Gaussian process (GP) with a periodic kernel. We use an adaptive importance sampling method to approximate the posterior distribution of the hyperparameter corresponding to the period in the kernel. We then use the marginal posterior distribution of this hyperparameter to construct a prior distribution for the period of the parametric model.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian computational strategies for inference can be inefficient in approximating the posterior distribution in models that exhibit some form of periodicity. This is because the probability mass of the marginal posterior distribution of the parameter representing the period is usually highly concentrated in a very small region of the parameter space. Therefore, it is necessary to provide as much information as possible to the inference method through the parameter prior distribution. We intend to show that it is possible to construct a prior distribution from the data by fitting a Gaussian process (GP) with a periodic kernel. More specifically, we want to show that it is possible to approximate the marginal posterior distribution of the hyperparameter corresponding to the period in the kernel. Subsequently, this distribution can be used as a prior distribution for the inference method. We use an adaptive importance sampling method to approximate the posterior distribution of the hyperparameters of the GP. Then, we use the marginal posterior distribution of the hyperparameter related to the periodicity in order to construct a prior distribution for the period of the parametric model. This workflow is empirical Bayes, implemented as a modular (cut) transfer of a GP posterior for the period to the parametric model. We applied the proposed methodology to both synthetic and real data. We approximated the posterior distribution of the period of the GP kernel and then passed it forward as a posterior-as-prior with no feedback. Finally, we analyzed its impact on the marginal posterior distribution.
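The workflow in the abstract (fit a GP with a periodic kernel, approximate the marginal posterior of the period hyperparameter by adaptive importance sampling, then reuse that posterior as a prior for the parametric model) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the synthetic data, the exp-sine-squared kernel settings, the Gaussian proposal, and the moment-matching adaptation scheme are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic quasi-periodic data: a noisy sinusoid with true period 2.5.
true_period = 2.5
t = np.sort(rng.uniform(0.0, 10.0, 60))
y = np.sin(2.0 * np.pi * t / true_period) + 0.1 * rng.normal(size=t.size)

def periodic_kernel(t1, t2, period, lengthscale=1.0, variance=1.0):
    """Standard periodic (exp-sine-squared) covariance function."""
    d = np.abs(t1[:, None] - t2[None, :])
    return variance * np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / lengthscale ** 2)

def gp_log_marginal(period, noise=0.1):
    """GP log marginal likelihood of (t, y) as a function of the period."""
    K = periodic_kernel(t, t, period) + noise ** 2 * np.eye(t.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * t.size * np.log(2.0 * np.pi)

# Adaptive importance sampling over the period hyperparameter: a Gaussian
# proposal is repeatedly refit (moment matching) to the weighted samples.
mu, sigma = 2.0, 1.0                       # broad initial proposal (assumed)
for _ in range(3):
    p = rng.normal(mu, sigma, 500)
    p = p[(p > 1.0) & (p < 4.0)]           # flat prior support; the truncation
                                           # constant cancels in self-normalized IS
    loglik = np.array([gp_log_marginal(pi) for pi in p])
    logq = -0.5 * ((p - mu) / sigma) ** 2 - np.log(sigma)  # proposal density (up to constants)
    logw = loglik - logq                   # weights = likelihood x flat prior / proposal
    w = np.exp(logw - logw.max())
    w /= w.sum()
    mu = float(np.sum(w * p))
    sigma = max(float(np.sqrt(np.sum(w * (p - mu) ** 2))), 0.05)  # floor avoids collapse

print(f"data-driven period prior: N({mu:.3f}, {sigma:.3f}^2)")
```

The fitted N(mu, sigma^2) would then serve as the informative prior on the period when fitting the downstream parametric model, with no feedback to the GP stage, i.e., the modular (cut) posterior-as-prior transfer described in the abstract.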
Related papers
- Provable Diffusion Posterior Sampling for Bayesian Inversion [13.807494493914335]
This paper proposes a novel diffusion-based posterior sampling method within a plug-and-play framework. To approximate the posterior score, we develop a Monte Carlo estimator in which particles are generated using Langevin dynamics. On the theoretical side, we provide non-asymptotic error bounds, showing that the method converges even for complex multi-modal target posteriors.
arXiv Detail & Related papers (2025-12-08T20:34:05Z) - Bayesian Inference for Left-Truncated Log-Logistic Distributions for Time-to-event Data Analysis [0.19999259391104385]
We propose a Bayesian approach for estimating the parameters of the left-truncated log-logistic (LTLL) distribution. We show that it provides more stable and reliable parameter estimates, particularly when the likelihood surface is irregular due to left truncation.
arXiv Detail & Related papers (2025-06-21T23:10:07Z) - Bayesian Circular Regression with von Mises Quasi-Processes [57.88921637944379]
In this work we explore a family of expressive and interpretable distributions over circle-valued random functions. For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Gibbs sampling. We present experiments applying this model to the prediction of wind directions and the percentage of the running gait cycle as a function of joint angles.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Broadening Target Distributions for Accelerated Diffusion Models via a Novel Analysis Approach [49.97755400231656]
We show that a new accelerated DDPM sampler achieves accelerated performance for three broad distribution classes not considered before. Our results show an improved dependency on the data dimension $d$ among accelerated DDPM type samplers.
arXiv Detail & Related papers (2024-02-21T16:11:47Z) - Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose in this paper to approximate the joint posterior over the structure and parameters of a Bayesian Network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z) - Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z) - Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over distributions' properties, such as parameters, symmetry, and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z) - Non-parametric Kernel-Based Estimation of Probability Distributions for Precipitation Modeling [0.0]
We derive non-parametric estimates of the cumulative distribution function (CDF) of precipitation amount for wet time intervals.
We show that kernel-based CDF estimation (KCDE) provides better estimates of the probability distribution than the standard empirical (staircase) estimate.
arXiv Detail & Related papers (2021-09-21T04:52:00Z) - Instance-Optimal Compressed Sensing via Posterior Sampling [101.43899352984774]
We show that, for Gaussian measurements and any prior distribution on the signal, the posterior sampling estimator achieves near-optimal recovery guarantees.
We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
arXiv Detail & Related papers (2021-06-21T22:51:56Z) - Parametrization invariant interpretation of priors and posteriors [0.0]
We move away from the idea that "a prior distribution establishes a probability distribution over the parameters of our model" to the idea that "a prior distribution establishes a probability distribution over probability distributions".
Under this mindset, any distribution over probability distributions should be "intrinsic", that is, invariant to the specific parametrization which is selected for the manifold.
arXiv Detail & Related papers (2021-05-18T06:45:05Z) - Meta-Learning Conjugate Priors for Few-Shot Bayesian Optimization [0.0]
We propose a novel approach to utilize meta-learning to automate the estimation of informative conjugate prior distributions.
From this process we generate priors that require only a few data points to estimate the shape parameters of the original distribution of the data.
arXiv Detail & Related papers (2021-01-03T23:58:32Z) - The Bayesian Method of Tensor Networks [1.7894377200944511]
We study the Bayesian framework of the Tensor Network from two perspectives.
We study the Bayesian properties of the Network by visualizing the parameters of the model and the decision boundaries on a two-dimensional synthetic data set.
arXiv Detail & Related papers (2021-01-01T14:59:15Z) - Batch Stationary Distribution Estimation [98.18201132095066]
We consider the problem of approximating the stationary distribution of an ergodic Markov chain given a set of sampled transitions.
We propose a consistent estimator that is based on recovering a correction ratio function over the given data.
arXiv Detail & Related papers (2020-03-02T09:10:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.