Bayesian PINNs for uncertainty-aware inverse problems (BPINN-IP)
- URL: http://arxiv.org/abs/2602.04459v1
- Date: Wed, 04 Feb 2026 11:42:57 GMT
- Title: Bayesian PINNs for uncertainty-aware inverse problems (BPINN-IP)
- Authors: Ali Mohammad-Djafari
- Abstract summary: The proposed methodology extends PINN to account for prior knowledge on the nature of the expected NN output, as well as its weights. Variational inference and Monte Carlo dropout are employed to provide predictive means and variances for reconstructed images.
- Score: 1.583842747998493
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The main contribution of this paper is a hierarchical Bayesian formulation of PINNs for linear inverse problems, called BPINN-IP. The proposed methodology extends PINN to account for prior knowledge on the nature of the expected NN output, as well as on its weights. Because the posterior probability distributions are accessible, uncertainties can be quantified naturally. Variational inference and Monte Carlo dropout are employed to provide predictive means and variances for reconstructed images. An example of application to deconvolution and super-resolution is considered, the different implementation steps are detailed, and some preliminary results are presented.
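The abstract's use of Monte Carlo dropout for predictive means and variances can be illustrated with a minimal numpy sketch. This is not the paper's BPINN-IP implementation: the two-layer network, its random weights, and the dropout rate are placeholders standing in for a trained reconstruction network; only the mechanism (keeping dropout active at inference and aggregating stochastic forward passes) reflects the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder weights standing in for a trained reconstruction network.
W1 = rng.normal(size=(16, 8))
W2 = rng.normal(size=(8, 4))

def mc_dropout_forward(x, p=0.5):
    """One stochastic forward pass with dropout kept active at inference."""
    h = np.maximum(x @ W1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p     # Bernoulli dropout mask
    h = h * mask / (1.0 - p)           # inverted-dropout rescaling
    return h @ W2

def mc_dropout_predict(x, T=200):
    """Predictive mean and variance over T stochastic forward passes."""
    samples = np.stack([mc_dropout_forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.var(axis=0)

x = rng.normal(size=(1, 16))           # stand-in for a degraded observation
mean, var = mc_dropout_predict(x)
```

The per-pixel variance returned here is what the abstract refers to as the predictive variance of the reconstructed image; higher variance flags regions where the reconstruction is less certain.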
Related papers
- Improved Uncertainty Quantification in Physics-Informed Neural Networks Using Error Bounds and Solution Bundles [2.066173485843472]
We train Bayesian Neural Networks that provide uncertainties over the solutions to differential equation systems provided by PINNs. We use available error bounds over PINNs to formulate a heteroscedastic variance that improves the uncertainty estimation.
arXiv Detail & Related papers (2025-05-09T22:40:39Z)
- ProPINN: Demystifying Propagation Failures in Physics-Informed Neural Networks [71.02216400133858]
Physics-informed neural networks (PINNs) have earned high expectations in solving partial differential equations (PDEs). Previous research observed the propagation failure phenomenon of PINNs. This paper provides a formal and in-depth study of propagation failure and its root cause.
arXiv Detail & Related papers (2025-02-02T13:56:38Z)
- Conformalized Physics-Informed Neural Networks [0.8437187555622164]
We introduce Conformalized PINNs (C-PINNs) to quantify the uncertainty of PINNs.
C-PINNs utilize the framework of conformal prediction to quantify the uncertainty of PINNs.
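The conformal-prediction mechanism behind C-PINNs can be sketched in a few lines. This is a generic split-conformal sketch, not the paper's implementation: the synthetic calibration data and the choice of absolute residuals as nonconformity scores are assumptions; the finite-sample quantile correction is the standard one.

```python
import numpy as np

rng = np.random.default_rng(0)

def conformal_interval(cal_pred, cal_true, test_pred, alpha=0.1):
    """Split conformal prediction: turn calibration residuals into
    (1 - alpha) prediction intervals around point predictions."""
    scores = np.abs(cal_true - cal_pred)            # nonconformity scores
    n = len(scores)
    # Finite-sample-corrected quantile level.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, q_level)
    return test_pred - q, test_pred + q

# Synthetic stand-in for a trained PINN's calibration predictions.
cal_true = rng.normal(size=500)
cal_pred = cal_true + rng.normal(scale=0.1, size=500)
test_pred = np.array([0.0, 1.0])
lo, hi = conformal_interval(cal_pred, cal_true, test_pred)
```

The appeal of this scheme for PINNs is that it wraps any trained solver post hoc and carries a distribution-free coverage guarantee, at the cost of holding out a calibration set.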
arXiv Detail & Related papers (2024-05-13T18:45:25Z)
- Piecewise Deterministic Markov Processes for Bayesian Neural Networks [19.5426096350075]
Inference on modern Bayesian Neural Networks (BNNs) often relies on a variational treatment that imposes assumptions of independence and a fixed posterior form which are frequently violated. New Piecewise Deterministic Markov Process (PDMP) samplers permit subsampling, though they introduce a model-specific inhomogeneous Poisson Process (IPP) which is difficult to sample from. This work introduces a new generic and adaptive thinning scheme for sampling from IPPs, and demonstrates how this approach can accelerate the application of PDMPs for inference in BNNs.
arXiv Detail & Related papers (2023-02-17T06:38:16Z)
- Failure-informed adaptive sampling for PINNs [5.723850818203907]
Physics-informed neural networks (PINNs) have emerged as an effective technique for solving PDEs in a wide range of domains.
Recent research has demonstrated, however, that the performance of PINNs can vary dramatically with different sampling procedures.
We present an adaptive approach termed failure-informed PINNs, which is inspired by the viewpoint of reliability analysis.
arXiv Detail & Related papers (2022-10-01T13:34:41Z)
- Variational Neural Networks [88.24021148516319]
We propose a method for uncertainty estimation in neural networks called Variational Neural Networks (VNNs).
VNN generates parameters for the output distribution of a layer by transforming its inputs with learnable sub-layers.
In uncertainty quality estimation experiments, we show that VNNs achieve better uncertainty quality than Monte Carlo Dropout or Bayes By Backpropagation methods.
arXiv Detail & Related papers (2022-07-04T15:41:02Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
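The Nadaraya-Watson estimate underlying NUQ can be sketched for a toy classification setting. This is an illustrative sketch of the estimator itself, not the paper's method: the Gaussian kernel, the bandwidth, the tiny dataset, and the use of entropy as the uncertainty score are all assumptions made for the example.

```python
import numpy as np

def nw_class_probs(x_train, y_train, x_query, n_classes, bandwidth=0.5):
    """Nadaraya-Watson estimate of the conditional label distribution
    p(y | x) with a Gaussian kernel."""
    d2 = ((x_train - x_query) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))          # kernel weights
    probs = np.array([w[y_train == c].sum() for c in range(n_classes)])
    return probs / probs.sum()

def entropy(p, eps=1e-12):
    """Entropy of the estimated label distribution as an uncertainty score."""
    return -(p * np.log(p + eps)).sum()

# Two well-separated one-dimensional clusters with labels 0 and 1.
x_train = np.array([[0.0], [0.1], [1.0], [1.1]])
y_train = np.array([0, 0, 1, 1])

p_near = nw_class_probs(x_train, y_train, np.array([0.05]), 2)  # inside cluster 0
p_mid = nw_class_probs(x_train, y_train, np.array([0.55]), 2)   # between clusters
```

A query point deep inside one cluster yields a peaked distribution (low entropy), while a point between the clusters yields a near-uniform one (high entropy), which is the behaviour a nonparametric uncertainty score should exhibit.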
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z)
- Estimates on the generalization error of Physics Informed Neural Networks (PINNs) for approximating a class of inverse problems for PDEs [16.758334184623152]
We focus on a particular class of inverse problems, the so-called data assimilation or unique continuation problems.
We prove rigorous estimates on the generalization error of PINNs approximating them.
arXiv Detail & Related papers (2020-06-29T16:23:58Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [56.69671152009899]
We show that deep ensembles provide an effective mechanism for approximate Bayesian marginalization.
We also propose a related approach that further improves the predictive distribution by marginalizing within basins of attraction.
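The deep-ensemble view of approximate Bayesian marginalization amounts to averaging the predictive distributions of independently trained networks. The sketch below shows only that averaging step; the random logits stand in for the outputs of trained ensemble members, which are an assumption of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_predict(logits_per_member):
    """Average member predictive distributions: a simple Monte Carlo
    approximation to marginalizing over weight-space modes."""
    probs = softmax(np.stack(logits_per_member))
    return probs.mean(axis=0)

# Random logits standing in for five independently trained networks.
members = [rng.normal(size=(3,)) for _ in range(5)]
p = ensemble_predict(members)
```

Because the average is taken over probabilities rather than logits, the combined prediction tends to be better calibrated than any single member where the members disagree.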
arXiv Detail & Related papers (2020-02-20T15:13:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.