An interpretation of the Brownian bridge as a physics-informed prior for the Poisson equation
- URL: http://arxiv.org/abs/2503.00213v1
- Date: Fri, 28 Feb 2025 21:57:10 GMT
- Title: An interpretation of the Brownian bridge as a physics-informed prior for the Poisson equation
- Authors: Alex Alberts, Ilias Bilionis
- Abstract summary: We show that Brownian bridge Gaussian processes can be viewed as a softly-enforced physics-constrained prior for the Poisson equation. This connection allows us to probe different theoretical questions, such as convergence and behavior of inverse problems.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed machine learning is one of the most commonly used methods for fusing physical knowledge in the form of partial differential equations with experimental data. The idea is to construct a loss function where the physical laws take the place of a regularizer and minimize it to reconstruct the underlying physical fields and any missing parameters. However, there is a noticeable lack of a direct connection between physics-informed loss functions and an overarching Bayesian framework. In this work, we demonstrate that Brownian bridge Gaussian processes can be viewed as a softly-enforced physics-constrained prior for the Poisson equation. We first show equivalence between the variational form of the physics-informed loss function for the Poisson equation and a kernel ridge regression objective. Then, through the connection between Gaussian process regression and kernel methods, we identify a Gaussian process for which the posterior mean function and physics-informed loss function minimizer agree. This connection allows us to probe different theoretical questions, such as convergence and behavior of inverse problems. We also connect the method to the important problem of identifying model-form error in applications.
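The abstract's central identification can be made concrete in one dimension. The sketch below is not code from the paper; the toy data, noise level, and observation design are illustrative assumptions. It performs Gaussian process regression with the Brownian bridge covariance k(s, t) = min(s, t) - st on [0, 1], whose sample paths vanish at both endpoints, so the posterior mean inherits the homogeneous Dirichlet boundary conditions of the 1D Poisson problem rather than having to fit them:

```python
import numpy as np

# Brownian bridge covariance on [0, 1]: k(s, t) = min(s, t) - s * t.
# Its samples vanish at both endpoints, so the prior softly encodes the
# homogeneous Dirichlet boundary conditions u(0) = u(1) = 0 of the
# one-dimensional Poisson problem -u'' = f.
def brownian_bridge_kernel(s, t):
    return np.minimum.outer(s, t) - np.outer(s, t)

def gp_posterior_mean(x_obs, y_obs, x_query, noise_var=1e-2):
    """Posterior mean of a zero-mean GP under the Brownian bridge prior."""
    K = brownian_bridge_kernel(x_obs, x_obs)
    k_star = brownian_bridge_kernel(x_query, x_obs)
    alpha = np.linalg.solve(K + noise_var * np.eye(len(x_obs)), y_obs)
    return k_star @ alpha

# Toy data: u(x) = sin(pi x) solves -u'' = pi^2 sin(pi x) with u(0) = u(1) = 0.
rng = np.random.default_rng(0)
x_obs = rng.uniform(0.0, 1.0, size=30)
y_obs = np.sin(np.pi * x_obs) + 0.05 * rng.standard_normal(30)

x_query = np.linspace(0.0, 1.0, 101)
u_hat = gp_posterior_mean(x_obs, y_obs, x_query)
# u_hat at x = 0 and x = 1 is exactly zero: since k(0, t) = k(1, t) = 0,
# the boundary conditions come from the prior, not from the data.
```

The paper's equivalence result says that minimizing the corresponding physics-informed (kernel ridge) objective returns this same posterior mean.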
Related papers
- Bayesian Model Parameter Learning in Linear Inverse Problems with Application in EEG Focal Source Imaging [49.1574468325115]
Inverse problems can be described as limited-data problems in which the signal of interest cannot be observed directly.
We studied a linear inverse problem that included an unknown non-linear model parameter.
We utilized a Bayesian model-based learning approach that allowed signal recovery and subsequently estimation of the model parameter.
arXiv Detail & Related papers (2025-01-07T18:14:24Z) - Physics-informed machine learning as a kernel method [7.755962782612672]
We consider a general regression problem where the empirical risk is regularized by a partial differential equation.
Taking advantage of kernel theory, we derive convergence rates for the minimizer of the regularized risk.
We show that faster rates can be achieved, depending on the physical error.
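As background for the kernel viewpoint shared by both papers, the regularized empirical risk has the closed-form minimizer given by the representer theorem. The sketch below is generic, not the construction from either paper; the RBF kernel, lengthscale, and training data are illustrative assumptions:

```python
import numpy as np

# Kernel ridge regression: minimize
#   (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2
# over an RKHS H. The representer theorem gives f(x) = sum_i alpha_i k(x, x_i)
# with alpha = (K + n * lam * I)^{-1} y.
def rbf_kernel(x, z, lengthscale=0.2):
    return np.exp(-0.5 * (x[:, None] - z[None, :]) ** 2 / lengthscale**2)

def krr(x_train, y_train, x_test, lam=1e-4):
    n = len(x_train)
    K = rbf_kernel(x_train, x_train)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return rbf_kernel(x_test, x_train) @ alpha

x_train = np.linspace(0.0, 1.0, 40)
y_train = np.sin(2 * np.pi * x_train)   # noiseless data for simplicity
x_test = np.linspace(0.05, 0.95, 19)
y_pred = krr(x_train, y_train, x_test)
```

A physics-informed variant replaces the RKHS norm penalty with a PDE residual term; with the right kernel choice the two objectives coincide, which is the bridge these papers exploit.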
arXiv Detail & Related papers (2024-02-12T09:38:42Z) - Fourier Neural Differential Equations for learning Quantum Field Theories [57.11316818360655]
A Quantum Field Theory is defined by its interaction Hamiltonian, and linked to experimental data by the scattering matrix.
In this paper, NDE models are used to learn quantum field theories, including Scalar-Yukawa theory and Scalar Quantum Electrodynamics.
The interaction Hamiltonian of a theory can be extracted from network parameters.
arXiv Detail & Related papers (2023-11-28T22:11:15Z) - About optimal loss function for training physics-informed neural networks under respecting causality [0.0]
The advantage of using the modified problem in the physics-informed neural network (PINN) methodology is that the loss function can be represented as a single term associated with the differential equations.
Numerical experiments have been carried out for a number of problems, demonstrating the accuracy of the proposed methods.
arXiv Detail & Related papers (2023-04-05T08:10:40Z) - Physics-informed Information Field Theory for Modeling Physical Systems with Uncertainty Quantification [0.0]
Information field theory (IFT) provides the tools necessary to perform statistics over fields that are not necessarily Gaussian.
We extend IFT to physics-informed IFT (PIFT) by encoding the functional priors with information about the physical laws which describe the field.
The posteriors derived from this PIFT remain independent of any numerical scheme and can capture multiple modes.
We numerically demonstrate that the method correctly identifies when the physics cannot be trusted, in which case it automatically treats learning the field as a regression problem.
arXiv Detail & Related papers (2023-01-18T15:40:19Z) - Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z) - D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
We propose D-CIPHER, which is robust to measurement artifacts and can uncover a new and very general class of differential equations.
We further design a novel optimization procedure, CoLLie, to help D-CIPHER search through this class efficiently.
arXiv Detail & Related papers (2022-06-21T17:59:20Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
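The spatial-encoding step of such an architecture can be illustrated with a plain orthonormal DCT-II. This is a self-contained NumPy sketch, not the paper's code; the grid size, test field, and truncation level are illustrative assumptions, and the RNN time-stepping is omitted:

```python
import numpy as np

def dct2_basis(n):
    """Orthonormal DCT-II matrix: rows are cosine modes, C @ C.T = I."""
    k = np.arange(n)[:, None]          # frequency index
    m = np.arange(n)[None, :]          # grid index
    C = np.sqrt(2.0 / n) * np.cos(np.pi * k * (m + 0.5) / n)
    C[0, :] /= np.sqrt(2.0)            # normalize the constant mode
    return C

n = 64
x = np.linspace(0.0, 1.0, n, endpoint=False)
field = np.sin(2 * np.pi * x) + 0.3 * np.sin(6 * np.pi * x)

C = dct2_basis(n)
coeffs = C @ field                     # frequency-domain encoding of the field

# Truncating to the lowest 16 modes gives a compact latent code; in an
# RNN-DCT architecture, a recurrent network would evolve such
# coefficients in time before decoding back to physical space.
latent = np.zeros(n)
latent[:16] = coeffs[:16]
reconstruction = C.T @ latent          # decode back to physical space
```

Because the smooth field's energy concentrates in low-frequency modes, the truncated code reconstructs it with small error, which is what makes the DCT a useful spatial bottleneck.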
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - A Combined Data-driven and Physics-driven Method for Steady Heat Conduction Prediction using Deep Convolutional Neural Networks [39.46616349629182]
We propose a combined data- and physics-driven method that accelerates learning and yields more accurate solutions.
Relative to the purely data-driven method, introducing the physical equation not only speeds up convergence but also produces more physically consistent solutions.
Relative to the purely physics-driven method, the combined method speeds up convergence by up to 49.0% when using a relatively unrestrictive coarse reference.
arXiv Detail & Related papers (2020-05-16T22:29:37Z) - Physics Informed Deep Learning for Transport in Porous Media. Buckley Leverett Problem [0.0]
We present a new hybrid physics-based machine-learning approach to reservoir modeling.
The methodology relies on a series of deep adversarial neural network architectures with physics-based regularization.
The proposed methodology is a simple and elegant way to instill physical knowledge into machine-learning algorithms.
arXiv Detail & Related papers (2020-01-15T08:20:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.