AutoIP: A Unified Framework to Integrate Physics into Gaussian Processes
- URL: http://arxiv.org/abs/2202.12316v1
- Date: Thu, 24 Feb 2022 19:02:14 GMT
- Title: AutoIP: A Unified Framework to Integrate Physics into Gaussian Processes
- Authors: Da Long, Zheng Wang, Aditi Krishnapriyan, Robert Kirby, Shandian Zhe,
Michael Mahoney
- Abstract summary: We propose a framework that can integrate all kinds of differential equations into Gaussian processes.
Our method shows improvement upon vanilla GPs in both simulation and several real-world applications.
- Score: 15.108333340471034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics modeling is critical for modern science and engineering applications.
From a data science perspective, physics knowledge -- often expressed as
differential equations -- is valuable in that it is highly complementary to
data, and can potentially help overcome data sparsity, noise, inaccuracy, etc.
In this work, we propose a simple yet powerful framework that can integrate all
kinds of differential equations into Gaussian processes (GPs) to enhance
prediction accuracy and uncertainty quantification. These equations can be
linear or nonlinear, temporal or time-spatial, complete or incomplete with unknown
source terms, etc. Specifically, based on kernel differentiation, we construct
a GP prior to jointly sample the values of the target function,
equation-related derivatives, and latent source functions from a multivariate
Gaussian distribution. The sampled values are fed to two likelihoods -- one is
to fit the observations and the other to conform to the equation. We use the
whitening trick to evade the strong dependency between the sampled function
values and kernel parameters, and develop a stochastic variational learning
algorithm. Our method shows improvement upon vanilla GPs in both simulation and
several real-world applications, even using rough, incomplete equations.
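A minimal sketch of the kernel-differentiation construction the abstract describes, in one dimension with a squared-exponential kernel: the values of the target function and its derivative are drawn jointly from a single multivariate Gaussian, using a whitened (Cholesky-based) parameterization. The kernel choice, grid, and hyperparameters below are illustrative, not taken from the paper.

```python
import numpy as np

def rbf(x1, x2, sigma=1.0, ell=0.5):
    """Squared-exponential kernel k(x, x')."""
    d = x1[:, None] - x2[None, :]
    return sigma**2 * np.exp(-d**2 / (2 * ell**2))

def rbf_dx2(x1, x2, sigma=1.0, ell=0.5):
    """Cross-covariance cov(f(x), f'(x')) = dk/dx'."""
    d = x1[:, None] - x2[None, :]
    return rbf(x1, x2, sigma, ell) * d / ell**2

def rbf_dx1dx2(x1, x2, sigma=1.0, ell=0.5):
    """cov(f'(x), f'(x')) = d^2 k / (dx dx')."""
    d = x1[:, None] - x2[None, :]
    return rbf(x1, x2, sigma, ell) * (1.0 / ell**2 - d**2 / ell**4)

x = np.linspace(0.0, 1.0, 25)
K_ff = rbf(x, x)
K_fd = rbf_dx2(x, x)
K_dd = rbf_dx1dx2(x, x)

# Joint prior over [f(x), f'(x)]: one multivariate Gaussian.
K = np.block([[K_ff, K_fd],
              [K_fd.T, K_dd]])

# Whitened sampling: draw eps ~ N(0, I) and map it through a Cholesky
# factor of K, decoupling the noise source from the kernel parameters.
L = np.linalg.cholesky(K + 1e-6 * np.eye(2 * x.size))
rng = np.random.default_rng(0)
sample = L @ rng.standard_normal(2 * x.size)
f, df = sample[:x.size], sample[x.size:]

# The sampled derivative channel df should agree (up to discretization
# error) with finite differences of the sampled function values f.
fd = np.gradient(f, x)
```

In the full method, `f` and `df` would each feed a separate likelihood: one fitting the observations, the other penalizing the equation residual (e.g. `df - g(f)` for an ODE `f' = g(f)`).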
Related papers
- Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z)
- Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS).
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noise.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
arXiv Detail & Related papers (2023-10-09T03:55:09Z)
- On the Integration of Physics-Based Machine Learning with Hierarchical Bayesian Modeling Techniques [0.0]
This paper proposes to embed mechanics-based models into the mean function of a Gaussian Process (GP) model and characterize potential discrepancies through kernel machines.
The stationarity of the kernel function is a difficult hurdle in the sequential processing of long data sets, one that is resolved through hierarchical Bayesian techniques.
Using numerical and experimental examples, potential applications of the proposed method to structural dynamics inverse problems are demonstrated.
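A minimal sketch of this mean-function idea: a physics-based model `m(x)` becomes the GP mean, and the kernel machine only has to capture the discrepancy. The "mechanics" model and data below are made up for illustration, not taken from the paper.

```python
import numpy as np

def rbf(x1, x2, sigma=0.5, ell=0.3):
    """Squared-exponential kernel for the model discrepancy."""
    d = x1[:, None] - x2[None, :]
    return sigma**2 * np.exp(-d**2 / (2 * ell**2))

def m(x):
    # Stand-in "mechanics-based" model: a damped oscillation.
    return np.exp(-x) * np.cos(2 * np.pi * x)

# Observations = mechanics model + smooth discrepancy + noise.
rng = np.random.default_rng(1)
X = np.linspace(0.0, 2.0, 30)
y = m(X) + 0.1 * np.sin(5 * X) + 0.01 * rng.standard_normal(X.size)

# GP regression with a physics-based mean: condition on the residuals
# y - m(X), then add the mean model back at the test points.
noise_var = 1e-4
Kxx = rbf(X, X) + noise_var * np.eye(X.size)
Xs = np.linspace(0.0, 2.0, 7)
Ks = rbf(Xs, X)
post_mean = m(Xs) + Ks @ np.linalg.solve(Kxx, y - m(X))
```

Because the mean already explains most of the signal, the kernel's job reduces to the (smaller, smoother) discrepancy, which typically eases hyperparameter learning.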
arXiv Detail & Related papers (2023-03-01T02:29:41Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Physics-informed Information Field Theory for Modeling Physical Systems with Uncertainty Quantification [0.0]
Information field theory (IFT) provides the tools necessary to perform statistics over fields that are not necessarily Gaussian.
We extend IFT to physics-informed IFT (PIFT) by encoding the functional priors with information about the physical laws which describe the field.
The posteriors derived from this PIFT remain independent of any numerical scheme and can capture multiple modes.
We numerically demonstrate that the method correctly identifies when the physics cannot be trusted, in which case it automatically treats learning the field as a regression problem.
arXiv Detail & Related papers (2023-01-18T15:40:19Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields improved estimation of pattern latency compared with the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Adjoint-aided inference of Gaussian process driven differential equations [0.8257490175399691]
We show how the adjoint of a linear system can be used to efficiently infer forcing functions modelled as GPs.
We demonstrate the approach on systems of both ordinary and partial differential equations.
arXiv Detail & Related papers (2022-02-09T17:35:14Z)
- Data-driven learning of nonlocal models: from high-fidelity simulations to constitutive laws [3.1196544696082613]
We show that machine learning can improve the accuracy of simulations of stress waves in one-dimensional composite materials.
We propose a data-driven technique to learn nonlocal laws for stress wave propagation models.
arXiv Detail & Related papers (2020-12-08T01:46:26Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Physics Informed Deep Kernel Learning [24.033468062984458]
Physics Informed Deep Kernel Learning (PI-DKL) exploits physics knowledge represented by differential equations with latent sources.
For efficient and effective inference, we marginalize out the latent variables and derive a collapsed model evidence lower bound (ELBO).
arXiv Detail & Related papers (2020-06-08T22:43:31Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
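The feature-expansion idea above can be illustrated with plain random Fourier features (a simpler stand-in for the paper's quadrature features): the squared-exponential kernel and its derivative cross-covariance are both approximated by inner products of explicit feature maps. All settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, ell, M = 1.0, 0.5, 20000

# Spectral frequencies of the squared-exponential kernel: w ~ N(0, 1/ell^2).
w = rng.standard_normal(M) / ell
b = rng.uniform(0.0, 2.0 * np.pi, M)

def phi(x):
    # Feature map with k(x, x') ~= phi(x) @ phi(x').T
    return np.sqrt(2.0 / M) * sigma * np.cos(np.outer(x, w) + b)

def dphi(x):
    # Differentiating each feature gives derivative cross-covariances:
    # cov(f(x), f'(x')) ~= phi(x) @ dphi(x').T
    return -np.sqrt(2.0 / M) * sigma * w * np.sin(np.outer(x, w) + b)

x1, x2 = np.array([0.2]), np.array([0.6])
k_approx = phi(x1) @ phi(x2).T    # approximate kernel value
kd_approx = phi(x1) @ dphi(x2).T  # approximate derivative cross-covariance

# Exact values for comparison.
k_exact = sigma**2 * np.exp(-(0.2 - 0.6)**2 / (2 * ell**2))
kd_exact = k_exact * (0.2 - 0.6) / ell**2
```

With random features the error decays only as O(1/sqrt(M)); the deterministic quadrature construction in the paper is precisely what replaces this slow Monte Carlo rate with exponentially fast bounds.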
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.