PAGP: A physics-assisted Gaussian process framework with active learning
for forward and inverse problems of partial differential equations
- URL: http://arxiv.org/abs/2204.02583v1
- Date: Wed, 6 Apr 2022 05:08:01 GMT
- Title: PAGP: A physics-assisted Gaussian process framework with active learning
for forward and inverse problems of partial differential equations
- Authors: Jiahao Zhang, Shiqi Zhang, Guang Lin
- Abstract summary: We introduce three different models: continuous time, discrete time and hybrid models.
The given physical information is integrated into the Gaussian process model through our designed GP loss functions.
In the last part, a novel hybrid model combining the continuous and discrete time models is presented.
- Score: 12.826754199680474
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, a Gaussian process regression (GPR) model that
incorporates the physical information given in partial differential equations
(PDEs) is developed: physics-assisted Gaussian processes (PAGP). The targets
of this model can be divided into two types of problems: finding solutions or
discovering unknown coefficients of given PDEs with initial and boundary
conditions. We introduce three different models: continuous time, discrete
time and hybrid models. The given physical information is integrated into the
Gaussian process model through our designed GP loss functions. Three types of
loss functions are provided in this paper, based on two different approaches
to training the standard GP model. The first part of the paper introduces the
continuous time model, which treats the temporal domain the same as the
spatial domain. The unknown coefficients in the given PDEs can be jointly
learned with the GP hyper-parameters by minimizing the designed loss function.
In the discrete time models, we first choose a time discretization scheme to
discretize the temporal domain. The PAGP model is then applied at each time
step, together with the scheme, to approximate the PDE solutions at the given
test points at the final time. To discover unknown coefficients in this
setting, observations at two specific times are needed, and a mixed mean
square error function is constructed to obtain the optimal coefficients. In
the last part, a novel hybrid model combining the continuous and discrete time
models is presented. It merges the flexibility of the continuous time model
with the accuracy of the discrete time model. The performance of different
models with different GP loss functions is also discussed. The effectiveness
of the proposed PAGP methods is illustrated in the numerical section.
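To make the continuous time model concrete, the following is a minimal sketch of the general idea, assuming a squared-exponential kernel and a heat equation u_t = kappa * u_xx with a single unknown diffusivity; it is not the paper's exact loss functions, and names such as pagp_loss, X_data and X_col are illustrative. The standard GP objective (negative log marginal likelihood) is augmented with the mean squared PDE residual of the posterior mean at collocation points, so kappa is learned jointly with the GP hyper-parameters.
```python
# Minimal sketch (assumed, not the paper's exact formulation) of a continuous
# time physics-assisted GP: an ARD-RBF GP over inputs (x, t) is trained by
# minimizing the usual negative log marginal likelihood plus a penalty on the
# PDE residual of the posterior mean, evaluated through analytic kernel
# derivatives at collocation points.  Example PDE: u_t = kappa * u_xx, with
# the diffusivity kappa treated as an unknown coefficient learned jointly
# with the GP hyper-parameters.  All names here are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X_data = rng.uniform([0.0, 0.0], [1.0, 1.0], size=(30, 2))     # observed (x, t)
y_data = np.exp(-np.pi**2 * 0.1 * X_data[:, 1]) * np.sin(np.pi * X_data[:, 0])
X_col = rng.uniform([0.0, 0.0], [1.0, 1.0], size=(100, 2))     # collocation (x, t)

def rbf(A, B, ls, sf2):
    """ARD squared-exponential kernel between row-stacked inputs A and B."""
    d = (A[:, None, :] - B[None, :, :]) / ls
    return sf2 * np.exp(-0.5 * np.sum(d**2, axis=-1))

def pagp_loss(theta, lam=1.0):
    ls = np.exp(theta[0:2])                        # lengthscales for (x, t)
    sf2, sn2 = np.exp(theta[2]), np.exp(theta[3])  # signal and noise variances
    kappa = np.exp(theta[4])                       # unknown PDE coefficient
    K = rbf(X_data, X_data, ls, sf2) + (sn2 + 1e-8) * np.eye(len(X_data))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_data))
    nlml = 0.5 * y_data @ alpha + np.sum(np.log(np.diag(L)))   # GP data-fit term
    # The posterior mean is m(z) = sum_i alpha_i k(z, x_i), so its t- and
    # x-derivatives follow from differentiating the RBF in its first argument.
    Kc = rbf(X_col, X_data, ls, sf2)
    dx = X_col[:, None, 0] - X_data[None, :, 0]
    dt = X_col[:, None, 1] - X_data[None, :, 1]
    m_t = (-dt / ls[1]**2 * Kc) @ alpha
    m_xx = ((dx**2 / ls[0]**4 - 1.0 / ls[0]**2) * Kc) @ alpha
    residual = m_t - kappa * m_xx                  # PDE residual of the GP mean
    return nlml + lam * np.mean(residual**2)       # physics-assisted GP loss

theta0 = np.log([0.3, 0.3, 1.0, 1e-2, 1.0])
res = minimize(pagp_loss, theta0, method="L-BFGS-B")
print("learned diffusivity kappa:", np.exp(res.x[4]))
```
The weight lam simply balances the data-fit and physics terms in this sketch; the paper's three GP loss functions combine these two sources of information in their own ways, which the sketch does not reproduce.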
Related papers
- Towards Efficient Time Stepping for Numerical Shape Correspondence [55.2480439325792]
Methods based on partial differential equations (PDEs) have been established, encompassing e.g. the classic heat kernel signature.
We consider here several time stepping schemes. The goal of this investigation is to assess whether one may identify a useful property of time integration methods in the shape analysis context.
arXiv Detail & Related papers (2023-12-21T13:40:03Z) - Fully probabilistic deep models for forward and inverse problems in
parametric PDEs [1.9599274203282304]
We introduce a physics-driven deep latent variable model (PDDLVM) to simultaneously learn parameter-to-solution (forward) and solution-to-parameter (inverse) maps of PDEs.
The proposed framework can be easily extended to seamlessly integrate observed data to solve inverse problems and to build generative models.
We demonstrate the efficiency and robustness of our method on finite element discretized parametric PDE problems.
arXiv Detail & Related papers (2022-08-09T15:40:53Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the temporal evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Semi-Implicit Neural Solver for Time-dependent Partial Differential
Equations [4.246966726709308]
We propose a neural solver to learn an optimal iterative scheme in a data-driven fashion for any class of PDEs.
We provide theoretical guarantees for the correctness and convergence of neural solvers analogous to conventional iterative solvers.
arXiv Detail & Related papers (2021-09-03T12:03:10Z) - The temporal overfitting problem with applications in wind power curve
modeling [8.057262184815636]
We propose a new method to tackle the temporal overfitting problem.
Our specific application in this paper targets the power curve modeling in wind energy.
arXiv Detail & Related papers (2020-12-02T17:39:57Z) - Learning continuous-time PDEs from sparse data with graph neural
networks [10.259254824702555]
We propose a continuous-time differential model for dynamical systems whose governing equations are parameterized by message passing graph neural networks.
We demonstrate the model's ability to work with unstructured grids, arbitrary time steps, and noisy observations.
We compare our method with existing approaches on several well-known physical systems involving first- and higher-order PDEs, achieving state-of-the-art predictive performance.
arXiv Detail & Related papers (2020-06-16T07:15:40Z) - Neural Controlled Differential Equations for Irregular Time Series [17.338923885534197]
An ordinary differential equation is determined by its initial condition, and there is no mechanism for adjusting the trajectory based on subsequent observations.
Here we demonstrate how this may be resolved through the well-understood mathematics of controlled differential equations (see the sketch after this list).
We show that our model achieves state-of-the-art performance against similar (ODE or RNN based) models in empirical studies on a range of datasets.
arXiv Detail & Related papers (2020-05-18T17:52:21Z)
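As a companion to the Neural CDE entry directly above, here is a minimal, hypothetical sketch of the controlled differential equation mechanism it refers to: the hidden state obeys dz = f(z) dX(t) and is therefore driven by increments of the observed path X(t), not just the initial condition. The tanh map below stands in for a learned neural vector field, and all names are illustrative.
```python
# Minimal sketch (assumed, not the paper's model) of a controlled differential
# equation: the hidden state follows dz = f(z) dX(t), so each update is scaled
# by an increment of the observed path rather than by elapsed time alone.
import numpy as np

rng = np.random.default_rng(1)
t_obs = np.linspace(0.0, 1.0, 21)                                 # observation times
x_obs = np.sin(6.0 * t_obs) + 0.1 * rng.normal(size=t_obs.shape)  # observed 1-D path
W = rng.normal(scale=0.5, size=(3, 3))                            # stand-in weights
b = rng.normal(scale=0.5, size=3)                                 # for a learned f

z = np.zeros(3)                                    # hidden state z(0)
for k in range(len(t_obs) - 1):
    dX = x_obs[k + 1] - x_obs[k]                   # increment of the control path
    z = z + np.tanh(W @ z + b) * dX                # Euler step of dz = f(z) dX
print("final hidden state:", z)
```
Because each Euler step is scaled by the path increment dX, later observations keep adjusting the trajectory, which is the mechanism the plain ODE formulation lacks.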