Random Grid Neural Processes for Parametric Partial Differential
Equations
- URL: http://arxiv.org/abs/2301.11040v2
- Date: Wed, 7 Jun 2023 08:50:15 GMT
- Title: Random Grid Neural Processes for Parametric Partial Differential
Equations
- Authors: Arnaud Vadeboncoeur, Ieva Kazlauskaite, Yanni Papandreou, Fehmi Cirak,
Mark Girolami, \"Omer Deniz Akyildiz
- Abstract summary: We introduce a new class of spatially probabilistic physics and data informed deep latent models for PDEs.
We solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields.
We show how to incorporate noisy data in a principled manner into our physics informed model to improve predictions for problems where data may be available.
- Score: 5.244037702157957
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a new class of spatially stochastic physics and data informed
deep latent models for parametric partial differential equations (PDEs) which
operate through scalable variational neural processes. We achieve this by
assigning probability measures to the spatial domain, which allows us to treat
collocation grids probabilistically as random variables to be marginalised out.
Adapting this spatial statistics view, we solve forward and inverse problems
for parametric PDEs in a way that leads to the construction of Gaussian process
models of solution fields. The implementation of these random grids poses a
unique set of challenges for inverse physics informed deep learning frameworks
and we propose a new architecture called Grid Invariant Convolutional Networks
(GICNets) to overcome these challenges. We further show how to incorporate
noisy data in a principled manner into our physics informed model to improve
predictions for problems where data may be available but whose measurement
location does not coincide with any fixed mesh or grid. The proposed method is
tested on a nonlinear Poisson problem, Burgers equation, and Navier-Stokes
equations, and we provide extensive numerical comparisons. We demonstrate
significant computational advantages over current physics informed neural
learning methods for parametric PDEs while improving the predictive
capabilities and flexibility of these models.
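The core mechanism the abstract describes — drawing the collocation grid from a probability measure on the domain and averaging the physics residual over it, rather than fixing a mesh — can be sketched in a few lines. This is a minimal illustration only, not the authors' implementation: the toy network, the nonlinear Poisson residual, and all names (u_net, pde_residual, loss) are assumptions made for the example.

```python
import jax
import jax.numpy as jnp

def u_net(params, x):
    # Toy fully connected surrogate u_theta(x) on a 1D domain [0, 1].
    h = jnp.tanh(params["W1"] @ x + params["b1"])
    return (params["W2"] @ h + params["b2"])[0]

def pde_residual(params, x):
    # Residual of a toy nonlinear Poisson problem: u''(x) - u(x)**3 - f(x).
    scalar_u = lambda z: u_net(params, jnp.reshape(z, (1,)))
    u_xx = jax.grad(jax.grad(scalar_u))(x[0])
    return u_xx - u_net(params, x) ** 3 - jnp.sin(jnp.pi * x[0])

def loss(params, key, n_colloc=128):
    # Random grid: collocation points are redrawn at every step and the squared
    # residual is Monte Carlo averaged, approximately marginalising the grid.
    xs = jax.random.uniform(key, (n_colloc, 1))
    res = jax.vmap(lambda x: pde_residual(params, x))(xs)
    return jnp.mean(res ** 2)

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = {"W1": 0.1 * jax.random.normal(k1, (32, 1)), "b1": jnp.zeros(32),
          "W2": 0.1 * jax.random.normal(k2, (1, 32)), "b2": jnp.zeros(1)}
grads = jax.grad(loss)(params, k3)   # gradients from one randomly drawn grid
```

In the paper, the grid distribution, the variational neural-process machinery, and the GICNet architecture replace the toy pieces above; the sketch only shows where the randomness of the grid enters the objective.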
Related papers
- Optimal Transport-Based Displacement Interpolation with Data Augmentation for Reduced Order Modeling of Nonlinear Dynamical Systems [0.0]
We present a novel reduced-order model (ROM) that exploits optimal transport theory and displacement interpolation to enhance the representation of nonlinear dynamics in complex systems.
We show improved accuracy and efficiency in predicting complex system behaviors, indicating the potential of this approach for a wide range of applications in computational physics and engineering.
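In one spatial dimension the displacement (optimal-transport) interpolant between two snapshots has a closed form via their quantile functions, which makes the data-augmentation idea easy to illustrate. The sketch below is only an assumed illustration of that flavour, with invented names and discretisation choices, not the paper's pipeline.

```python
import numpy as np

def displacement_interpolate(x, rho0, rho1, t, n_quantiles=400):
    # 1D OT (displacement) interpolant: linearly interpolate the quantile
    # functions (inverse CDFs) of the two normalised snapshots.
    q = np.linspace(0.0, 1.0, n_quantiles)
    def inv_cdf(rho):
        cdf = np.cumsum(rho)
        cdf = cdf / cdf[-1]
        return np.interp(q, cdf, x)
    pos = (1.0 - t) * inv_cdf(rho0) + t * inv_cdf(rho1)
    # Bin the transported mass positions back onto the original grid.
    hist, _ = np.histogram(pos, bins=len(x), range=(x[0], x[-1]), density=True)
    return hist

x = np.linspace(0.0, 1.0, 200)
rho0 = np.exp(-((x - 0.25) ** 2) / 0.002)      # snapshot at an early time
rho1 = np.exp(-((x - 0.75) ** 2) / 0.002)      # snapshot at a later time
rho_mid = displacement_interpolate(x, rho0, rho1, 0.5)  # augmented snapshot: bump near x = 0.5
```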
arXiv Detail & Related papers (2024-11-13T16:29:33Z) - FEM-based Neural Networks for Solving Incompressible Fluid Flows and Related Inverse Problems [41.94295877935867]
The numerical simulation and optimization of technical systems described by partial differential equations is expensive.
A comparatively new approach in this context is to combine the good approximation properties of neural networks with the classical finite element method.
In this paper, we extend this approach to saddle-point problems and non-linear fluid dynamics problems.
arXiv Detail & Related papers (2024-09-06T07:17:01Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
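The workflow this describes — a differentiable surrogate trained once to mimic the simulator, then automatic differentiation through it to recover physical parameters from measurements — can be sketched generically. Everything below is an assumption made for illustration (a toy Lorentzian "surrogate", synthetic data, an invented step size); the actual framework fits multi-dimensional scattering data from a model Hamiltonian.

```python
import jax
import jax.numpy as jnp

# Stand-in for a neural surrogate already trained to mimic the simulator:
# it maps physical parameters (here a 2-vector) to a predicted 1D "spectrum".
omega = jnp.linspace(0.0, 2.0, 64)
def surrogate(phys_params):
    J, gamma = phys_params
    return gamma / ((omega - J) ** 2 + gamma ** 2)   # toy Lorentzian response

# Synthetic "experimental" data generated from known ground-truth parameters.
true_params = jnp.array([1.2, 0.3])
key = jax.random.PRNGKey(0)
data = surrogate(true_params) + 0.01 * jax.random.normal(key, omega.shape)

# Because the surrogate is differentiable, unknown parameters can be recovered
# by gradient descent on the misfit once the surrogate exists.
def misfit(phys_params):
    return jnp.mean((surrogate(phys_params) - data) ** 2)

params = jnp.array([0.8, 0.5])
for _ in range(500):                                  # step size and count are illustrative
    params = params - 0.05 * jax.grad(misfit)(params)
```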
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
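The probabilistic representation in question can be illustrated with the heat equation, whose pointwise solution is an expectation over random (Brownian) particles started at the query point; Monte Carlo estimates of this kind can then supply training targets for a neural solver. The snippet is a generic illustration under that assumption, not the paper's algorithm.

```python
import numpy as np

def heat_mc(x, t, nu, g, n_particles=100_000, seed=0):
    # Feynman-Kac / probabilistic view of u_t = nu * u_xx with u(x, 0) = g(x):
    # u(x, t) = E[ g(x + sqrt(2 * nu * t) * Z) ],  Z ~ N(0, 1).
    z = np.random.default_rng(seed).standard_normal(n_particles)
    return np.mean(g(x + np.sqrt(2.0 * nu * t) * z))

g = lambda x: np.sin(np.pi * x)                    # initial condition
u_mc = heat_mc(0.5, 0.1, nu=0.05, g=g)             # particle estimate of u(0.5, 0.1)
u_exact = np.exp(-np.pi ** 2 * 0.05 * 0.1) * np.sin(np.pi * 0.5)   # closed form on the real line
```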
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
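Predicting a spatially continuous solution from a position query amounts to making the network a function of coordinates, so it can be evaluated at arbitrary off-mesh points. A minimal (untrained) coordinate-query MLP of this kind, with invented names and sizes rather than the paper's architecture, looks like:

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(2, 64, 64, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(0.1 * jax.random.normal(k, (m, n)), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def query(params, xt):
    # xt = (x, t): the network is a function of continuous coordinates, so it
    # can be evaluated at any position, not only at nodes of a training mesh.
    h = xt
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

params = init_mlp(jax.random.PRNGKey(0))
queries = jnp.array([[0.31, 0.07], [0.999, 0.5]])     # arbitrary off-grid (x, t) points
u_vals = jax.vmap(lambda xt: query(params, xt))(queries)
```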
arXiv Detail & Related papers (2022-10-11T14:52:20Z) - Optimizing differential equations to fit data and predict outcomes [0.0]
Recent technical advances in automatic differentiation through numerical differential equation solvers potentially turn the fitting process into a relatively easy problem.
This article illustrates how to overcome a variety of common challenges, using the classic ecological data for oscillations in hare and lynx populations.
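Differentiating through a numerical solver simply means the integration loop is written in a differentiable framework, so the misfit to data can be differentiated with respect to the equation's parameters. The toy below uses the classic hare-lynx (Lotka-Volterra) model with a fixed-step Euler loop; the rates, data, and step sizes are invented for illustration and no particular library solver is implied.

```python
import jax
import jax.numpy as jnp

def lotka_volterra_traj(rates, y0, dt=0.05, n_steps=100):
    # Differentiable fixed-step Euler integration of the hare (H) / lynx (L) ODEs:
    #   dH/dt = a*H - b*H*L,   dL/dt = c*H*L - d*L
    a, b, c, d = rates
    def step(y, _):
        H, L = y
        y_next = jnp.array([H + dt * (a * H - b * H * L),
                            L + dt * (c * H * L - d * L)])
        return y_next, y_next
    _, traj = jax.lax.scan(step, y0, None, length=n_steps)
    return traj

true_rates = jnp.array([1.0, 0.5, 0.2, 0.8])          # invented ground-truth rates
y0 = jnp.array([2.0, 1.0])
obs = lotka_volterra_traj(true_rates, y0)             # synthetic hare/lynx "observations"

def misfit(rates):
    return jnp.mean((lotka_volterra_traj(rates, y0) - obs) ** 2)

# The key point: gradients of the fit flow straight through the solver.
sensitivities = jax.grad(misfit)(jnp.array([0.8, 0.4, 0.3, 0.6]))
```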
arXiv Detail & Related papers (2022-04-16T16:08:08Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive evidence lower bounds (ELBOs) for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
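One round of learned message passing over a graph of solution nodes — the building block such solvers stack in place of hand-designed stencils — fits in a few lines. This is a generic sketch with invented parameter shapes, not the paper's architecture.

```python
import jax
import jax.numpy as jnp

def message_passing_step(params, h, senders, receivers):
    # h: (n_nodes, d) node states; senders/receivers: (n_edges,) index arrays.
    # Messages are computed per edge from the two endpoint states, summed at
    # the receiving node, and used for a residual update of the node state.
    edge_in = jnp.concatenate([h[senders], h[receivers]], axis=-1)      # (n_edges, 2d)
    messages = jnp.tanh(edge_in @ params["W_msg"])                      # (n_edges, d)
    agg = jnp.zeros_like(h).at[receivers].add(messages)                 # sum incoming messages
    return h + jnp.tanh(jnp.concatenate([h, agg], axis=-1) @ params["W_upd"])

d = 16
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
params = {"W_msg": 0.1 * jax.random.normal(k1, (2 * d, d)),
          "W_upd": 0.1 * jax.random.normal(k2, (2 * d, d))}
h = jax.random.normal(k3, (5, d))                  # 5 grid nodes
senders   = jnp.array([0, 1, 2, 3, 1, 2, 3, 4])    # edges of a 1D chain (both directions)
receivers = jnp.array([1, 2, 3, 4, 0, 1, 2, 3])
h_new = message_passing_step(params, h, senders, receivers)
```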
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Compositional Modeling of Nonlinear Dynamical Systems with ODE-based
Random Features [0.0]
We present a novel, domain-agnostic approach to modeling nonlinear dynamical systems.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
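The flavour of ODE-based random features can be illustrated with the simplest case: solutions of randomly parameterised harmonic oscillators, which yield random sinusoidal basis functions that are then fit by ridge regression. This is only an assumed illustration of the idea, not the paper's feature construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features = 200

# Each feature is the solution of x'' + w^2 x = 0 with a random frequency w and
# random phase, i.e. a physics-derived random sinusoid evaluated at time t.
w = rng.gamma(shape=2.0, scale=2.0, size=n_features)
phase = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
features = lambda t: np.cos(np.outer(t, w) + phase)          # (n_points, n_features)

# Fit a toy nonlinear trajectory with ridge regression on these features.
t_train = np.linspace(0.0, 4.0, 80)
y_train = np.sin(3.0 * t_train) * np.exp(-0.2 * t_train)
Phi = features(t_train)
alpha = 1e-3
weights = np.linalg.solve(Phi.T @ Phi + alpha * np.eye(n_features), Phi.T @ y_train)
y_pred = features(t_train) @ weights                          # fitted trajectory
```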
arXiv Detail & Related papers (2021-06-10T17:55:13Z) - Stochastic analysis of heterogeneous porous material with modified
neural architecture search (NAS) based physics-informed neural networks using
transfer learning [0.0]
A modified neural architecture search (NAS) based physics-informed deep learning model is presented.
A three dimensional flow model is built to provide a benchmark to the simulation of groundwater flow in highly heterogeneous aquifers.
arXiv Detail & Related papers (2020-10-03T19:57:54Z)