Truly Mesh-free Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2206.01545v1
- Date: Fri, 3 Jun 2022 12:45:47 GMT
- Title: Truly Mesh-free Physics-Informed Neural Networks
- Authors: Fabricio Arend Torres, Marcello Massimo Negri, Monika Nagy-Huber,
Maxim Samarin, Volker Roth
- Abstract summary: Physics-informed Neural Networks (PINNs) have recently emerged as a principled way to incorporate prior physical knowledge, in the form of partial differential equations (PDEs), into neural networks.
We present a mesh-free and adaptive approach termed particle-density PINN (pdPINN), which is inspired by the microscopic viewpoint of fluid dynamics.
- Score: 3.5611181253285253
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Physics-informed Neural Networks (PINNs) have recently emerged as a
principled way to incorporate prior physical knowledge in the form of partial
differential equations (PDEs) into neural networks. Although generally viewed
as being mesh-free, current approaches still rely on collocation points
obtained within a bounded region, even in settings with spatially sparse
signals. Furthermore, if the boundaries are not known, the selection of such a
region may be arbitrary, resulting in a large proportion of collocation points
being selected in areas of low relevance. To resolve this, we present a
mesh-free and adaptive approach termed particle-density PINN (pdPINN), which is
inspired by the microscopic viewpoint of fluid dynamics. Instead of sampling
from a bounded region, we propose to sample directly from the distribution over
the (fluid) particle positions, eliminating the need to introduce boundaries
while adaptively focusing on the most relevant regions. This is achieved by
reformulating the modeled fluid density as an unnormalized probability
distribution from which we sample with dynamic Monte Carlo methods. We further
generalize pdPINNs to different settings that allow interpreting a positive
scalar quantity as a particle density, such as the evolution of the temperature
in the heat equation. The utility of our approach is demonstrated on
experiments for modeling (non-steady) compressible fluids in up to three
dimensions and a two-dimensional diffusion problem, illustrating the high
flexibility and sample efficiency compared to existing refinement methods for
PINNs.
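A minimal sketch of the sampling mechanism described above: the modeled density is treated as an unnormalized probability distribution, and collocation points are drawn from it with a random-walk Metropolis sampler. The paper itself applies dynamic Monte Carlo methods to the PINN's predicted density; the toy density rho, the sampler, and all parameter values below are illustrative assumptions, not the authors' implementation.
```python
import numpy as np

def rho(x):
    """Stand-in for the PINN's predicted fluid density rho_theta(x) >= 0.
    Here: an unnormalized two-bump density on R^2 (illustrative only)."""
    return (np.exp(-0.5 * np.sum((x - 2.0) ** 2, axis=-1))
            + np.exp(-0.5 * np.sum((x + 2.0) ** 2, axis=-1)))

def metropolis_collocation_points(n_points, dim=2, step=0.5, burn_in=500, seed=0):
    """Random-walk Metropolis: draw collocation points with probability
    proportional to rho, so PDE residuals are evaluated where mass is."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    samples = []
    for i in range(burn_in + n_points):
        proposal = x + step * rng.standard_normal(dim)
        # Accept with probability min(1, rho(proposal) / rho(x));
        # the unknown normalizing constant cancels in the ratio.
        if rng.random() < rho(proposal) / max(rho(x), 1e-12):
            x = proposal
        if i >= burn_in:
            samples.append(x.copy())
    return np.stack(samples)

pts = metropolis_collocation_points(1000)
print(pts.shape, pts.mean(axis=0))  # points cluster around the two modes
```
Because acceptance depends only on density ratios, no bounding box and no normalizing constant are ever required: samples adaptively concentrate in the regions where the modeled density places mass.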
Related papers
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density by transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Sampling in Unit Time with Kernel Fisher-Rao Flow [0.0]
We introduce a new mean-field ODE and corresponding interacting particle systems (IPS) for sampling from an unnormalized target density.
The IPS are gradient-free, available in closed form, and only require the ability to sample from a reference density and compute the (unnormalized) target-to-reference density ratio.
arXiv Detail & Related papers (2024-01-08T13:43:56Z)
- Online Neural Path Guiding with Normalized Anisotropic Spherical Gaussians [20.68953631807367]
We propose a novel online framework to learn a spatially-varying density model with a single small neural network.
Our framework learns the distribution in a progressive manner and does not need any warm-up phases.
arXiv Detail & Related papers (2023-03-11T05:22:42Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency (a toy instance of this probabilistic representation is sketched after this list).
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD (a minimal SVGD baseline is sketched after this list).
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- A DeepParticle method for learning and generating aggregation patterns in multi-dimensional Keller-Segel chemotaxis systems [3.6184545598911724]
We study a regularized interacting particle method for computing aggregation patterns and near-singular solutions of a Keller-Segel (KS) chemotaxis system in two and three space dimensions.
We further develop DeepParticle (DP) method to learn and generate solutions under variations of physical parameters.
arXiv Detail & Related papers (2022-08-31T20:52:01Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- DeepParticle: learning invariant measure by a deep neural network minimizing Wasserstein distance on data generated from an interacting particle method [3.6310242206800667]
We introduce the so called DeepParticle method to learn and generate invariant measures of dynamical systems.
We use deep neural networks (DNNs) to represent the transform of samples from a given input (source) distribution to an arbitrary target distribution.
In training, we update the network weights to minimize a discrete Wasserstein distance between the input and target samples (a minimal assignment-based computation of this distance is sketched after this list).
arXiv Detail & Related papers (2021-11-02T03:48:58Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- PU-Flow: a Point Cloud Upsampling Network with Normalizing Flows [58.96306192736593]
We present PU-Flow, which incorporates normalizing flows and weight prediction techniques to produce dense points uniformly distributed on the underlying surface.
Specifically, we formulate the upsampling process as point interpolation in a latent space, where the interpolation weights are adaptively learned from local geometric context.
We show that our method outperforms state-of-the-art deep learning-based approaches in terms of reconstruction quality, proximity-to-surface accuracy, and computation efficiency.
arXiv Detail & Related papers (2021-07-13T07:45:48Z)
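To make the Monte Carlo Neural PDE Solver entry concrete: for the heat equation u_t = kappa * u_xx, the probabilistic representation regards the solution as an ensemble average over random particles, u(x, t) = E[u0(x + sqrt(2*kappa*t) * Z)] with Z ~ N(0, 1). The sketch below checks this estimator against a closed-form solution; it is a minimal illustration of the representation, not the paper's solver, and the function names are assumptions.
```python
import numpy as np

def heat_mc(u0, x, t, kappa=1.0, n_samples=100_000, seed=0):
    """Monte Carlo estimate of u(x, t) for u_t = kappa * u_xx via the
    probabilistic (Feynman-Kac) representation:
        u(x, t) = E[ u0(x + sqrt(2 * kappa * t) * Z) ],  Z ~ N(0, 1).
    The macroscopic solution is an ensemble average over random particles."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)
    return u0(x + np.sqrt(2.0 * kappa * t) * z).mean()

# A Gaussian initial condition has a closed-form solution to check against:
# u(x, t) = exp(-x^2 / (2 + 4*kappa*t)) / sqrt(1 + 2*kappa*t).
u0 = lambda x: np.exp(-x ** 2 / 2.0)
x, t, kappa = 0.5, 0.3, 1.0
mc = heat_mc(u0, x, t, kappa)
exact = np.exp(-x ** 2 / (2 + 4 * kappa * t)) / np.sqrt(1 + 2 * kappa * t)
print(mc, exact)  # the two should agree to roughly three decimal places
```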
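The Mollified Interaction Energy Descent entry compares against SVGD, the standard particle-based baseline. Below is a minimal sketch of one SVGD update (Liu & Wang, 2016) with an RBF kernel and median-heuristic bandwidth; it illustrates the baseline, not MIED itself, and the parameter values are assumptions.
```python
import numpy as np

def svgd_step(x, grad_log_p, eps=0.1):
    """One Stein variational gradient descent update with an RBF kernel.
    x: (n, d) particles; grad_log_p: maps (n, d) -> (n, d) scores."""
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]        # diff[i, j] = x_i - x_j
    sq = (diff ** 2).sum(-1)                    # pairwise squared distances
    h = np.median(sq) / np.log(n + 1) + 1e-12   # median-heuristic bandwidth
    k = np.exp(-sq / h)                         # kernel matrix k(x_j, x_i)
    # Driving term pulls particles toward high density; the kernel-gradient
    # (repulsive) term keeps them from collapsing onto a single mode.
    phi = (k @ grad_log_p(x) + (2.0 / h) * (diff * k[..., None]).sum(1)) / n
    return x + eps * phi

# Toy target: standard normal in 2-D, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 2)) * 3 + 5       # badly initialized particles
for _ in range(500):
    x = svgd_step(x, lambda x: -x)
print(x.mean(0), x.std(0))                      # should approach 0 and 1
```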
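For the two DeepParticle entries, training minimizes a discrete Wasserstein distance between transformed source samples and target samples. For equal-size point sets this distance can be computed exactly as a minimum-cost assignment; the sketch below uses SciPy's Hungarian-algorithm solver and illustrates the loss computation under assumed variable names, not the papers' mini-batch training procedure.
```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def discrete_w2(x, y):
    """Squared 2-Wasserstein distance between two equal-size point sets,
    computed exactly as a minimum-cost assignment. In DeepParticle-style
    training, x would be network-transformed source samples and y target
    samples; this assignment cost serves as the training loss."""
    cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    rows, cols = linear_sum_assignment(cost)               # optimal matching
    return cost[rows, cols].mean()

rng = np.random.default_rng(0)
src = rng.standard_normal((256, 2))          # e.g. pushed-forward samples
tgt = rng.standard_normal((256, 2)) + 1.0    # samples of the target measure
print(discrete_w2(src, tgt))  # decreases as the transform fits the target
```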
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.