Partition of Unity Physics-Informed Neural Networks (POU-PINNs): An Unsupervised Framework for Physics-Informed Domain Decomposition and Mixtures of Experts
- URL: http://arxiv.org/abs/2412.06842v1
- Date: Sat, 07 Dec 2024 16:07:43 GMT
- Title: Partition of Unity Physics-Informed Neural Networks (POU-PINNs): An Unsupervised Framework for Physics-Informed Domain Decomposition and Mixtures of Experts
- Authors: Arturo Rodriguez, Ashesh Chattopadhyay, Piyush Kumar, Luis F. Rodriguez, Vinod Kumar
- Abstract summary: This study presents a novel unsupervised learning framework that identifies spatial subdomains with specific governing physics.
A vital feature of this method is a physics residual-based loss function that detects variations in physical properties without requiring labeled data.
Its effectiveness is demonstrated through applications in porous media thermal ablation and ice-sheet modeling.
- Score: 6.179530974508392
- License:
- Abstract: Physics-informed neural networks (PINNs) commonly address ill-posed inverse problems by uncovering unknown physics. This study presents a novel unsupervised learning framework that identifies spatial subdomains with specific governing physics. It uses the partition of unity networks (POUs) to divide the space into subdomains, assigning unique nonlinear model parameters to each, which are integrated into the physics model. A vital feature of this method is a physics residual-based loss function that detects variations in physical properties without requiring labeled data. This approach enables the discovery of spatial decompositions and nonlinear parameters in partial differential equations (PDEs), optimizing the solution space by dividing it into subdomains and improving accuracy. Its effectiveness is demonstrated through applications in porous media thermal ablation and ice-sheet modeling, showcasing its potential for tackling real-world physics challenges.
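The sketch below illustrates the core idea described in the abstract, not the authors' implementation: a softmax "partition of unity" network assigns each point to one of K subdomains, each subdomain carries its own learnable physics parameter, and training minimizes only the PDE residual (no labeled data). The toy 1D diffusion equation, network sizes, and all names below are illustrative assumptions.

```python
# Minimal POU-PINN sketch (assumed, not the paper's code) for a 1D steady
# diffusion problem d/dx( kappa(x) du/dx ) = f(x). Boundary terms are omitted
# for brevity; a full model would add boundary-condition losses.
import torch
import torch.nn as nn

K = 3  # assumed number of subdomains

class POUPINN(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Network for the PDE solution u(x)
        self.u_net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                   nn.Linear(hidden, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1))
        # Partition-of-unity network: logits over K subdomains
        self.pou_net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                     nn.Linear(hidden, K))
        # One learnable physics parameter (log-conductivity) per subdomain
        self.log_kappa = nn.Parameter(torch.zeros(K))

    def forward(self, x):
        u = self.u_net(x)
        w = torch.softmax(self.pou_net(x), dim=-1)     # weights sum to 1 (partition of unity)
        kappa = (w * torch.exp(self.log_kappa)).sum(-1, keepdim=True)  # blended kappa(x)
        return u, kappa

def physics_residual(model, x, f):
    # Residual of d/dx( kappa(x) du/dx ) - f(x), computed with autograd
    x = x.requires_grad_(True)
    u, kappa = model(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    flux = kappa * du
    dflux = torch.autograd.grad(flux.sum(), x, create_graph=True)[0]
    return dflux - f(x)

model = POUPINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
f = lambda x: torch.ones_like(x)      # assumed forcing term
for step in range(1000):
    x = torch.rand(256, 1)            # collocation points in (0, 1)
    loss = physics_residual(model, x, f).pow(2).mean()   # unsupervised physics loss
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the loss depends only on the physics residual, the partition weights and per-subdomain parameters are discovered without labels, which is the unsupervised domain-decomposition behavior the abstract describes.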
Related papers
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs)
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - Neural Astrophysical Wind Models [0.0]
We show that deep neural networks embedded as individual terms in the governing coupled ordinary differential equations (ODEs) can robustly discover both of these physics.
We optimize a loss function based on the Mach number, rather than the three explicitly solved-for conserved variables, and apply a penalty term towards near-diverging solutions.
This work further highlights the feasibility of neural ODEs as a promising discovery tool with mechanistic interpretability for non-linear inverse problems.
arXiv Detail & Related papers (2023-06-20T16:37:57Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Understanding the Difficulty of Training Physics-Informed Neural
Networks on Dynamical Systems [5.878411350387833]
Physics-informed neural networks (PINNs) seamlessly integrate data and physical constraints into the solving of problems governed by differential equations.
We study the physics loss function in the vicinity of fixed points of dynamical systems.
We find that reducing the computational domain lowers the optimization complexity and the chance of getting trapped in nonphysical solutions.
arXiv Detail & Related papers (2022-03-25T13:50:14Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the temporal evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulty resolving localized effects and strongly nonlinear solutions through optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Learning in Sinusoidal Spaces with Physics-Informed Neural Networks [22.47355575565345]
A physics-informed neural network (PINN) uses physics-augmented loss functions to ensure its output is consistent with fundamental physics laws.
It turns out to be difficult to train an accurate PINN model for many problems in practice.
arXiv Detail & Related papers (2021-09-20T07:42:41Z) - AdjointNet: Constraining machine learning models with physics-based
codes [0.17205106391379021]
This paper proposes a physics-constrained machine learning framework, AdjointNet, allowing domain scientists to embed their physics code in neural network training.
We show that the proposed AdjointNet framework can be used for parameter estimation (and uncertainty quantification by extension) and experimental design using active learning.
arXiv Detail & Related papers (2021-09-08T22:43:44Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - Solving inverse-PDE problems with physics-aware neural networks [0.0]
We propose a novel framework to find unknown fields in the context of inverse problems for partial differential equations.
We blend the high expressibility of deep neural networks as universal function estimators with the accuracy and reliability of existing numerical algorithms.
arXiv Detail & Related papers (2020-01-10T18:46:50Z)