Advancing Generalization in PINNs through Latent-Space Representations
- URL: http://arxiv.org/abs/2411.19125v1
- Date: Thu, 28 Nov 2024 13:16:20 GMT
- Title: Advancing Generalization in PINNs through Latent-Space Representations
- Authors: Honghui Wang, Yifan Pu, Shiji Song, Gao Huang
- Abstract summary: Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs).
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
- Score: 71.86401914779019
- Abstract: Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). However, their generalization capabilities across varying scenarios remain limited. To overcome this limitation, we propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations, including varying initial conditions, PDE coefficients, and training time horizons. PIDO exploits the shared underlying structure of dynamical systems with different properties by projecting PDE solutions into a latent space using auto-decoding. It then learns the dynamics of these latent representations, conditioned on the PDE coefficients. Despite its promise, integrating latent dynamics models within a physics-informed framework poses challenges due to the optimization difficulties associated with physics-informed losses. To address these challenges, we introduce a novel approach that diagnoses and mitigates these issues within the latent space. This strategy employs straightforward yet effective regularization techniques, enhancing both the temporal extrapolation performance and the training stability of PIDO. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations. Additionally, we demonstrate the transferability of its learned representations to downstream applications such as long-term integration and inverse problems.
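To make the pipeline concrete, here is a minimal PyTorch sketch of the three ingredients the abstract describes: per-trajectory latent codes obtained by auto-decoding, latent dynamics conditioned on PDE coefficients, and a decoder from latent state and coordinates back to the solution. All module shapes, names, and the Euler integrator are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class LatentDynamics(nn.Module):
    """Latent dynamics conditioned on PDE coefficients (illustrative sizes)."""
    def __init__(self, latent_dim=64, coef_dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + coef_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, latent_dim))

    def forward(self, z, coefs):
        # dz/dt as a function of the current latent state and the coefficients
        return self.net(torch.cat([z, coefs], dim=-1))

class Decoder(nn.Module):
    """Maps (latent state, spatial coordinate) to a solution value."""
    def __init__(self, latent_dim=64, coord_dim=1, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + coord_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 1))

    def forward(self, z, x):
        return self.net(torch.cat([z, x], dim=-1))

# Auto-decoding: each training trajectory owns a trainable latent code z0,
# optimized jointly with the networks instead of produced by an encoder.
n_traj, latent_dim = 16, 64
z0 = nn.Parameter(torch.zeros(n_traj, latent_dim))
dynamics, decoder = LatentDynamics(latent_dim), Decoder(latent_dim)

def rollout(z, coefs, dt, steps):
    """Explicit-Euler rollout of the latent state (simplest integrator)."""
    states = [z]
    for _ in range(steps):
        z = z + dt * dynamics(z, coefs)
        states.append(z)
    return torch.stack(states, dim=1)  # (n_traj, steps + 1, latent_dim)
```

A physics-informed loss would then differentiate the decoder output with respect to coordinates and time via autograd, with the paper's latent-space regularization added on top of this skeleton.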
Related papers
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a broad range of PDEs.
Our key finding is that a PDE solution is fundamentally controlled by a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
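The Unisolver summary above hinges on conditioning a transformer on PDE components. A hedged sketch of one common way to do this, modulating each block with a component embedding via FiLM-style scale and shift; the conditioning mechanism Unisolver actually uses may differ.

```python
import torch
import torch.nn as nn

class ConditionedBlock(nn.Module):
    """Transformer block whose normalization is modulated by an embedding of
    PDE components (coefficients, boundary-condition type, etc.)."""
    def __init__(self, dim=128, heads=4, cond_dim=32):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))
        # Per-channel scale/shift produced from the PDE-component embedding.
        self.film = nn.Linear(cond_dim, 2 * dim)

    def forward(self, tokens, cond):      # tokens: (B, seq, dim); cond: (B, cond_dim)
        scale, shift = self.film(cond).chunk(2, dim=-1)
        h = self.norm1(tokens) * (1 + scale.unsqueeze(1)) + shift.unsqueeze(1)
        tokens = tokens + self.attn(h, h, h, need_weights=False)[0]
        return tokens + self.mlp(self.norm2(tokens))
```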
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
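The core CoDA-NO idea above, tokenizing along the codomain so each physical variable becomes one token, can be sketched as follows. The real model attends with neural-operator layers over function-space tokens rather than this plain attention, so treat it as a structural sketch only.

```python
import torch
import torch.nn as nn

class CodomainTokens(nn.Module):
    """Treat each physical variable (channel) of a multiphysics field as one
    token, so attention mixes information across variables."""
    def __init__(self, grid_points, dim=128, heads=4):
        super().__init__()
        self.embed = nn.Linear(grid_points, dim)   # one embedding per channel
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.proj = nn.Linear(dim, grid_points)

    def forward(self, u):                 # u: (batch, channels, grid_points)
        tokens = self.embed(u)            # (batch, channels, dim)
        mixed, _ = self.attn(tokens, tokens, tokens)
        return self.proj(mixed)           # back to (batch, channels, grid_points)

# A variable number of channels (e.g., velocity + pressure vs. velocity only)
# is handled naturally, since channels form the sequence dimension.
fluid = torch.randn(2, 3, 1024)           # u, v, p on a flattened grid
out = CodomainTokens(grid_points=1024)(fluid)
```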
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
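The deep-equilibrium construction behind FNO-DEQ iterates a weight-tied layer to a fixed point, which plays the role of the steady-state solution. A minimal sketch with an MLP standing in for the Fourier layer; training a DEQ normally differentiates through the fixed point implicitly, which is omitted here.

```python
import torch
import torch.nn as nn

class EquilibriumSolver(nn.Module):
    """Iterate a weight-tied map f to a fixed point z* = f(z*, x)."""
    def __init__(self, dim=64):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh(),
                               nn.Linear(dim, dim))

    def forward(self, x, max_iter=50, tol=1e-4):
        z = torch.zeros_like(x)            # x: (batch, dim) encoded input
        for _ in range(max_iter):
            z_next = self.f(torch.cat([z, x], dim=-1))
            if (z_next - z).norm() < tol * (z.norm() + 1e-8):
                return z_next              # converged fixed point
            z = z_next
        return z
```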
- LatentPINNs: Generative physics-informed neural networks via a latent representation learning [0.0]
We introduce latentPINN, a framework that utilizes latent representations of the PDE parameters as additional inputs (alongside the coordinates) to PINNs.
We use a two-stage training scheme: in the first stage, we learn latent representations for the distribution of PDE parameters.
In the second stage, we train a physics-informed neural network on coordinates randomly sampled from within the solution domain.
arXiv Detail & Related papers (2023-05-11T16:54:17Z)
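A minimal sketch of the two-stage scheme described above, assuming a plain autoencoder for stage one; the paper's actual generative model for the parameter distribution may differ, and all sizes are placeholders.

```python
import torch
import torch.nn as nn

# Stage 1 (assumed autoencoder): compress flattened PDE parameter fields
# into latent codes.
ae_enc = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 16))
ae_dec = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 256))

def stage1_loss(param_fields):            # param_fields: (batch, 256)
    z = ae_enc(param_fields)
    return ((ae_dec(z) - param_fields) ** 2).mean()

# Stage 2: the PINN receives coordinates plus the (frozen) latent code, so a
# single trained network covers the whole family of PDE parameters.
pinn = nn.Sequential(nn.Linear(2 + 16, 64), nn.Tanh(),
                     nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))

def pinn_forward(coords, param_fields):   # coords: (n, 2), e.g. (x, t)
    with torch.no_grad():                 # latents are fixed in stage 2
        z = ae_enc(param_fields)          # (n, 16)
    return pinn(torch.cat([coords, z], dim=-1))
```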
- PIXEL: Physics-Informed Cell Representations for Fast and Accurate PDE Solvers [4.1173475271436155]
We propose a new kind of data-driven PDE solver: physics-informed cell representations (PIXEL).
PIXEL elegantly combines classical numerical methods and learning-based approaches.
We show that PIXEL achieves fast convergence and high accuracy.
arXiv Detail & Related papers (2022-07-26T10:46:56Z)
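The cell-representation idea above can be sketched as a learnable grid queried by interpolation. Bilinear sampling is used here for simplicity; a physics-informed loss in practice needs an interpolant with nonvanishing higher-order derivatives, and PIXEL's actual cell and interpolation scheme may differ.

```python
import torch
import torch.nn.functional as F

# Learnable cell grid: the trainable parameters live on the grid, and the
# solution at arbitrary coordinates is read off by interpolation.
grid = torch.randn(1, 1, 32, 32, requires_grad=True)     # (N, C, H, W) cells

def u(coords):
    """coords: (n, 2) in [-1, 1]^2 -> interpolated solution values (n,)."""
    pts = coords.view(1, -1, 1, 2)                        # grid_sample layout
    vals = F.grid_sample(grid, pts, mode='bilinear', align_corners=True)
    return vals.view(-1)

xy = (torch.rand(128, 2) * 2 - 1).requires_grad_(True)
sol = u(xy)
# PDE losses differentiate `sol` w.r.t. `xy` with autograd exactly as in a
# standard PINN, but optimization targets the local cell values rather than
# the weights of a global MLP.
```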
- Mitigating Learning Complexity in Physics and Equality Constrained Artificial Neural Networks [0.9137554315375919]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
In PINNs, the residual form of the PDE of interest and its boundary conditions are lumped into a composite objective function as soft penalties.
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach when applied to different kinds of PDEs.
arXiv Detail & Related papers (2022-06-19T04:12:01Z)
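The soft-penalty formulation criticized above is the standard PINN objective. For reference, a minimal instance for a 1D Poisson problem; the hand-tuned weight `lam` is exactly the kind of fixed penalty the paper argues limits the approach across PDE types.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def composite_loss(lam=10.0):
    """Soft-penalty objective for u''(x) = -sin(pi x) on [0, 1] with
    u(0) = u(1) = 0: PDE residual plus a weighted boundary term."""
    x = torch.rand(256, 1).requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = d2u + torch.sin(torch.pi * x)       # interior PDE residual
    xb = torch.tensor([[0.0], [1.0]])              # boundary points
    return (residual ** 2).mean() + lam * (net(xb) ** 2).mean()
```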
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives that effectively learn such latent dynamics and ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
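The latent-evolution loop described above reduces to: encode the full state once, repeatedly update a small latent vector, and decode on demand, which is where the reported dimension and speed reductions come from. A sketch with placeholder sizes; the paper's stability objectives (e.g., multi-step latent consistency) would be added on top.

```python
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(4096, 256), nn.ReLU(), nn.Linear(256, 32))
evolver = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 32))
decoder = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 4096))

def simulate(u0, steps):
    """u0: (batch, 4096) flattened field -> decoded trajectory."""
    z = encoder(u0)                # encode once
    traj = []
    for _ in range(steps):
        z = z + evolver(z)         # evolve only the 32-dim latent state
        traj.append(z)
    # decode on demand; (batch, steps, 4096)
    return torch.stack([decoder(z) for z in traj], dim=1)
```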
- Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics [11.981731023317945]
Physics-informed deep learning (PiDL) addresses the limitations of purely data-driven models by incorporating physical principles into the model.
We propose to leverage physics prior knowledge by "baking" the discretized governing equations into the neural network architecture.
This method, which embeds discretized PDEs through convolutional residual networks in a multi-resolution setting, substantially improves generalizability and long-term prediction accuracy.
arXiv Detail & Related papers (2022-05-09T01:27:58Z)
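"Baking in" a discretized equation, as described above, can be illustrated by fixing a finite-difference stencil as a non-trainable convolution kernel inside a residual update, with a small trainable convolution as a learned correction. A sketch for a diffusion term; the paper's multi-resolution machinery is omitted, and the coefficients are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# 5-point finite-difference Laplacian as a fixed conv kernel.
laplace = torch.tensor([[0., 1., 0.],
                        [1., -4., 1.],
                        [0., 1., 0.]]).view(1, 1, 3, 3)

class PDEResidualBlock(nn.Module):
    def __init__(self, nu=0.1, dt=0.01, dx=1.0):
        super().__init__()
        self.register_buffer("stencil", laplace / dx ** 2)  # non-trainable
        self.correction = nn.Conv2d(1, 1, 3, padding=1)     # learned closure
        self.nu, self.dt = nu, dt

    def forward(self, u):                  # u: (batch, 1, H, W)
        physics = self.nu * F.conv2d(u, self.stencil, padding=1)
        return u + self.dt * (physics + self.correction(u))
```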
- Encoding physics to learn reaction-diffusion processes [18.187800601192787]
We show how a deep learning framework that encodes a given physics structure can be applied to a variety of problems across PDE system regimes.
The resulting physics-encoded learning paradigm shows high accuracy, robustness, interpretability, and generalizability, as demonstrated via extensive numerical experiments.
arXiv Detail & Related papers (2021-06-09T03:02:20Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
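A guess at what the "dual Neural Networks" ansatz above might look like: parallel periodic and sigmoidal branches summed at the output, so oscillatory and smooth solution content are captured by different bases. This branch structure is an assumption inferred from the one-line description, not a confirmed reconstruction of dNNsolve.

```python
import torch
import torch.nn as nn

class DualBranchSolver(nn.Module):
    """Two parallel branches with different activation bases, summed at the
    output head (an assumed reading of the 'dual network' idea)."""
    def __init__(self, in_dim=2, hidden=32):
        super().__init__()
        self.periodic = nn.Linear(in_dim, hidden)    # sine branch
        self.smooth = nn.Sequential(nn.Linear(in_dim, hidden), nn.Sigmoid())
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                 # x: (n, in_dim) spacetime points
        p = torch.sin(self.periodic(x))   # oscillatory content
        s = self.smooth(x)                # smooth/secular content
        return self.head(torch.cat([p, s], dim=-1))
```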