Learning and discovering multiple solutions using physics-informed neural networks with random initialization and deep ensemble
- URL: http://arxiv.org/abs/2503.06320v1
- Date: Sat, 08 Mar 2025 19:32:22 GMT
- Title: Learning and discovering multiple solutions using physics-informed neural networks with random initialization and deep ensemble
- Authors: Zongren Zou, Zhicheng Wang, George Em Karniadakis
- Abstract summary: We explore the capability of physics-informed neural networks (PINNs) to discover multiple solutions. PINNs can effectively uncover multiple solutions to nonlinear ordinary and partial differential equations. We propose utilizing PINN-generated solutions as initial conditions or initial guesses for conventional numerical solvers.
- Score: 10.047968926134363
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We explore the capability of physics-informed neural networks (PINNs) to discover multiple solutions. Many real-world phenomena governed by nonlinear differential equations (DEs), such as fluid flow, exhibit multiple solutions under the same conditions, yet capturing this solution multiplicity remains a significant challenge. A key difficulty is giving appropriate initial conditions or initial guesses, to which the widely used time-marching schemes and Newton's iteration method are very sensitive in finding solutions for complex computational problems. While machine learning models, particularly PINNs, have shown promise in solving DEs, their ability to capture multiple solutions remains underexplored. In this work, we propose a simple and practical approach using PINNs to learn and discover multiple solutions. We first reveal that PINNs, when combined with random initialization and deep ensemble method -- originally developed for uncertainty quantification -- can effectively uncover multiple solutions to nonlinear ordinary and partial differential equations (ODEs/PDEs). Our approach highlights the critical role of initialization in shaping solution diversity, addressing an often-overlooked aspect of machine learning for scientific computing. Furthermore, we propose utilizing PINN-generated solutions as initial conditions or initial guesses for conventional numerical solvers to enhance accuracy and efficiency in capturing multiple solutions. Extensive numerical experiments, including the Allen-Cahn equation and cavity flow, where our approach successfully identifies both stable and unstable solutions, validate the effectiveness of our method. These findings establish a general and efficient framework for addressing solution multiplicity in nonlinear differential equations.
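The recipe described in the abstract — train several identically configured PINNs that differ only in their random initialization (a deep ensemble) and inspect the distinct solutions the members converge to — is simple enough to sketch. Below is a minimal, hedged illustration rather than the authors' code: the test problem (a 1D steady Allen-Cahn boundary value problem with hard-enforced zero boundary conditions), the network size, the value of eps, and the training settings are all assumptions chosen for brevity, not details taken from the paper.

```python
# Minimal sketch of the deep-ensemble PINN idea (not the authors' released code).
# Hypothetical test problem: the 1D steady Allen-Cahn BVP
#   eps^2 u''(x) + u(x) - u(x)^3 = 0,  u(0) = u(1) = 0,
# which admits multiple solutions (e.g. u = 0 and bump-shaped states) for small eps.
import torch

torch.set_default_dtype(torch.float64)
EPS = 0.05          # assumed diffusion parameter
N_ENSEMBLE = 8      # number of independently initialized PINNs
N_COLLOC = 128      # collocation points in (0, 1)

def make_pinn():
    # small fully connected network N_theta(x)
    return torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1),
    )

def pinn_solution(net, x):
    # hard-enforce u(0) = u(1) = 0 via the ansatz u(x) = x (1 - x) N_theta(x)
    return x * (1.0 - x) * net(x)

def residual(net, x):
    # PDE residual eps^2 u'' + u - u^3 evaluated by automatic differentiation
    u = pinn_solution(net, x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return EPS**2 * d2u + u - u**3

def train_one(seed, steps=5000):
    # the random seed is the only thing that differs across ensemble members
    torch.manual_seed(seed)
    net = make_pinn()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    x = torch.rand(N_COLLOC, 1, requires_grad=True)
    for _ in range(steps):
        opt.zero_grad()
        loss = residual(net, x).pow(2).mean()
        loss.backward()
        opt.step()
    return net

if __name__ == "__main__":
    x_plot = torch.linspace(0.0, 1.0, 201).reshape(-1, 1).requires_grad_(True)
    for seed in range(N_ENSEMBLE):
        net = train_one(seed)
        u = pinn_solution(net, x_plot).detach()
        # different seeds may land on different steady states; summarize each
        # member by its midpoint value and its mean absolute PDE residual
        res = residual(net, x_plot).abs().mean().item()
        print(f"seed {seed}: u(0.5) = {u[100].item():+.3f}, mean |residual| = {res:.2e}")
```

In the paper's second proposal, each ensemble member's output would additionally be handed to a conventional numerical solver (for example, as the initial guess for a Newton iteration on a discretized system) to refine accuracy; that refinement stage is omitted from this sketch.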
Related papers
- Neuro-Symbolic AI for Analytical Solutions of Differential Equations [11.177091143370466]
We present an approach to find analytical solutions of differential equations using a neuro-symbolic AI framework.
This integration unifies numerical and symbolic differential equation solvers via a neuro-symbolic AI framework.
We show advantages over commercial solvers, symbolic methods, and approximate neural networks on a diverse set of problems.
arXiv Detail & Related papers (2025-02-03T16:06:56Z) - A Block-Coordinate Approach of Multi-level Optimization with an Application to Physics-Informed Neural Networks [0.0]
We propose a multi-level algorithm for the solution of nonlinear optimization problems and analyze its evaluation complexity.
We apply it to the solution of partial differential equations using physics-informed neural networks (PINNs) and show on a few test problems that the approach results in better solutions and significant computational savings.
arXiv Detail & Related papers (2023-05-23T19:12:02Z) - A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z) - HomPINNs: homotopy physics-informed neural networks for solving the inverse problems of nonlinear differential equations with multiple solutions [6.89453634946458]
We propose homotopy physics-informed neural networks (HomPINNs) to solve inverse problems of nonlinear differential equations (DEs).
The proposed framework begins with the use of NNs to simultaneously approximate unlabeled observations across diverse solutions while adhering to DE constraints.
Our findings demonstrate that the proposed method is scalable and adaptable, providing an effective solution for solving DEs with multiple solutions and unknown parameters.
arXiv Detail & Related papers (2023-04-06T01:20:23Z) - Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Learning Proximal Operators to Discover Multiple Optima [66.98045013486794]
We present an end-to-end method to learn the proximal operator across a family of non-convex problems.
We show that for weakly convex objectives and under mild conditions, the method converges globally.
arXiv Detail & Related papers (2022-01-28T05:53:28Z) - Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - Solving Differential Equations Using Neural Network Solution Bundles [1.2891210250935146]
We propose that a neural network be used as a solution bundle, a collection of solutions to an ODE for various initial states and system parameters.
The solution bundle exhibits fast, parallelizable evaluation of the system state, facilitating the use of Bayesian inference for parameter estimation.
arXiv Detail & Related papers (2020-06-17T02:44:10Z)