Conditional physics informed neural networks
- URL: http://arxiv.org/abs/2104.02741v1
- Date: Tue, 6 Apr 2021 18:29:14 GMT
- Title: Conditional physics informed neural networks
- Authors: Alexander Kovacs, Lukas Exl, Alexander Kornell, Johann Fischbacher,
Markus Hovorka, Markus Gusenbauer, Leoni Breth, Harald Oezelt, Masao Yano,
Noritsugu Sakuma, Akihito Kinoshita, Tetsuya Shoji, Akira Kato, Thomas
Schrefl
- Abstract summary: We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
- Score: 85.48030573849712
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce conditional PINNs (physics informed neural networks) for
estimating the solution of classes of eigenvalue problems. The concept of PINNs
is expanded to learn not only the solution of one particular differential
equation but the solutions to a class of problems. We demonstrate this idea by
estimating the coercive field of permanent magnets, which depends on the width
and strength of local defects. When the neural network incorporates the physics
of magnetization reversal, training can be achieved in an unsupervised way.
There is no need to generate labeled training data. The presented test cases
have been rigorously studied in the past. Thus, a detailed and easy comparison
with analytical solutions is made. We show that a single deep neural network
can learn the solution of partial differential equations for an entire class of
problems.
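
The recipe described above can be illustrated with a short, self-contained sketch: a single network receives both the spatial coordinate and the conditioning parameters (defect width and strength) as inputs and is trained only on a physics residual, so no labeled data are needed. This is not the authors' code; the 1D residual, the Gaussian defect term and the parameter ranges below are hypothetical placeholders, not the micromagnetic model used in the paper.

```python
# Minimal sketch (not the authors' code) of a conditional PINN: one network
# takes the coordinate x and the problem parameters (here a hypothetical
# defect "width" and "strength") as joint inputs and is trained only on a
# physics residual, without labeled data.
import torch
import torch.nn as nn

class ConditionalPINN(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, width, strength):
        # Condition the solution on the problem parameters by concatenation.
        return self.net(torch.cat([x, width, strength], dim=-1))

def residual(model, x, width, strength):
    # Placeholder 1D residual u''(x) - f(x; width, strength); the paper's
    # micromagnetic physics would go here instead.
    x = x.requires_grad_(True)
    u = model(x, width, strength)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    f = strength * torch.exp(-(x / width) ** 2)  # hypothetical defect profile
    return d2u - f

model = ConditionalPINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    # Sample collocation points AND problem parameters in every step, so a
    # single network learns the whole class of problems at once.
    x = torch.rand(256, 1)
    width = 0.05 + 0.25 * torch.rand(256, 1)
    strength = 2.0 * torch.rand(256, 1) - 1.0
    loss = residual(model, x, width, strength).pow(2).mean()
    # Boundary-condition terms, omitted here for brevity, would be added
    # to this unsupervised loss in the same way.
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained, the same weights can be evaluated at new parameter values to obtain solutions for the entire problem class without retraining.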
Related papers
- The Unreasonable Effectiveness of Solving Inverse Problems with Neural Networks [24.766470360665647]
We show that neural networks trained to learn solutions to inverse problems can find better solutions than classical solvers, even on their training set.
Our findings suggest an alternative use for neural networks: rather than generalizing to new data for fast inference, they can also be used to find better solutions on known data.
arXiv Detail & Related papers (2024-08-15T12:38:10Z)
- Improving PINNs By Algebraic Inclusion of Boundary and Initial Conditions [0.1874930567916036]
"AI for Science" aims to solve fundamental scientific problems using AI techniques.
In this work we explore the possibility of changing the model being trained from being just a neural network to being a non-linear transformation of it.
This reduces the number of terms in the loss function compared to the standard PINN losses (a sketch of this idea is given after this list).
arXiv Detail & Related papers (2024-07-30T11:19:48Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Evaluating Error Bound for Physics-Informed Neural Networks on Linear Dynamical Systems [1.2891210250935146]
This paper shows that one can mathematically derive explicit error bounds for physics-informed neural networks trained on a class of linear systems of differential equations.
Our work shows a link between the network residuals, which are known and used as the loss function, and the absolute error of the solution, which is generally unknown (a worked illustration of this link appears after this list).
arXiv Detail & Related papers (2022-07-03T20:23:43Z)
- Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z)
- Physics-Informed Neural Networks for Quantum Eigenvalue Problems [1.2891210250935146]
Eigenvalue problems are critical to several fields of science and engineering.
We use unsupervised neural networks for discovering eigenfunctions and eigenvalues for differential eigenvalue problems.
The network optimization is data-free and depends solely on the predictions of the neural network.
arXiv Detail & Related papers (2022-02-24T18:29:39Z)
- Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to their global approximation, physics informed neural networks have difficulty resolving localized effects and strongly non-linear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Learning in Sinusoidal Spaces with Physics-Informed Neural Networks [22.47355575565345]
A physics-informed neural network (PINN) uses physics-augmented loss functions to ensure its output is consistent with fundamental physics laws.
It turns out to be difficult to train an accurate PINN model for many problems in practice.
arXiv Detail & Related papers (2021-09-20T07:42:41Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
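
As referenced above, the entry on algebraic inclusion of boundary and initial conditions describes training a non-linear transformation of the network rather than the raw network, so that the constraints hold by construction and the corresponding terms disappear from the loss. A minimal sketch of that general idea (not the paper's specific construction), assuming a 1D problem with prescribed boundary values u(0)=a and u(1)=b:

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def u_hat(x, a=0.0, b=1.0):
    # Transformed model: u_hat(0) = a and u_hat(1) = b hold for any network
    # weights, so only the PDE residual remains in the training loss.
    return a * (1 - x) + b * x + x * (1 - x) * net(x)
```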
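
Likewise, the link between the network residual and the generally unknown absolute error noted in the error-bound entry can be made concrete for a generic linear system. The display below is a standard variation-of-constants argument given for illustration only; it is not the paper's specific bound.

```latex
% True solution: \dot{u}(t) = A u(t).  PINN approximation: u_\theta(t).
% Residual (used as the training loss): r(t) := \dot{u}_\theta(t) - A u_\theta(t).
% Error: e(t) := u_\theta(t) - u(t), which satisfies \dot{e} = A e + r, hence
\[
  e(t) = e^{At} e(0) + \int_0^t e^{A(t-s)}\, r(s)\,\mathrm{d}s,
  \qquad
  \|e(t)\| \le e^{\|A\| t}\,\|e(0)\| + \int_0^t e^{\|A\|(t-s)}\,\|r(s)\|\,\mathrm{d}s .
\]
```

Driving the residual and the initial-condition mismatch to zero therefore controls the solution error.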
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.