Physics-Informed Neural Network for Discovering Systems with
Unmeasurable States with Application to Lithium-Ion Batteries
- URL: http://arxiv.org/abs/2311.16374v1
- Date: Mon, 27 Nov 2023 23:35:40 GMT
- Title: Physics-Informed Neural Network for Discovering Systems with
Unmeasurable States with Application to Lithium-Ion Batteries
- Authors: Yuichi Kajiura, Jorge Espin, Dong Zhang
- Abstract summary: We introduce a robust method for training PINN that uses fewer loss terms and thus constructs a less complex landscape for optimization.
Instead of having loss terms from each differential equation, this method embeds the dynamics into a loss function that quantifies the error between observed and predicted system outputs.
This is accomplished by numerically integrating the predicted states from the neural network (NN) using known dynamics and transforming them to obtain a sequence of predicted outputs.
- Score: 6.375364752891239
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Combining machine learning with physics is a trending approach for
discovering unknown dynamics, and one of the most intensively studied
frameworks is the physics-informed neural network (PINN). However, PINN often
fails to optimize the network due to its difficulty in concurrently minimizing
multiple losses originating from the system's governing equations. This problem
can be more serious when the system's states are unmeasurable, like lithium-ion
batteries (LiBs). In this work, we introduce a robust method for training PINN
that uses fewer loss terms and thus constructs a less complex landscape for
optimization. In particular, instead of having loss terms from each
differential equation, this method embeds the dynamics into a loss function
that quantifies the error between observed and predicted system outputs. This
is accomplished by numerically integrating the predicted states from the neural
network (NN) using known dynamics and transforming them to obtain a sequence of
predicted outputs. Minimizing such a loss optimizes the NN to predict states
consistent with observations given the physics. Further, the system's
parameters can be added to the optimization targets. To demonstrate the ability
of this method to perform various modeling and control tasks, we apply it to a
battery model to concurrently estimate its states and parameters.
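
To make the idea concrete, below is a minimal PyTorch-style sketch of this single-loss training scheme, assuming a generic scalar ODE dx/dt = f(x, u; theta) with output y = g(x; theta). The toy dynamics, the forward-Euler integration step, and all function and variable names are illustrative assumptions, not the battery model or the exact implementation used in the paper.

```python
import torch
import torch.nn as nn

# Hypothetical scalar system standing in for the battery model:
#   dx/dt = f(x, u; theta),   y = g(x; theta)
# f, g, and theta below are illustrative placeholders only.
def f(x, u, theta):
    return -theta[0] * x + u          # first-order relaxation driven by input u

def g(x, theta):
    return theta[1] * x               # output is a scaled (unmeasurable) state


class StateNet(nn.Module):
    """Maps time t to the estimated (unmeasurable) state x_hat(t)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, t):
        return self.net(t)


def output_loss(model, theta, t, u, y, dt):
    """Single loss term: integrate the predicted states one step with the
    known dynamics, map them to outputs, and compare against measurements."""
    x_hat = model(t[:-1])                          # predicted states at t_k
    x_next = x_hat + dt * f(x_hat, u[:-1], theta)  # forward-Euler step (illustrative)
    y_pred = g(x_next, theta)                      # predicted outputs at t_{k+1}
    return torch.mean((y_pred - y[1:]) ** 2)


# Synthetic measurements from a "true" system (for demonstration only).
dt = 0.01
t = torch.arange(0.0, 1.0, dt).unsqueeze(1)
u = torch.sin(2 * torch.pi * t)
true_theta = torch.tensor([2.0, 0.5])
with torch.no_grad():
    x, ys = torch.zeros(1), []
    for k in range(len(t)):
        ys.append(g(x, true_theta))
        x = x + dt * f(x, u[k], true_theta)
    y = torch.stack(ys)

# Joint optimization of the network (states) and theta (parameters).
model = StateNet()
theta = nn.Parameter(torch.tensor([1.0, 1.0]))
opt = torch.optim.Adam(list(model.parameters()) + [theta], lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = output_loss(model, theta, t, u, y, dt)
    loss.backward()
    opt.step()
```

Because theta is registered as a trainable parameter alongside the network weights, minimizing this single output-error loss estimates the unmeasurable states and the model parameters at the same time, which corresponds to the joint estimation task the paper demonstrates on a battery model.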
Related papers
- Response Estimation and System Identification of Dynamical Systems via Physics-Informed Neural Networks [0.0]
This paper explores the use of Physics-Informed Neural Networks (PINNs) for the identification and estimation of dynamical systems.
PINNs offer a unique advantage by embedding known physical laws directly into the neural network's loss function, allowing for simple embedding of complex phenomena.
The results demonstrate that PINNs deliver an efficient tool across all aforementioned tasks, even in the presence of modelling errors.
arXiv Detail & Related papers (2024-10-02T08:58:30Z)
- Enriched Physics-informed Neural Networks for Dynamic Poisson-Nernst-Planck Systems [0.8192907805418583]
This paper proposes a meshless deep learning algorithm, enriched physics-informed neural networks (EPINNs) to solve dynamic Poisson-Nernst-Planck (PNP) equations.
EPINNs take the traditional physics-informed neural network as the foundational framework and add adaptive loss weights to balance the loss functions; a generic sketch of such adaptive weighting appears after this list.
Numerical results indicate that the new method has better applicability than traditional numerical methods in solving such coupled nonlinear systems.
arXiv Detail & Related papers (2024-02-01T02:57:07Z)
- Model-Based Control with Sparse Neural Dynamics [23.961218902837807]
We propose a new framework for integrated model learning and predictive control.
We show that our framework can deliver better closed-loop performance than existing state-of-the-art methods.
arXiv Detail & Related papers (2023-12-20T06:25:02Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Adaptive Self-supervision Algorithms for Physics-informed Neural Networks [59.822151945132525]
Physics-informed neural networks (PINNs) incorporate physical knowledge from the problem domain as a soft constraint on the loss function.
We study the impact of the location of the collocation points on the trainability of these models.
We propose a novel adaptive collocation scheme which progressively allocates more collocation points to areas where the model is making higher errors.
arXiv Detail & Related papers (2022-07-08T18:17:06Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Data vs. Physics: The Apparent Pareto Front of Physics-Informed Neural Networks [8.487185704099925]
Physics-informed neural networks (PINNs) have emerged as a promising deep learning method.
PINNs are difficult to train and often require a careful tuning of loss weights when data and physics loss functions are combined.
arXiv Detail & Related papers (2021-05-03T13:47:45Z)
- Physics-Informed Neural Network Method for Solving One-Dimensional Advection Equation Using PyTorch [0.0]
The PINN approach allows training neural networks while respecting the PDEs as a strong constraint in the optimization.
In standard small-scale circulation simulations, it is shown that the conventional approach incorporates a pseudo-diffusive effect that is almost as large as the effect of the turbulent diffusion model.
Of all the schemes tested, only the PINN approximation accurately predicted the outcome.
arXiv Detail & Related papers (2021-03-15T05:39:17Z)
- A Meta-Learning Approach to the Optimal Power Flow Problem Under Topology Reconfigurations [69.73803123972297]
We propose a DNN-based OPF predictor that is trained using a meta-learning (MTL) approach.
The developed OPF-predictor is validated through simulations using benchmark IEEE bus systems.
arXiv Detail & Related papers (2020-12-21T17:39:51Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
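
As referenced in the EPINN entry above, adaptive loss weights are a common way to balance competing PINN loss terms. The sketch below shows one generic gradient-magnitude balancing heuristic for a two-term loss in PyTorch; it is an illustrative assumption, not the specific weighting scheme proposed for EPINNs, and the toy data-fit-plus-residual problem is likewise hypothetical.

```python
import torch
import torch.nn as nn

# Two-term PINN-style loss, total = loss_data + w * loss_phys, with an
# adaptive weight chosen by a gradient-magnitude balancing heuristic.
# Generic illustration only, not the EPINN authors' exact scheme.

def balance_weight(loss_data, loss_phys, params, w_old, alpha=0.9, eps=1e-8):
    """Pick w so the physics-term gradients match the scale of the data-term gradients."""
    g_data = torch.autograd.grad(loss_data, params, retain_graph=True, allow_unused=True)
    g_phys = torch.autograd.grad(loss_phys, params, retain_graph=True, allow_unused=True)
    max_data = torch.stack([g.abs().max() for g in g_data if g is not None]).max()
    mean_phys = torch.stack([g.abs().mean() for g in g_phys if g is not None]).mean()
    w_new = (max_data / (mean_phys + eps)).detach()
    return alpha * w_old + (1 - alpha) * w_new       # exponentially smoothed update


# Toy problem: fit u(x) = sin(x) on sparse data while enforcing the residual of u'' = -u.
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
w = torch.tensor(1.0)
x_data = torch.linspace(0.0, 3.14, 20).unsqueeze(1)
x_col = torch.linspace(0.0, 3.14, 100).unsqueeze(1).requires_grad_(True)

for step in range(1000):
    opt.zero_grad()
    loss_data = torch.mean((net(x_data) - torch.sin(x_data)) ** 2)
    u = net(x_col)
    du = torch.autograd.grad(u, x_col, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x_col, torch.ones_like(du), create_graph=True)[0]
    loss_phys = torch.mean((d2u + u) ** 2)           # residual of u'' + u = 0
    w = balance_weight(loss_data, loss_phys, list(net.parameters()), w)
    (loss_data + w * loss_phys).backward()
    opt.step()
```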
This list is automatically generated from the titles and abstracts of the papers on this site.