Solving Nonlinear Energy Supply and Demand System Using Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2412.17001v1
- Date: Sun, 22 Dec 2024 12:37:59 GMT
- Title: Solving Nonlinear Energy Supply and Demand System Using Physics-Informed Neural Networks
- Authors: Van Truong Vo, Samad Noeiaghdam, Denis Sidorov, Aliona Dreglea, Liguo Wang
- Abstract summary: We propose a method utilizing Physics-Informed Neural Networks (PINNs) to solve the nonlinear energy supply-demand system.
We design a neural network with four outputs, where each output approximates a function that corresponds to one of the unknown functions in the nonlinear system of differential equations.
The solutions obtained from the neural network agree closely with those of the Runge-Kutta numerical method of order 4/5 (RK45).
- Score: 1.5728609542259502
- Abstract: Nonlinear differential equations and systems play a crucial role in modeling systems where time-dependent factors exhibit nonlinear characteristics. Due to their nonlinear nature, solving such systems often presents significant difficulties and challenges. In this study, we propose a method utilizing Physics-Informed Neural Networks (PINNs) to solve the nonlinear energy supply-demand (ESD) system. We design a neural network with four outputs, where each output approximates one of the unknown functions in the nonlinear system of differential equations describing the four-dimensional ESD problem. The model is then trained, and its parameters are identified and optimized to achieve a more accurate solution. The solutions obtained from the neural network agree closely with those of the Runge-Kutta numerical method of order 4/5 (RK45). Nevertheless, the neural-network approach is a modern and promising one, as it effectively exploits the computational power of advanced computer systems, especially for complex problems. Another advantage is that, once trained, the neural network model can solve the nonlinear system of differential equations across a continuous domain. In other words, neural networks are not only trained to approximate the solution functions of the nonlinear ESD system but can also represent the complex dynamic relationships between the system's components. However, this approach requires significant time and computational power due to the need for model training.
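The abstract benchmarks the PINN solutions against a Runge-Kutta integrator. The ESD equations themselves are not reproduced in the abstract, so the sketch below integrates a hypothetical four-dimensional nonlinear system (all coefficients are placeholders, not the paper's ESD model) with a classical fixed-step RK4 scheme; the paper's comparison uses the adaptive RK45 variant.

```python
import math

# Hypothetical 4D nonlinear system -- placeholder coefficients, NOT the
# paper's ESD model, which is not given in the abstract.
def f(t, u):
    x, y, z, w = u
    return [
        0.6 * x * (1.0 - x) - 0.3 * (y + z),   # logistic growth with coupling
        -0.4 * y + 0.2 * x * w,                # linear decay, nonlinear coupling
        0.5 * z * (0.8 - z) - 0.1 * x,
        -0.2 * w + 0.1 * x * y,
    ]

def rk4_step(f, t, u, h):
    """One classical Runge-Kutta step of order 4."""
    k1 = f(t, u)
    k2 = f(t + h / 2, [ui + h / 2 * ki for ui, ki in zip(u, k1)])
    k3 = f(t + h / 2, [ui + h / 2 * ki for ui, ki in zip(u, k2)])
    k4 = f(t + h,     [ui + h * ki     for ui, ki in zip(u, k3)])
    return [ui + h / 6 * (a + 2 * b + 2 * c + d)
            for ui, a, b, c, d in zip(u, k1, k2, k3, k4)]

def integrate(f, u0, t0, t1, n):
    """Fixed-step RK4 integration of u' = f(t, u) from t0 to t1."""
    h = (t1 - t0) / n
    u, t = list(u0), t0
    for _ in range(n):
        u = rk4_step(f, t, u, h)
        t += h
    return u

# Example: state at t = 10 from a hypothetical initial condition.
u_end = integrate(f, [0.8, 0.5, 0.3, 0.2], 0.0, 10.0, 1000)
```

Unlike the time-stepping above, a trained PINN can be queried at any point of the continuous training domain directly, which is the advantage the abstract highlights.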
Related papers
- Newton Informed Neural Operator for Computing Multiple Solutions of Nonlinear Partials Differential Equations [3.8916312075738273]
We propose a novel approach called the Newton Informed Neural Operator to tackle nonlinearities.
Our method combines classical Newton methods, addressing well-posed problems, and efficiently learns multiple solutions in a single learning process.
arXiv Detail & Related papers (2024-05-23T01:52:54Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Linearization of ReLU Activation Function for Neural Network-Embedded Optimization: Optimal Day-Ahead Energy Scheduling [0.2900810893770134]
In some applications, such as neural network-based battery degradation modeling for microgrid day-ahead energy scheduling, the input features of the trained learning model are variables to be solved in optimization models.
The use of nonlinear activation functions in the neural network makes such problems extremely hard, if not impossible, to solve.
This paper investigates different methods for linearizing nonlinear activation functions, with a particular focus on the widely used rectified linear unit (ReLU) function.
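As a concrete illustration of the kind of linearization this entry describes, a standard big-M mixed-integer encoding replaces y = max(0, x) with linear constraints and a binary indicator z; the paper's own formulations may differ, and the bound M below is a hypothetical value.

```python
def relu_bigm_ok(x, y, z, M=10.0):
    """Check whether (x, y, z) satisfies the big-M encoding of y = max(0, x).

    Constraints: y >= 0, y >= x, y <= M*z, y <= x + M*(1 - z),
    with binary z indicating whether the ReLU is active (x > 0)
    and M a valid upper bound on |x|.
    """
    eps = 1e-9
    return (y >= -eps and y >= x - eps
            and y <= M * z + eps and y <= x + M * (1 - z) + eps)

def relu_from_encoding(x, M=10.0, grid=2001):
    """Enumerate (y, z) candidates on a grid; return all feasible y values.

    For a valid M, the only feasible y is max(0, x), showing that the
    encoding is exact rather than an approximation.
    """
    ys = set()
    for z in (0, 1):
        for i in range(grid):
            y = -M + 2 * M * i / (grid - 1)
            if relu_bigm_ok(x, y, z, M):
                ys.add(round(y, 6))
    return ys
```

In a day-ahead scheduling model, such constraints let a mixed-integer solver treat the trained network's activations as decision variables instead of a black box.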
arXiv Detail & Related papers (2023-10-03T02:47:38Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Physics guided neural networks for modelling of non-linear dynamics [0.0]
This work demonstrates that injection of partially known information at an intermediate layer in a deep neural network can improve model accuracy, reduce model uncertainty, and yield improved convergence during the training.
The value of these physics-guided neural networks has been demonstrated by learning the dynamics of a wide variety of nonlinear dynamical systems represented by five well-known equations in nonlinear systems theory.
arXiv Detail & Related papers (2022-05-13T19:06:36Z)
- Physics-informed attention-based neural network for solving non-linear partial differential equations [6.103365780339364]
Physics-Informed Neural Networks (PINNs) have enabled significant improvements in modelling physical processes.
PINNs are based on simple architectures, and learn the behavior of complex physical systems by optimizing the network parameters to minimize the residual of the underlying PDE.
Here, we address the question of which network architectures are best suited to learn the complex behavior of non-linear PDEs.
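The PDE residual these summaries refer to can be sketched on a toy ODE, du/dt = -u: a candidate solution is penalized by the mean squared mismatch between its derivative and the equation's right-hand side at a set of collocation points. The helper names below are illustrative, and derivatives are estimated with central differences here, whereas actual PINNs differentiate the network exactly via automatic differentiation.

```python
import math

def residual_loss(u, rhs, ts, h=1e-5):
    """Mean squared physics residual of candidate solution u(t) at the
    collocation points ts, for the ODE du/dt = rhs(t, u(t))."""
    total = 0.0
    for t in ts:
        du = (u(t + h) - u(t - h)) / (2 * h)   # central-difference derivative
        r = du - rhs(t, u(t))                  # pointwise physics residual
        total += r * r
    return total / len(ts)

# Collocation points on [0, 2].
ts = [2 * i / 50 for i in range(51)]
rhs = lambda t, u: -u

exact = lambda t: math.exp(-t)   # true solution of u' = -u, u(0) = 1
wrong = lambda t: 1.0 - t        # a poor candidate: large residual
```

Training a PINN amounts to minimizing this loss (plus initial/boundary terms) over the network's parameters, so the exact solution drives the residual to zero.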
arXiv Detail & Related papers (2021-05-17T14:29:08Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Linear embedding of nonlinear dynamical systems and prospects for efficient quantum algorithms [74.17312533172291]
We describe a method for mapping any finite nonlinear dynamical system to an infinite linear dynamical system (embedding).
We then explore an approach for approximating the resulting infinite linear system with finite linear systems (truncation).
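A standard one-variable illustration of this embedding-then-truncation idea (a Carleman linearization, not the paper's general construction) is the scalar ODE u' = -u²: substituting y_k = u^k gives the infinite linear chain y_k' = -k y_{k+1}, which is truncated by setting y_{N+1} = 0 and then integrated as an ordinary linear system.

```python
def carleman_solve(u0, t_end, N=12, steps=2000):
    """Approximate u(t_end) for u' = -u^2, u(0) = u0, via a truncated
    Carleman embedding: y_k = u^k satisfies y_k' = -k * y_{k+1}."""
    y = [u0 ** (k + 1) for k in range(N)]      # y_k = u0^k for k = 1..N

    def rhs(y):
        dy = [-(k + 1) * y[k + 1] for k in range(N - 1)]
        dy.append(0.0)                          # truncation: y_{N+1} ~ 0
        return dy

    # Integrate the (now linear) truncated system with fixed-step RK4.
    h = t_end / steps
    for _ in range(steps):
        k1 = rhs(y)
        k2 = rhs([yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = rhs([yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = rhs([yi + h * ki for yi, ki in zip(y, k3)])
        y = [yi + h / 6 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
    return y[0]                                 # y_1 approximates u(t_end)
```

For small u0 the truncation error decays like u0^(N+1), so the linear surrogate closely tracks the exact solution u(t) = u0 / (1 + u0 t).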
arXiv Detail & Related papers (2020-12-12T00:01:10Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- DynNet: Physics-based neural architecture design for linear and nonlinear structural response modeling and prediction [2.572404739180802]
In this study, a physics-based recurrent neural network model is designed that is able to learn the dynamics of linear and nonlinear multiple degrees of freedom systems.
The model is able to estimate a complete set of responses, including displacement, velocity, acceleration, and internal forces.
arXiv Detail & Related papers (2020-07-03T17:05:35Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.