Physics-Informed Neural Networks with Hard Linear Equality Constraints
- URL: http://arxiv.org/abs/2402.07251v1
- Date: Sun, 11 Feb 2024 17:40:26 GMT
- Title: Physics-Informed Neural Networks with Hard Linear Equality Constraints
- Authors: Hao Chen, Gonzalo E. Constante Flores, Can Li
- Abstract summary: This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
- Score: 9.101849365688905
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Surrogate modeling is used to replace computationally expensive simulations.
Neural networks have been widely applied as surrogate models that enable
efficient evaluations over complex physical systems. However, neural networks
are purely data-driven models devoid of any physics. Incorporating physics into
neural networks can improve generalization and data efficiency. The
physics-informed neural network (PINN) is an approach that leverages known
physical constraints underlying the data, but it cannot guarantee that they are
strictly satisfied in its predictions. This work proposes a novel
physics-informed neural network,
KKT-hPINN, which rigorously guarantees hard linear equality constraints through
projection layers derived from KKT conditions. Numerical experiments on Aspen
models of a continuous stirred-tank reactor (CSTR) unit, an extractive
distillation subsystem, and a chemical plant demonstrate that this model can
further enhance the prediction accuracy.
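For intuition: projecting an unconstrained prediction z onto {x : Ax = b} solves min_x ||x - z||^2 s.t. Ax = b, whose KKT conditions give the closed form x = z - A^T(AA^T)^{-1}(Az - b). The PyTorch sketch below illustrates such a projection layer; the class and variable names are ours, not from the paper's code.
```python
import torch

class HardLinearProjection(torch.nn.Module):
    """Orthogonal projection onto {x : A x = b}: the closed-form KKT solution
    of min ||x - z||^2 subject to A x = b (illustrative, not the paper's code)."""
    def __init__(self, A: torch.Tensor, b: torch.Tensor):
        super().__init__()
        # A is (m, n) with full row rank, b is (m,); precompute fixed matrices.
        AAt_inv = torch.linalg.inv(A @ A.T)
        n = A.shape[1]
        self.register_buffer("P", torch.eye(n) - A.T @ AAt_inv @ A)  # (n, n)
        self.register_buffer("q", A.T @ AAt_inv @ b)                 # (n,)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, n) unconstrained output; the result satisfies A x = b exactly.
        return z @ self.P + self.q

# Toy usage: enforce a mass balance x1 + x2 = x3 on every prediction.
A = torch.tensor([[1.0, 1.0, -1.0]])
b = torch.tensor([0.0])
layer = HardLinearProjection(A, b)
x = layer(torch.randn(4, 3))
assert torch.allclose(x @ A.T, b.expand(4, 1), atol=1e-5)
```
Because the projection matrices are constant, the layer is affine and differentiable, so it can be appended to a network without complicating training.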
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
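As a rough illustration of the spatial part of such a decomposition (our toy sketch, not the paper's implementation), a fine grid can be split into interleaved coarser sub-grids, each of which a cheaper coarse-resolution solver could handle, and reassembled exactly:
```python
import numpy as np

def stagger(u):
    # Split a 2D field into 4 interleaved coarser sub-fields.
    return [u[i::2, j::2] for i in range(2) for j in range(2)]

def unstagger(parts, shape):
    # Reassemble the sub-fields into the original fine field, losslessly.
    u = np.empty(shape)
    k = 0
    for i in range(2):
        for j in range(2):
            u[i::2, j::2] = parts[k]
            k += 1
    return u

u = np.random.rand(64, 64)
assert np.array_equal(unstagger(stagger(u), u.shape), u)
```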
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Bayesian Physics-Informed Neural Networks for real-world nonlinear dynamical systems [0.0]
We integrate data, physics, and uncertainties by combining neural networks, physics-informed modeling, and Bayesian inference.
Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both.
We anticipate that the underlying concepts and trends generalize to more complex disease conditions.
arXiv Detail & Related papers (2022-05-12T19:04:31Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Neural net modeling of equilibria in NSTX-U [0.0]
We develop two neural networks relevant to equilibrium and shape control modeling.
Networks include Eqnet, a free-boundary equilibrium solver trained on the EFIT01 reconstruction algorithm, and Pertnet, which is trained on the Gspert code.
We report strong performance for both networks indicating that these models could reliably be used within closed-loop simulations.
arXiv Detail & Related papers (2022-02-28T16:09:58Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility of mechanistic models and the data-driven expressibility of AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- On feedforward control using physics-guided neural networks: Training cost regularization and optimized initialization [0.0]
Performance of model-based feedforward controllers is typically limited by the accuracy of the inverse system dynamics model.
This paper proposes a regularization method based on identified physical parameters.
It is validated on a real-life industrial linear motor, where it delivers better tracking accuracy and extrapolation.
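A minimal sketch of the general idea, assuming a hypothetical mass-damper feedforward model; the parameter names and identified values are illustrative, not from the paper:
```python
import torch

class InverseModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.m = torch.nn.Parameter(torch.tensor(1.0))  # mass-like parameter
        self.c = torch.nn.Parameter(torch.tensor(0.1))  # friction-like parameter
        self.residual = torch.nn.Linear(2, 1)           # data-driven correction term

    def forward(self, acc, vel):
        u_phys = self.m * acc + self.c * vel            # physics-based feedforward
        return u_phys + self.residual(torch.stack([acc, vel], dim=-1)).squeeze(-1)

model = InverseModel()
m_id, c_id, lam = 1.2, 0.15, 1e-2                       # identified values (illustrative)
acc, vel, u_meas = torch.randn(32), torch.randn(32), torch.randn(32)
# Tracking loss plus a pull toward the separately identified physical parameters.
loss = torch.mean((model(acc, vel) - u_meas) ** 2) \
     + lam * ((model.m - m_id) ** 2 + (model.c - c_id) ** 2)
loss.backward()
```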
arXiv Detail & Related papers (2022-01-28T12:51:25Z)
- Thermodynamic Consistent Neural Networks for Learning Material Interfacial Mechanics [6.087530833458481]
The traction-separation relations (TSR) quantitatively describe the mechanical behavior of a material interface undergoing openings.
A neural network can fit the loading paths well but often fails to obey the laws of physics.
We propose a thermodynamic consistent neural network (TCNN) approach to build a data-driven model of the TSR with sparse experimental data.
arXiv Detail & Related papers (2020-11-28T17:25:10Z)
- Thermodynamics-based Artificial Neural Networks for constitutive modeling [0.0]
We propose a new class of data-driven, physics-based, neural networks for modeling of strain rate independent processes at the material point level.
The two basic principles of thermodynamics are encoded in the network's architecture by taking advantage of automatic differentiation.
We demonstrate the wide applicability of TANNs for modeling elasto-plastic materials, with strain hardening and strain softening.
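A minimal sketch in this spirit, assuming a scalar strain for simplicity: a network predicts a free-energy potential and stress follows by automatic differentiation, so energy conjugacy holds by construction (the names are ours):
```python
import torch

# A network predicts a scalar free energy psi(eps); stress follows by autodiff.
energy_net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

def stress(eps):
    eps = eps.detach().requires_grad_(True)
    psi = energy_net(eps.unsqueeze(-1)).sum()                    # total free energy
    return torch.autograd.grad(psi, eps, create_graph=True)[0]   # sigma = d psi / d eps

sigma = stress(torch.linspace(-0.02, 0.02, 5))
print(sigma)  # stresses derived from the learned energy potential
```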
arXiv Detail & Related papers (2020-05-25T15:56:34Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
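For reference, the MP neuron described above is just an activation applied to a real-valued weighted sum of inputs; a minimal illustration (ours):
```python
import numpy as np

# MP neuron: activation applied to a weighted aggregation of incoming signals.
def mp_neuron(x, w, b, activation=np.tanh):
    return activation(w @ x + b)

y = mp_neuron(np.array([0.5, -1.0, 2.0]), np.array([0.1, 0.4, -0.2]), 0.05)
```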
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.