Multi-Constitutive Neural Network for Large Deformation Poromechanics Problem
- URL: http://arxiv.org/abs/2010.15549v4
- Date: Wed, 12 Jun 2024 05:41:50 GMT
- Title: Multi-Constitutive Neural Network for Large Deformation Poromechanics Problem
- Authors: Qi Zhang, Yilin Chen, Ziyi Yang, Eric Darve
- Abstract summary: We propose a novel method, the "multi-constitutive neural network" (MCNN), in which one model can solve several different constitutive laws.
We found that an MCNN trained to solve multiple PDEs outperforms individual neural network solvers trained on a single PDE in some cases.
- Score: 21.894584868482916
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study the problem of large-strain consolidation in poromechanics with deep neural networks (DNNs). Given different material properties and different loading conditions, the goal is to predict pore pressure and settlement. We propose a novel method, the "multi-constitutive neural network" (MCNN), in which one model can solve several different constitutive laws. We introduce a one-hot encoding vector as an additional input, used to label the constitutive law we wish to solve. We then build a DNN that takes $(\hat{X}, \hat{t})$ as input along with the constitutive-law label and outputs the corresponding solution. To our knowledge, this is the first time that multiple constitutive laws can be evaluated through only one training process while still obtaining good accuracy. We found that an MCNN trained to solve multiple PDEs outperforms individual neural network solvers trained on a single PDE in some cases.
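The one-hot labeling scheme described in the abstract can be sketched as follows. This is a hypothetical minimal illustration under assumed sizes (one hidden layer, three constitutive laws, two output fields), not the authors' implementation; all names here (`mcnn_forward`, `one_hot`, `n_laws`) are invented for the sketch.

```python
import numpy as np

def one_hot(law_index, n_laws):
    """Label vector identifying which constitutive law to solve."""
    v = np.zeros(n_laws)
    v[law_index] = 1.0
    return v

def mcnn_forward(x_hat, t_hat, law_index, params, n_laws=3):
    """Toy forward pass: the (x_hat, t_hat) coordinates are concatenated
    with a one-hot constitutive-law label before entering a small MLP."""
    inp = np.concatenate(([x_hat, t_hat], one_hot(law_index, n_laws)))
    h = np.tanh(params["W1"] @ inp + params["b1"])  # hidden layer
    return params["W2"] @ h + params["b2"]          # e.g. [pressure, settlement]

rng = np.random.default_rng(0)
params = {
    "W1": rng.normal(size=(16, 2 + 3)), "b1": np.zeros(16),
    "W2": rng.normal(size=(2, 16)),     "b2": np.zeros(2),
}
out = mcnn_forward(0.5, 0.1, law_index=1, params=params)  # shape (2,)
```

Changing only `law_index` changes the prediction, which is how a single set of weights can represent several constitutive laws at once.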
Related papers
- Node Assigned physics-informed neural networks for thermal-hydraulic system simulation: CVH/FL module [0.0]
Severe accidents (SAs) in nuclear power plants have been analyzed using thermal-hydraulic (TH) system codes such as MELCOR and MAAP.
The objective of this study is to develop a novel numerical method for TH system codes using physics-informed neural networks (PINNs).
arXiv Detail & Related papers (2025-04-23T06:17:04Z) - Input Specific Neural Networks [0.0]
The black-box nature of neural networks limits the ability to encode or impose specific relationships between inputs and outputs.
This paper presents two ISNNs, along with equations for their first and second derivatives with respect to the inputs.
We show how ISNNs can be used to learn structural relationships between inputs and outputs via a binary gating mechanism.
arXiv Detail & Related papers (2025-03-01T00:57:16Z) - Improving PINNs By Algebraic Inclusion of Boundary and Initial Conditions [0.1874930567916036]
"AI for Science" aims to solve fundamental scientific problems using AI techniques.
In this work we explore the possibility of changing the model being trained from being just a neural network to being a non-linear transformation of it.
This reduces the number of terms in the loss function compared to the standard PINN losses.
arXiv Detail & Related papers (2024-07-30T11:19:48Z) - LinSATNet: The Positive Linear Satisfiability Neural Networks [116.65291739666303]
This paper studies how to introduce the popular positive linear satisfiability constraints into neural networks.
We propose the first differentiable satisfiability layer based on an extension of the classic Sinkhorn algorithm for jointly encoding multiple sets of marginal distributions.
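As background, the layer above extends the classic Sinkhorn algorithm; a minimal sketch of that classic iteration (not the paper's extension to multiple marginal sets) looks like this:

```python
import numpy as np

def sinkhorn(M, n_iters=50):
    """Classic Sinkhorn iteration: alternately normalize rows and columns
    of a strictly positive matrix until it is approximately doubly stochastic."""
    P = np.asarray(M, dtype=float)
    for _ in range(n_iters):
        P = P / P.sum(axis=1, keepdims=True)  # make rows sum to 1
        P = P / P.sum(axis=0, keepdims=True)  # make columns sum to 1
    return P

rng = np.random.default_rng(0)
P = sinkhorn(rng.random((4, 4)) + 0.1)
```

Because every step is differentiable, such a normalization can serve as a network layer; LinSATNet generalizes the idea to jointly encode multiple sets of marginal distributions.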
arXiv Detail & Related papers (2024-07-18T22:05:21Z) - iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).
We propose incremental PINNs that can learn multiple tasks sequentially without additional parameters for new tasks and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one by creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
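The subnetwork idea can be illustrated with a toy parameter-sharing sketch; this is purely hypothetical (the mask values, task names, and sizes are invented) and is not the iPINNs implementation:

```python
import numpy as np

# Toy sketch: each PDE gets a binary mask over one shared weight vector;
# overlapping mask entries mean a new task reuses weights learned earlier,
# and no extra parameters are allocated per task.
rng = np.random.default_rng(1)
shared_w = rng.normal(size=8)
masks = {
    "simple_pde": np.array([1, 1, 1, 1, 0, 0, 0, 0]),
    "harder_pde": np.array([1, 1, 0, 0, 1, 1, 1, 1]),  # overlaps on the first two
}

def subnetwork_weights(task):
    """Weights active in one PDE's subnetwork."""
    return shared_w * masks[task]

overlap = masks["simple_pde"] & masks["harder_pde"]  # shared positions
```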
arXiv Detail & Related papers (2023-04-10T20:19:20Z) - Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Pruning and Slicing Neural Networks using Formal Verification [0.2538209532048866]
Deep neural networks (DNNs) play an increasingly important role in various computer systems.
In order to create these networks, engineers typically specify a desired topology, and then use an automated training algorithm to select the network's weights.
Here, we propose to address this challenge by harnessing recent advances in DNN verification.
arXiv Detail & Related papers (2021-05-28T07:53:50Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z) - Physics informed deep learning for computational elastodynamics without labeled data [13.084113582897965]
We present a physics-informed neural network (PINN) with mixed-variable output to model elastodynamics problems without resorting to labeled data.
Results show the promise of PINN in the context of computational mechanics applications.
arXiv Detail & Related papers (2020-06-10T19:05:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.