Incorporating sufficient physical information into artificial neural
networks: a guaranteed improvement via physics-based Rao-Blackwellization
- URL: http://arxiv.org/abs/2311.06147v1
- Date: Fri, 10 Nov 2023 16:05:46 GMT
- Title: Incorporating sufficient physical information into artificial neural
networks: a guaranteed improvement via physics-based Rao-Blackwellization
- Authors: Gian-Luca Geuken, Jörn Mosler and Patrick Kurzeja
- Abstract summary: The concept of Rao-Blackwellization is employed to improve predictions of artificial neural networks by physical information.
The proposed strategy is applied to material modeling and illustrated by examples of the identification of a yield function.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The concept of Rao-Blackwellization is employed to improve predictions of
artificial neural networks by physical information. The error norm and the
proof of improvement are transferred from the original statistical concept to a
deterministic one, using sufficient information on physics-based conditions.
The proposed strategy is applied to material modeling and illustrated by
examples of the identification of a yield function, elasto-plastic steel
simulations, the identification of driving forces for quasi-brittle damage and
rubber experiments. Sufficient physical information is employed, e.g., in the
form of invariants, parameters of a minimization problem, dimensional analysis,
isotropy and differentiability. It is proven how intuitive accretion of
information can yield improvement if it is physically sufficient, but also how
insufficient or superfluous information can cause impairment. Opportunities for
the improvement of artificial neural networks are explored in terms of the
training data set, the networks' structure and output filters. Even crude
initial predictions are remarkably improved by reducing noise, overfitting and
data requirements.
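As a minimal statistical analogue of the Rao-Blackwell step underlying the paper (the deterministic, physics-based transfer described in the abstract is more involved), conditioning a crude unbiased estimator on a sufficient statistic never increases its mean-squared error. A hedged numpy sketch with illustrative values, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3          # true Bernoulli parameter
n, trials = 20, 5000  # sample size per trial, number of Monte Carlo trials

crude_err, rb_err = [], []
for _ in range(trials):
    x = rng.random(n) < p_true       # Bernoulli(p) sample of size n
    crude = float(x[0])              # unbiased but noisy: uses only X_1
    t = x.sum()                      # sufficient statistic T = sum of X_i
    rb = t / n                       # E[X_1 | T] = T/n, the Rao-Blackwellized estimator
    crude_err.append((crude - p_true) ** 2)
    rb_err.append((rb - p_true) ** 2)

# Conditioning on the sufficient statistic reduces the MSE (here by roughly a factor n)
print(np.mean(crude_err), np.mean(rb_err))
```

The same inequality is what the paper transfers to a deterministic setting: averaging a network's prediction over directions that sufficient physical information (e.g. invariants, isotropy) declares irrelevant cannot worsen the error norm.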
Related papers
- Can physical information aid the generalization ability of Neural
Networks for hydraulic modeling? [0.0]
Application of Neural Networks to river hydraulics is fledgling, despite the field suffering from data scarcity.
We propose to mitigate such problem by introducing physical information into the training phase.
We show that incorporating such soft physical information can improve predictive capabilities.
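"Soft" physical information of the kind described above is typically introduced as a weighted penalty on a physics residual added to the data misfit. A minimal sketch of such a composite loss, with hypothetical names and weighting; the paper's specific residual terms are not reproduced here:

```python
import numpy as np

def soft_physics_loss(y_pred, y_obs, physics_residual, lam=0.1):
    """Data misfit plus a weighted physics penalty (soft constraint).

    physics_residual: pointwise violation of a physical law
    (e.g. a mass-conservation residual); lam trades data fit
    against physical consistency.
    """
    data_term = np.mean((y_pred - y_obs) ** 2)
    physics_term = np.mean(physics_residual ** 2)
    return data_term + lam * physics_term

# A prediction that fits the data but violates the physics is penalized:
loss = soft_physics_loss(np.array([1.0, 1.0]), np.ones(2), np.array([0.5, 0.5]))
```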
arXiv Detail & Related papers (2024-03-13T14:51:16Z)
- Hybrid data-driven and physics-informed regularized learning of cyclic plasticity with Neural Networks [0.0]
The proposed model architecture is simpler and more efficient compared to existing solutions from the literature.
The validation of the approach is carried out by means of surrogate data obtained with the Armstrong-Frederick kinematic hardening model.
arXiv Detail & Related papers (2024-03-04T07:09:54Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
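In the simplest setting, hard linear equality constraints on a network output can be guaranteed by appending an orthogonal projection onto the constraint set. The sketch below shows only that standard affine projection; it is an assumption-laden simplification, not the exact KKT-hPINN layer:

```python
import numpy as np

def project_onto_equalities(y, A, b):
    """Orthogonal projection of y onto the affine set {z : A z = b}.

    Uses y* = y - A^T (A A^T)^{-1} (A y - b), valid when A has
    full row rank.
    """
    correction = A.T @ np.linalg.solve(A @ A.T, A @ y - b)
    return y - correction

A = np.array([[1.0, 1.0, 1.0]])    # e.g. a balance: components sum to 1
b = np.array([1.0])
y_raw = np.array([0.5, 0.3, 0.1])  # hypothetical unconstrained network output
y_proj = project_onto_equalities(y_raw, A, b)  # now satisfies A y = b exactly
```

Because the projection is itself linear in y, it can be placed after the last layer and trained through without breaking differentiability.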
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
- Physics-aware deep learning framework for linear elasticity [0.0]
The paper presents an efficient and robust data-driven deep learning (DL) computational framework for linear continuum elasticity problems.
For an accurate representation of the field variables, a multi-objective loss function is proposed.
Several benchmark problems, including the Airy solution to elasticity and the Kirchhoff-Love plate problem, are solved.
arXiv Detail & Related papers (2023-02-19T20:33:32Z)
- Neural Implicit Representations for Physical Parameter Inference from a Single Video [49.766574469284485]
We propose to combine neural implicit representations for appearance modeling with neural ordinary differential equations (ODEs) for modelling physical phenomena.
Our proposed model combines several unique advantages: (i) Contrary to existing approaches that require large training datasets, we are able to identify physical parameters from only a single video.
The use of neural implicit representations enables the processing of high-resolution videos and the synthesis of photo-realistic images.
arXiv Detail & Related papers (2022-04-29T11:55:35Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Generative Counterfactuals for Neural Networks via Attribute-Informed Perturbation [51.29486247405601]
We design a framework to generate counterfactuals for raw data instances with the proposed Attribute-Informed Perturbation (AIP).
By utilizing generative models conditioned with different attributes, counterfactuals with desired labels can be obtained effectively and efficiently.
Experimental results on real-world texts and images demonstrate the effectiveness, sample quality as well as efficiency of our designed framework.
arXiv Detail & Related papers (2021-01-18T08:37:13Z)
- Gradient Starvation: A Learning Proclivity in Neural Networks [97.02382916372594]
Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task.
This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks.
arXiv Detail & Related papers (2020-11-18T18:52:08Z)
- Identification of state functions by physically-guided neural networks with physically-meaningful internal layers [0.0]
We use the concept of physically-constrained neural networks (PCNN) to predict the input-output relation in a physical system.
We show that this approach, besides getting physically-based predictions, accelerates the training process.
arXiv Detail & Related papers (2020-11-17T11:26:37Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.