Identification of state functions by physically-guided neural networks
with physically-meaningful internal layers
- URL: http://arxiv.org/abs/2011.08567v1
- Date: Tue, 17 Nov 2020 11:26:37 GMT
- Title: Identification of state functions by physically-guided neural networks
with physically-meaningful internal layers
- Authors: Jacobo Ayensa-Jiménez, Mohamed H. Doweidar, Jose Antonio
Sanz-Herrera, Manuel Doblaré
- Abstract summary: We use the concept of physically-constrained neural networks (PCNN) to predict the input-output relation in a physical system.
We show that this approach, besides yielding physically-based predictions, accelerates the training process.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Substituting well-grounded theoretical models with data-driven
predictions is not as straightforward in engineering and the sciences as it is
in social and economic fields. Scientific problems most often suffer from a
paucity of data, while they may involve a large number of variables and
parameters that interact in complex and non-stationary ways, obeying certain
physical laws. Moreover, a
physically-based model is not only useful for making predictions, but to gain
knowledge by the interpretation of its structure, parameters, and mathematical
properties. The solution to these shortcomings seems to be the seamless
blending of the tremendous predictive power of the data-driven approach with
the scientific consistency and interpretability of physically-based models. We
use here the concept of physically-constrained neural networks (PCNNs) to
predict the input-output relation of a physical system while, at the same time,
fulfilling the physical constraints. To this end, the internal hidden state
variables of the system are associated with a set of internal neuron layers,
whose values are constrained by known physical relations, as well as any
additional knowledge of the system. Furthermore, when enough data are
available, it is possible to infer knowledge about the internal structure of
the system and, if it is parameterized, to predict the state parameters for a
particular input-output
relation. We show that this approach, besides yielding physically-based
predictions, accelerates the training process, reduces the amount of data
required to reach a given accuracy, partly filters the intrinsic noise in the
experimental data, and provides improved extrapolation capacity.
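The constrained-internal-layer idea can be illustrated with a toy sketch. Here an internal layer is interpreted as stress and penalized whenever it violates a known constitutive law; the elastic-bar physics, variable names, and penalty form below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def pcnn_loss(y_pred, y_true, h_internal, physics_residual, lam=1.0):
    """Composite PCNN-style loss: data misfit plus a penalty that drives the
    internal layer h_internal to satisfy a known physical relation."""
    data_loss = np.mean((y_pred - y_true) ** 2)
    # physics_residual(h) == 0 when the internal state obeys the physics,
    # e.g. the constitutive law sigma = E * eps for a linear elastic bar.
    phys_loss = np.mean(physics_residual(h_internal) ** 2)
    return data_loss + lam * phys_loss

# Toy example: the internal layer is read as stress, the input is strain.
E = 210.0                              # Young's modulus (illustrative units)
eps = np.array([0.01, 0.02])           # strains fed to the network
sigma_internal = np.array([2.1, 4.2])  # values of the internal "stress" layer

residual = lambda sigma: sigma - E * eps   # Hooke's-law constraint
loss = pcnn_loss(y_pred=np.array([0.5, 1.0]),
                 y_true=np.array([0.5, 1.0]),
                 h_internal=sigma_internal,
                 physics_residual=residual)
```

During training, the physics penalty is minimized alongside the data misfit, so the internal neurons converge toward physically admissible state-variable values.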
Related papers
- Response Estimation and System Identification of Dynamical Systems via Physics-Informed Neural Networks
This paper explores the use of Physics-Informed Neural Networks (PINNs) for the identification and estimation of dynamical systems.
PINNs offer a unique advantage by embedding known physical laws directly into the neural network's loss function, allowing complex phenomena to be modelled with little extra machinery.
The results demonstrate that PINNs provide an efficient tool for all of the aforementioned tasks, even in the presence of modelling errors.
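A minimal sketch of the physics-informed loss described in this summary, assuming the toy ODE u'(t) = -u(t) and numerical differentiation in place of the automatic differentiation a real PINN would use (all names are illustrative):

```python
import numpy as np

def pinn_loss(u, t, data_t, data_u, lam=1.0):
    """Physics-informed loss for the toy ODE u'(t) = -u(t):
    data misfit plus the mean squared ODE residual on collocation points."""
    du_dt = np.gradient(u(t), t)        # numerical derivative of candidate u
    physics_residual = du_dt + u(t)     # zero when u satisfies u' = -u
    data_loss = np.mean((u(data_t) - data_u) ** 2)
    return data_loss + lam * np.mean(physics_residual ** 2)

t = np.linspace(0.0, 1.0, 200)          # collocation points
u_exact = lambda s: np.exp(-s)          # exact solution, used as a stand-in
loss = pinn_loss(u_exact, t, data_t=np.array([0.0]), data_u=np.array([1.0]))
```

Because the candidate function solves the ODE, the residual term is near zero; for a trained network, minimizing this loss pushes the learned function toward the governing equation.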
arXiv Detail & Related papers (2024-10-02T08:58:30Z)
- Neural Incremental Data Assimilation
We introduce a deep learning approach where the physical system is modeled as a sequence of coarse-to-fine Gaussian prior distributions parametrized by a neural network.
This allows us to define an assimilation operator, which is trained in an end-to-end fashion to minimize the reconstruction error.
We illustrate our approach on chaotic dynamical physical systems with sparse observations, and compare it to traditional variational data assimilation methods.
arXiv Detail & Related papers (2024-06-21T11:42:55Z)
- Incorporating sufficient physical information into artificial neural networks: a guaranteed improvement via physics-based Rao-Blackwellization
The concept of Rao-Blackwellization is employed to improve predictions of artificial neural networks by physical information.
The proposed strategy is applied to material modeling and illustrated by examples of the identification of a yield function.
arXiv Detail & Related papers (2023-11-10T16:05:46Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Data-driven emergence of convolutional structure in neural networks
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Leveraging the structure of dynamical systems for data-driven modeling
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- The Causal Neural Connection: Expressiveness, Learnability, and Inference
An object called structural causal model (SCM) represents a collection of mechanisms and sources of random variation of the system under investigation.
In this paper, we show that the causal hierarchy theorem (Thm. 1, Bareinboim et al., 2020) still holds for neural models.
We introduce a special type of SCM called a neural causal model (NCM), and formalize a new type of inductive bias to encode structural constraints necessary for performing causal inferences.
arXiv Detail & Related papers (2021-07-02T01:55:18Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
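The hybrid idea, an imperfect knowledge-based model plus a learned correction fitted from noisy measurements, can be sketched as follows (the linear toy dynamics and the one-parameter correction are illustrative assumptions, not the paper's method):

```python
import numpy as np

# Imperfect knowledge-based model of a decaying signal: wrong decay rate.
def knowledge_model(x):
    return 0.8 * x          # true dynamics: x_next = 0.9 * x

# "Train" a correction from noisy measurements by least squares.
rng = np.random.default_rng(0)
x = rng.uniform(0.5, 1.5, size=100)                          # observed states
x_next_measured = 0.9 * x + rng.normal(0.0, 0.01, size=100)  # noisy truth
residual = x_next_measured - knowledge_model(x)
# Fit residual ≈ a * x (a one-parameter stand-in for the ML component).
a = np.sum(residual * x) / np.sum(x * x)

def hybrid_model(x):
    """Knowledge-based prediction plus the learned correction."""
    return knowledge_model(x) + a * x
```

The fitted coefficient recovers roughly the 0.1 discrepancy between the imperfect model and the true dynamics, so the hybrid forecast outperforms the knowledge-based model alone.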
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- On the application of Physically-Guided Neural Networks with Internal Variables to Continuum Problems
We present Physically-Guided Neural Networks with Internal Variables (PGNNIV):
universal physical laws are used as constraints in the neural network, in such a way that some neuron values can be interpreted as the internal state variables of the system.
This endows the network with unraveling capacity, as well as better predictive properties such as faster convergence, reduced data needs, and additional noise filtering.
We extend this new methodology to continuum physical problems, showing again its predictive and explanatory capacities when only using measurable values in the training set.
arXiv Detail & Related papers (2020-11-23T13:06:52Z)
- Parsimonious neural networks learn interpretable physical laws
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
- Incorporating physical constraints in a deep probabilistic machine learning framework for coarse-graining dynamical systems
This paper offers a data-based, probabilistic perspective that enables the quantification of predictive uncertainties.
We formulate the coarse-graining process by employing a probabilistic state-space model.
It is capable of reconstructing the evolution of the full, fine-scale system.
arXiv Detail & Related papers (2019-12-30T16:07:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.