Physics-informed Gaussian process model for Euler-Bernoulli beam
elements
- URL: http://arxiv.org/abs/2308.02894v1
- Date: Sat, 5 Aug 2023 14:48:37 GMT
- Title: Physics-informed Gaussian process model for Euler-Bernoulli beam
elements
- Authors: Gledson Rodrigo Tondo and Sebastian Rau and Igor Kavrakov and Guido
Morgenthal
- Abstract summary: A physics-informed machine learning model is formulated using the Euler-Bernoulli beam equation.
The model can be used to regress the analytical value of the structure's bending stiffness, interpolate responses, and make probabilistic inferences on latent physical quantities.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A physics-informed machine learning model, in the form of a multi-output
Gaussian process, is formulated using the Euler-Bernoulli beam equation. Given
appropriate datasets, the model can be used to regress the analytical value of
the structure's bending stiffness, interpolate responses, and make
probabilistic inferences on latent physical quantities. The developed model is
applied on a numerically simulated cantilever beam, where the regressed bending
stiffness is evaluated and the influence of measurement noise on the prediction
quality is investigated. Further, the regressed probabilistic stiffness
distribution is used in a structural health monitoring context, where the
Mahalanobis distance is employed to reason about the possible location and
extent of damage in the structural system. To validate the developed framework,
an experiment is conducted and measured heterogeneous datasets are used to
update the assumed analytical structural model.
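As a concrete illustration of the damage-reasoning step described above, the Mahalanobis distance measures how far a regressed stiffness sample lies from a healthy-state reference distribution. The sketch below is a toy example with hypothetical stiffness values and covariances, not the paper's code:

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of sample x from a reference N(mean, cov)."""
    diff = x - mean
    # Solve cov @ z = diff instead of explicitly inverting the covariance
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

# Hypothetical healthy-state stiffness statistics at two beam sections [N*m^2]
mean = np.array([2.1e7, 2.1e7])
cov = np.array([[1e10, 0.0],
                [0.0, 1e10]])

healthy = np.array([2.11e7, 2.09e7])   # close to the reference
damaged = np.array([1.60e7, 2.10e7])   # reduced stiffness at section 1

print(mahalanobis(healthy, mean, cov))  # small distance
print(mahalanobis(damaged, mean, cov))  # large distance -> flags section 1
```

A large distance at a section suggests its regressed stiffness is inconsistent with the healthy reference, which is how location and extent of damage can be reasoned about.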
Related papers
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Stochastic stiffness identification and response estimation of Timoshenko beams via physics-informed Gaussian processes [0.0]
This paper presents a physics-informed Gaussian process (GP) model for Timoshenko beam elements.
The proposed approach is effective at identifying structural parameters and is capable of fusing data from heterogeneous and multi-fidelity sensors.
arXiv Detail & Related papers (2023-09-21T08:22:12Z)
- A physics-informed machine learning model for reconstruction of dynamic loads [0.0]
This paper presents a physics-informed machine-learning framework for reconstructing dynamic forces based on measured deflections, velocities, or accelerations.
The framework can work with incomplete and contaminated data and offers a natural regularization approach to account for measurement noise.
Applications of the developed framework include the validation of design models and assumptions, as well as the prognosis of responses to assist in damage detection and health monitoring.
arXiv Detail & Related papers (2023-08-15T18:33:58Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
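The parameter-recovery idea described above can be sketched in miniature: differentiate a forward model with respect to its unknown parameter and descend the gradient of the misfit to the measured data. The example below uses a hypothetical one-parameter model y = a * x**2 with an analytically written gradient, not the paper's Hamiltonian model or framework:

```python
import numpy as np

# Toy illustration: recover an unknown model parameter from noisy
# "experimental" data by gradient descent on the mean squared error.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
a_true = 3.0
y_obs = a_true * x**2 + 0.01 * rng.standard_normal(x.size)

a = 0.5   # initial guess for the unknown parameter
lr = 0.5  # step size
for _ in range(200):
    resid = a * x**2 - y_obs             # model prediction minus data
    grad = 2.0 * np.mean(resid * x**2)   # d/da of the mean squared error
    a -= lr * grad

print(a)  # converges close to a_true = 3.0
```

In the paper's setting the analytic gradient is replaced by automatic differentiation through a trained surrogate network, which is what makes the fit fast enough to run in real time.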
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Generative models and Bayesian inversion using Laplace approximation [0.3670422696827525]
Recently, inverse problems were solved using generative models as highly informative priors.
We show that derived Bayes estimates are consistent, in contrast to the approach employing the low-dimensional manifold of the generative model.
arXiv Detail & Related papers (2022-03-15T10:05:43Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive an Evidence Lower Bound (ELBO) for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Uncertainty quantification in a mechanical submodel driven by a Wasserstein-GAN [0.0]
We show that the use of non-linear techniques in machine learning and data-driven methods is highly relevant.
Generative Adversarial Networks (GANs) are suited for such applications, where the Wasserstein-GAN with gradient penalty variant offers improved results.
arXiv Detail & Related papers (2021-10-26T13:18:06Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role stochasticity plays in its success remains unclear.
We show that heavy-tailed multiplicative noise commonly arises in the parameter dynamics due to minibatch variance.
A detailed analysis is conducted describing how key factors, including step size and data properties, shape this noise, with similar behavior observed in state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences arising from its use.