Gaussian Process Port-Hamiltonian Systems: Bayesian Learning with
Physics Prior
- URL: http://arxiv.org/abs/2305.09017v1
- Date: Mon, 15 May 2023 20:59:41 GMT
- Title: Gaussian Process Port-Hamiltonian Systems: Bayesian Learning with
Physics Prior
- Authors: Thomas Beckers, Jacob Seidman, Paris Perdikaris, George J. Pappas
- Abstract summary: Data-driven approaches achieve remarkable results for the modeling of complex dynamics based on collected data.
These models often neglect basic physical principles which determine the behavior of any real-world system.
We propose a physics-informed Bayesian learning approach with uncertainty quantification.
- Score: 17.812064311297117
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Data-driven approaches achieve remarkable results for the modeling of complex
dynamics based on collected data. However, these models often neglect basic
physical principles which determine the behavior of any real-world system. This
omission is unfavorable in two ways: the models are less data-efficient than
they could be with physical prior knowledge incorporated, and the learned model
itself might not be physically correct. We propose Gaussian Process Port-Hamiltonian
systems (GP-PHS) as a physics-informed Bayesian learning approach with
uncertainty quantification. The Bayesian nature of GP-PHS uses collected data
to form a distribution over all possible Hamiltonians instead of a single point
estimate. Due to the underlying physics model, a GP-PHS generates passive
systems with respect to designated inputs and outputs. Further, the proposed
approach preserves the compositional nature of Port-Hamiltonian systems.
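
To make this concrete, the following sketch illustrates the ingredient the abstract describes: a distribution over Hamiltonians pushed through the port-Hamiltonian structure dx/dt = (J - R) ∇H(x) + G u, so that every realization is passive with respect to the designated ports. It is a minimal illustration rather than the paper's algorithm: the GP prior over H is approximated with random Fourier features, no conditioning on data is performed, and the J, R, G matrices and the input u(t) are placeholder choices.

```python
# Minimal sketch (not the paper's algorithm): draw Hamiltonians from a GP prior
# via random Fourier features and push each sample through the port-Hamiltonian
# structure  x_dot = (J - R) grad H(x) + G u(t),  y = G^T grad H(x),
# so every sampled model is passive by construction.
import jax
import jax.numpy as jnp
from jax.experimental.ode import odeint

key = jax.random.PRNGKey(0)
dim, n_feat, lengthscale = 2, 100, 1.0           # state dim, feature count, SE lengthscale

# Random Fourier features approximate a squared-exponential GP prior over H.
w_key, b_key, a_key = jax.random.split(key, 3)
W = jax.random.normal(w_key, (n_feat, dim)) / lengthscale
b = jax.random.uniform(b_key, (n_feat,), minval=0.0, maxval=2 * jnp.pi)
alpha = jax.random.normal(a_key, (n_feat,)) * jnp.sqrt(2.0 / n_feat)

def hamiltonian(x):
    """One sample from the GP prior over H; a posterior would condition alpha on data."""
    return jnp.dot(alpha, jnp.cos(W @ x + b))

J = jnp.array([[0.0, 1.0], [-1.0, 0.0]])         # skew-symmetric interconnection
R = jnp.array([[0.0, 0.0], [0.0, 0.1]])          # positive semi-definite dissipation
G = jnp.array([[0.0], [1.0]])                    # input/output port matrix

def dynamics(x, t):
    u = jnp.array([0.1 * jnp.sin(t)])            # hypothetical port input
    return (J - R) @ jax.grad(hamiltonian)(x) + G @ u

ts = jnp.linspace(0.0, 20.0, 200)
xs = odeint(dynamics, jnp.array([1.0, 0.0]), ts)               # one trajectory sample
ys = jax.vmap(lambda x: G.T @ jax.grad(hamiltonian)(x))(xs)    # port output y = G^T grad H
```

Repeating the simulation with fresh draws of alpha yields the distribution over trajectories used for uncertainty quantification; conditioning alpha on observed data would turn this prior sample into a GP-PHS posterior sample.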
Related papers
- Physics-Constrained Learning for PDE Systems with Uncertainty Quantified Port-Hamiltonian Models [0.7350858947639451]
We propose a physics-constrained learning method that combines powerful learning tools and reliable physical models.
Based on the Bayesian nature of the Gaussian process, we not only learn the dynamics of the system, but also enable uncertainty quantification.
arXiv Detail & Related papers (2024-06-17T17:52:01Z)
- Stochastic Inference of Plate Bending from Heterogeneous Data: Physics-informed Gaussian Processes via Kirchhoff-Love Theory [0.0]
We propose an inference methodology for classical Kirchhoff-Love plates via physics-informed Gaussian Processes (GP).
A probabilistic model is formulated as a multi-output GP by placing a GP prior on the deflection and deriving the covariance function using the linear differential operators of the plate governing equations.
We demonstrate the applicability with two examples: a supported plate subjected to a sinusoidal load and a fixed plate subjected to a uniform load.
arXiv Detail & Related papers (2024-05-21T13:53:58Z)
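
The mechanism in that construction, shared by many physics-informed GP models, is that a linear differential operator applied to a GP yields another GP, so the covariances linking the field to the source term follow by differentiating the kernel. The sketch below shows the pattern with automatic differentiation, using a 1D Euler-Bernoulli beam (EI w'''' = q) as a simplified stand-in for the plate's biharmonic operator; the kernel, the stiffness value, and the 1D reduction are illustrative assumptions, not details taken from the cited paper.

```python
# Sketch of "derive covariances by applying the governing operator to the kernel",
# reduced from the Kirchhoff-Love plate to a 1D Euler-Bernoulli beam for brevity.
import jax
import jax.numpy as jnp

EI, lengthscale = 1.0, 0.5   # hypothetical bending stiffness and SE lengthscale

def k(x, xp):
    """Squared-exponential prior covariance on the deflection w."""
    return jnp.exp(-0.5 * (x - xp) ** 2 / lengthscale ** 2)

def nth_grad(f, n, argnum):
    """n-th partial derivative of a scalar function via repeated autodiff."""
    for _ in range(n):
        f = jax.grad(f, argnums=argnum)
    return f

# With the operator L = EI d^4/dx^4 and load q = L w, the GP prior on w induces
#   cov(q(x), w(x')) = L_x k(x, x')   and   cov(q(x), q(x')) = L_x L_x' k(x, x').
k_qw = lambda x, xp: EI * nth_grad(k, 4, 0)(x, xp)
k_qq = lambda x, xp: EI ** 2 * nth_grad(nth_grad(k, 4, 0), 4, 1)(x, xp)

# These blocks assemble the multi-output GP over (w, q), so deflection and load
# measurements can be fused in a single Gaussian posterior.
print(k_qw(0.1, 0.3), k_qq(0.1, 0.3))
```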
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z)
- Lie-Poisson Neural Networks (LPNets): Data-Based Computing of Hamiltonian Systems with Symmetries [0.0]
An accurate data-based prediction of the long-term evolution of Hamiltonian systems requires a network that preserves the appropriate structure under each time step.
We present two flavors of such systems: one, where the parameters of transformations are computed from data using a dense neural network (LPNets), and another, where the composition of transformations is used as building blocks (G-LPNets).
The resulting methods are important for the construction of accurate data-based methods for simulating the long-term dynamics of physical systems.
arXiv Detail & Related papers (2023-08-29T14:45:23Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
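
The workflow that entry describes, training a differentiable surrogate of the simulator once and then using automatic differentiation to invert it against measured data, reduces to the pattern sketched below. The surrogate here is an untrained toy MLP (the step of training it to mimic simulations from the model Hamiltonian is omitted) and the "experimental" spectrum is generated from the same network, so everything except the autodiff parameter-recovery loop is a placeholder assumption.

```python
# Sketch of differentiable-surrogate parameter recovery: fit model parameters to
# data by gradient descent through a (here untrained, purely illustrative) surrogate.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
W1, b1 = 0.1 * jax.random.normal(k1, (32, 2)), jnp.zeros(32)
W2, b2 = 0.1 * jax.random.normal(k2, (64, 32)), jnp.zeros(64)   # 64 "detector bins"

def surrogate(hamiltonian_params):
    """Maps Hamiltonian parameters (e.g. two exchange couplings) to a spectrum."""
    h = jnp.tanh(W1 @ hamiltonian_params + b1)
    return W2 @ h + b2

y_exp = surrogate(jnp.array([1.2, -0.4]))        # stand-in for experimental data

def misfit(p):
    return jnp.mean((surrogate(p) - y_exp) ** 2)

grad_misfit = jax.jit(jax.grad(misfit))
p = jnp.array([0.5, 0.5])                        # initial parameter guess
for _ in range(500):                             # plain gradient descent for brevity
    p = p - 0.1 * grad_misfit(p)
```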
- Port-Hamiltonian Neural Networks with State Dependent Ports [58.720142291102135]
We stress-test the method on both simple mass-spring systems and more complex and realistic systems with several internal and external forces.
Port-Hamiltonian neural networks can be extended to larger dimensions with state-dependent ports.
We propose a symmetric high-order integrator for improved training on sparse and noisy data.
arXiv Detail & Related papers (2022-06-06T14:57:25Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Gaussian processes meet NeuralODEs: A Bayesian framework for learning the dynamics of partially observed systems from scarce and noisy data [0.0]
This paper presents a machine learning framework (GP-NODE) for Bayesian systems identification from partial, noisy and irregular observations of nonlinear dynamical systems.
The proposed method takes advantage of recent developments in differentiable programming to propagate gradient information through ordinary differential equation solvers.
A series of numerical studies is presented to demonstrate the effectiveness of the proposed GP-NODE method including predator-prey systems, systems biology, and a 50-dimensional human motion dynamical system.
arXiv Detail & Related papers (2021-03-04T23:42:14Z)
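
The enabling building block mentioned in that entry, gradient information propagated through an ordinary differential equation solver, can be sketched in a few lines. The example fits Lotka-Volterra (predator-prey) parameters to synthetic noisy observations by differentiating through jax.experimental.ode.odeint; it covers only the differentiable-solver part, not the full GP-NODE method (which additionally places GP priors over unobserved states and performs Bayesian inference), and the equations, noise level, and optimizer are assumptions made for illustration.

```python
# Gradients through an ODE solver: recover Lotka-Volterra parameters from noisy data.
import jax
import jax.numpy as jnp
from jax.experimental.ode import odeint

def lotka_volterra(z, t, theta):
    x, y = z
    a, b, c, d = theta
    return jnp.array([a * x - b * x * y, c * x * y - d * y])

ts = jnp.linspace(0.0, 10.0, 50)
true_theta = jnp.array([1.0, 0.5, 0.3, 0.8])
data = odeint(lotka_volterra, jnp.array([2.0, 1.0]), ts, true_theta)
data = data + 0.05 * jax.random.normal(jax.random.PRNGKey(1), data.shape)  # noisy observations

def loss(theta):
    pred = odeint(lotka_volterra, jnp.array([2.0, 1.0]), ts, theta)
    return jnp.mean((pred - data) ** 2)

theta = jnp.array([0.8, 0.4, 0.4, 0.6])          # initial guess
grad_loss = jax.jit(jax.grad(loss))              # adjoint-based gradients through the solver
for _ in range(300):                             # plain gradient steps; a real run would use Adam
    theta = theta - 1e-2 * grad_loss(theta)
```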
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consist of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Augmenting Physical Models with Deep Networks for Complex Dynamics Forecasting [34.61959169976758]
APHYNITY is a principled approach for augmenting incomplete physical dynamics described by differential equations with deep data-driven models.
It decomposes the dynamics into two components: a physical component accounting for the part of the dynamics for which prior knowledge is available, and a data-driven component accounting for the errors of the physical model (a minimal sketch of this split follows the entry).
arXiv Detail & Related papers (2020-10-09T09:31:03Z)
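
As a closing illustration, the APHYNITY-style decomposition in the last entry can be sketched as a vector field that sums an incomplete physical model and a learned residual, with the residual's magnitude penalized so the physics explains as much of the dynamics as possible. The frictionless-pendulum physics term, the synthetic damped data, the tiny MLP, and the penalty weight are illustrative assumptions, not the paper's implementation.

```python
# Schematic of a physics + residual decomposition with a penalty on the residual.
import jax
import jax.numpy as jnp
from jax.experimental.ode import odeint

def f_phys(x, omega2):
    """Frictionless pendulum (linearized): the 'known but incomplete' physics."""
    theta, dtheta = x
    return jnp.array([dtheta, -omega2 * theta])

def f_nn(x, params):
    """Small MLP residual that absorbs unmodeled effects (e.g. friction)."""
    W1, b1, W2, b2 = params
    return W2 @ jnp.tanh(W1 @ x + b1) + b2

def f_total(x, t, omega2, params):
    return f_phys(x, omega2) + f_nn(x, params)

def loss(omega2, params, ts, traj_obs, lam=1e-2):
    pred = odeint(f_total, traj_obs[0], ts, omega2, params)
    fit = jnp.mean((pred - traj_obs) ** 2)
    residual = jnp.mean(jax.vmap(lambda x: jnp.sum(f_nn(x, params) ** 2))(pred))
    return fit + lam * residual                  # keep the data-driven share small

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (0.1 * jax.random.normal(k1, (16, 2)), jnp.zeros(16),
          0.1 * jax.random.normal(k2, (2, 16)), jnp.zeros(2))
ts = jnp.linspace(0.0, 10.0, 100)
traj_obs = odeint(lambda x, t: jnp.array([x[1], -1.5 * x[0] - 0.2 * x[1]]),
                  jnp.array([1.0, 0.0]), ts)     # synthetic damped-pendulum observations
grads = jax.grad(loss, argnums=(0, 1))(1.0, params, ts, traj_obs)  # joint training signal
```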