Learning Dynamics from Input-Output Data with Hamiltonian Gaussian Processes
- URL: http://arxiv.org/abs/2511.05330v1
- Date: Fri, 07 Nov 2025 15:28:08 GMT
- Title: Learning Dynamics from Input-Output Data with Hamiltonian Gaussian Processes
- Authors: Jan-Hendrik Ewering, Robin E. Herrmann, Niklas Wahlström, Thomas B. Schön, Thomas Seel
- Abstract summary: We consider dynamics learning with non-conservative Hamiltonian GPs. We provide a fully Bayesian scheme for estimating probability densities of unknown hidden states. Considering the computational complexity of GPs, we take advantage of a reduced-rank GP approximation.
- Score: 10.748284100326794
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Embedding non-restrictive prior knowledge, such as energy conservation laws, in learning-based approaches is a key motivation for constructing physically consistent models from limited data, relevant for, e.g., model-based control. Recent work incorporates Hamiltonian dynamics into Gaussian Process (GP) regression to obtain uncertainty-quantifying models that adhere to the underlying physical principles. However, these works rely on velocity or momentum data, which is rarely available in practice. In this paper, we consider dynamics learning with non-conservative Hamiltonian GPs, and address the more realistic problem setting of learning from input-output data. We provide a fully Bayesian scheme for estimating probability densities of unknown hidden states, of GP hyperparameters, as well as of structural hyperparameters, such as damping coefficients. Considering the computational complexity of GPs, we take advantage of a reduced-rank GP approximation and leverage its properties for computationally efficient prediction and training. The proposed method is evaluated in a nonlinear simulation case study and compared to a state-of-the-art approach that relies on momentum measurements.
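The reduced-rank GP approximation mentioned in the abstract can be illustrated with a minimal 1-D sketch in the style of Hilbert-space basis expansions (Solin & Särkkä): the kernel is approximated by a finite set of Laplacian eigenfunctions weighted by the kernel's spectral density, so training costs O(nm²) instead of O(n³). All function names, hyperparameter values, and the squared-exponential kernel choice below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def basis(x, j, L):
    """Laplacian eigenfunction j on [-L, L] with Dirichlet boundary conditions."""
    return np.sqrt(1.0 / L) * np.sin(np.pi * j * (x + L) / (2.0 * L))

def spectral_density_se(w, sigma2, ell):
    """Spectral density of the 1-D squared-exponential kernel at frequency w."""
    return sigma2 * np.sqrt(2.0 * np.pi) * ell * np.exp(-0.5 * (ell * w) ** 2)

def reduced_rank_gp_fit(x, y, m=16, L=5.0, sigma2=1.0, ell=0.8, noise=1e-2):
    """Posterior mean of the basis weights; cost is O(n m^2), not O(n^3)."""
    js = np.arange(1, m + 1)
    Phi = np.column_stack([basis(x, j, L) for j in js])             # (n, m) features
    lam = spectral_density_se(np.pi * js / (2.0 * L), sigma2, ell)  # prior weight variances
    A = Phi.T @ Phi + noise * np.diag(1.0 / lam)                    # (m, m) system
    w_mean = np.linalg.solve(A, Phi.T @ y)
    return js, w_mean, L

def reduced_rank_gp_predict(xs, js, w_mean, L):
    Phi_s = np.column_stack([basis(xs, j, L) for j in js])
    return Phi_s @ w_mean

# Toy data: noisy sine observations.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(200)
js, w, L = reduced_rank_gp_fit(x, y)
pred = reduced_rank_gp_predict(np.array([0.0, 1.5]), js, w, L)
```

Because the approximation is linear in a fixed feature set, the same machinery also makes state estimation and hyperparameter inference cheaper, which is what the abstract leverages for prediction and training.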
Related papers
- Plug-and-Play Physics-informed Learning using Uncertainty Quantified Port-Hamiltonian Models [5.1732651331429516]
We introduce a Plug-and-Play Physics-Informed Machine Learning (PIML) framework to address this challenge. Our method employs conformal prediction to identify outlier dynamics and switches from a nominal predictor to a physics-consistent model. In this way, the proposed framework produces reliable physics-informed predictions even for out-of-distribution scenarios.
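The switching mechanism described above can be sketched with split conformal prediction: a threshold is calibrated on in-distribution residuals, and the nominal predictor is replaced by the physics-consistent fallback whenever a new residual exceeds it. The function names, the one-step residual check, and the stand-in predictors are assumptions for illustration, not the paper's API.

```python
import numpy as np

def conformal_threshold(calib_residuals, alpha=0.1):
    """Split-conformal quantile: with probability >= 1 - alpha, a new
    in-distribution residual falls below this threshold."""
    n = len(calib_residuals)
    q = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(calib_residuals, min(q, 1.0))

def predict_with_fallback(x, nominal, physics, last_obs, threshold):
    """Use the nominal model unless its residual flags outlier dynamics."""
    y_nom = nominal(x)
    if abs(y_nom - last_obs) > threshold:  # nonconformity -> out-of-distribution
        return physics(x)                  # switch to physics-consistent model
    return y_nom

# Example: calibration residuals from an in-distribution validation set.
thr = conformal_threshold(
    np.array([0.05, 0.07, 0.09, 0.10, 0.11, 0.12, 0.14, 0.15, 0.18, 0.20]),
    alpha=0.2)
nominal = lambda x: x          # stand-in nominal predictor
physics = lambda x: 0.9 * x    # stand-in physics-consistent fallback
y_in = predict_with_fallback(1.0, nominal, physics, last_obs=1.05, threshold=thr)
y_out = predict_with_fallback(1.0, nominal, physics, last_obs=3.00, threshold=thr)
```

The conformal guarantee only bounds how often in-distribution inputs are misclassified as outliers; the quality of the fallback prediction itself comes from the physics-consistent model.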
arXiv Detail & Related papers (2025-04-24T22:25:51Z) - Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z) - Streamflow Prediction with Uncertainty Quantification for Water Management: A Constrained Reasoning and Learning Approach [27.984958596544278]
This paper studies a constrained reasoning and learning (CRL) approach where physical laws represented as logical constraints are integrated as a layer in the deep neural network.
To address the small-data setting, we develop a theoretically grounded training approach to improve the generalization accuracy of deep models.
arXiv Detail & Related papers (2024-05-31T18:53:53Z) - Self-Consistency Training for Density-Functional-Theory Hamiltonian Prediction [74.84850523400873]
We show that Hamiltonian prediction possesses a self-consistency principle, based on which we propose self-consistency training.
It enables the model to be trained on a large amount of unlabeled data, hence addresses the data scarcity challenge.
It is more efficient than running DFT to generate labels for supervised training, since it amortizes DFT calculation over a set of queries.
arXiv Detail & Related papers (2024-03-14T16:52:57Z) - Structure-Preserving Learning Using Gaussian Processes and Variational
Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
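The combination described above can be sketched with a symplectic Störmer-Verlet step for the nominal mechanical dynamics, plus a learned residual force added to the accelerations. The residual below is a plain placeholder standing in for a GP posterior mean; step sizes and the harmonic-oscillator example are illustrative assumptions.

```python
import numpy as np

def stormer_verlet_step(q, p, h, force, mass=1.0):
    """One step of the symplectic Stormer-Verlet scheme for H = p^2/(2m) + V(q),
    where force(q) = -dV/dq."""
    p_half = p + 0.5 * h * force(q)
    q_new = q + h * p_half / mass
    p_new = p_half + 0.5 * h * force(q_new)
    return q_new, p_new

def simulate(q0, p0, h, n_steps, nominal_force, residual=lambda q: 0.0):
    """Integrate nominal dynamics plus a learned residual force (here: a
    placeholder for a GP posterior mean)."""
    combined = lambda q: nominal_force(q) + residual(q)
    traj = [(q0, p0)]
    for _ in range(n_steps):
        q0, p0 = stormer_verlet_step(q0, p0, h, combined)
        traj.append((q0, p0))
    return traj

# Harmonic oscillator with nominal force -q; after one period the state
# should return close to the initial condition.
traj = simulate(q0=1.0, p0=0.0, h=0.01, n_steps=int(2 * np.pi / 0.01),
                nominal_force=lambda q: -q)
q_final, p_final = traj[-1]
```

Keeping the integrator variational preserves the symplectic structure of the nominal system, so long-horizon energy behavior stays well-behaved even when the residual model is imperfect.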
arXiv Detail & Related papers (2021-12-10T11:09:29Z) - Hybrid Gaussian Process Modeling Applied to Economic Stochastic Model
Predictive Control of Batch Processes [0.0]
While plant models can often be determined from first principles, parts of the model are difficult to derive using physical laws alone.
This paper exploits GPs to model the parts of the dynamic system that are difficult to describe using first principles.
It is vital to account for this uncertainty in the control algorithm, to prevent constraint violations and performance deterioration.
arXiv Detail & Related papers (2021-08-14T00:01:42Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Gaussian Process-based Min-norm Stabilizing Controller for
Control-Affine Systems with Uncertain Input Effects and Dynamics [90.81186513537777]
We propose a novel compound kernel that captures the control-affine nature of the problem.
We show that this resulting optimization problem is convex, and we call it Gaussian Process-based Control Lyapunov Function Second-Order Cone Program (GP-CLF-SOCP)
arXiv Detail & Related papers (2020-11-14T01:27:32Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but its role in this success is still unclear.
We show that heavy-tailed behavior commonly arises in the parameters due to multiplicative noise induced by variance in the data.
A detailed analysis is conducted describing key factors, including step size and data, with similar results exhibited on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Learning Constrained Dynamics with Gauss Principle adhering Gaussian
Processes [7.643999306446022]
We propose to combine insights from analytical mechanics with Gaussian process regression to improve the model's data efficiency and constraint integrity.
Our model enables inference of the acceleration of the unconstrained system from data of the constrained system.
arXiv Detail & Related papers (2020-04-23T15:26:51Z) - Transport Gaussian Processes for Regression [0.22843885788439797]
We propose a methodology to construct stochastic processes, which includes GPs, warped GPs, Student-t processes and several others.
Our approach is inspired by layer-based models, where each proposed layer changes a specific property of the generated process.
We validate the proposed model through experiments with real-world data.
arXiv Detail & Related papers (2020-01-30T17:44:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.