Physics-Informed Variational State-Space Gaussian Processes
- URL: http://arxiv.org/abs/2409.13876v1
- Date: Fri, 20 Sep 2024 20:12:11 GMT
- Title: Physics-Informed Variational State-Space Gaussian Processes
- Authors: Oliver Hamelijnck, Arno Solin, Theodoros Damoulas
- Abstract summary: We introduce a variational spatio-temporal state-space GP that handles linear and non-linear physical constraints while achieving efficient linear-in-time computation costs.
We demonstrate our methods in a range of synthetic and real-world settings and outperform the current state-of-the-art in both predictive and computational performance.
- Score: 23.57905861783904
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Differential equations are important mechanistic models that are integral to many scientific and engineering applications. With the abundance of available data there has been a growing interest in data-driven physics-informed models. Gaussian processes (GPs) are particularly suited to this task as they can model complex, non-linear phenomena whilst incorporating prior knowledge and quantifying uncertainty. Current approaches have found some success but are limited as they either achieve poor computational scalings or focus only on the temporal setting. This work addresses these issues by introducing a variational spatio-temporal state-space GP that handles linear and non-linear physical constraints while achieving efficient linear-in-time computation costs. We demonstrate our methods in a range of synthetic and real-world settings and outperform the current state-of-the-art in both predictive and computational performance.
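To make the "linear-in-time" claim concrete, below is a minimal, generic state-space GP sketch: a Matérn-3/2 prior rewritten as a linear SDE and evaluated with a Kalman filter in O(T). It is only an illustration of the family of methods the paper builds on, not the authors' variational spatio-temporal model; the kernel choice and hyperparameters are assumptions.

```python
# Minimal sketch (assumptions: Matern-3/2 kernel, 1D inputs, Gaussian noise).
# Illustrates the generic linear-in-time state-space GP recursion; it is NOT
# the paper's variational spatio-temporal model.
import numpy as np
from scipy.linalg import expm

def matern32_ss(lengthscale, variance):
    lam = np.sqrt(3.0) / lengthscale
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])   # SDE drift matrix
    Pinf = np.diag([variance, lam**2 * variance])        # stationary covariance
    H = np.array([[1.0, 0.0]])                           # observe the function value
    return F, Pinf, H

def kalman_nll(t, y, lengthscale=1.0, variance=1.0, noise=0.1):
    F, Pinf, H = matern32_ss(lengthscale, variance)
    m, P = np.zeros((2, 1)), Pinf.copy()
    nll, prev_t = 0.0, t[0]
    for tk, yk in zip(t, y):
        A = expm(F * (tk - prev_t))                      # discrete-time transition
        Q = Pinf - A @ Pinf @ A.T                        # process noise covariance
        m, P = A @ m, A @ P @ A.T + Q                    # predict step
        S = H @ P @ H.T + noise                          # innovation variance
        v = yk - (H @ m)[0, 0]                           # innovation
        K = P @ H.T / S                                  # Kalman gain
        m, P = m + K * v, P - K @ H @ P                  # update step
        nll += 0.5 * (np.log(2 * np.pi * S[0, 0]) + v**2 / S[0, 0])
        prev_t = tk
    return nll                                           # one pass: O(T) total cost

t = np.linspace(0, 10, 500)
y = np.sin(t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
print(kalman_nll(t, y))
```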
Related papers
- Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference [55.150117654242706]
We show that model selection for computation-aware GPs trained on 1.8 million data points can be done within a few hours on a single GPU.
As a result of this work, Gaussian processes can be trained on large-scale datasets without significantly compromising their ability to quantify uncertainty.
arXiv Detail & Related papers (2024-11-01T21:11:48Z) - Data-Driven Computing Methods for Nonlinear Physics Systems with Geometric Constraints [0.7252027234425334]
This paper introduces a novel, data-driven framework that synergizes physics-based priors with advanced machine learning techniques.
Our framework showcases four algorithms, each embedding a specific physics-based prior tailored to a particular class of nonlinear systems.
The integration of these priors also enhances the expressive power of neural networks, enabling them to capture complex patterns typical in physical phenomena.
arXiv Detail & Related papers (2024-06-20T23:10:41Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators provide a principled AI framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - On the Integration of Physics-Based Machine Learning with Hierarchical
Bayesian Modeling Techniques [0.0]
This paper proposes to embed mechanics-based models into the mean function of a Gaussian Process (GP) model and characterize potential discrepancies through kernel machines.
The stationarity of the kernel function is a difficult hurdle in the sequential processing of long data sets, resolved through hierarchical Bayesian techniques.
Using numerical and experimental examples, potential applications of the proposed method to structural dynamics inverse problems are demonstrated.
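As an illustration of the mean-function idea, here is a minimal sketch (not the paper's hierarchical Bayesian treatment): a hypothetical mechanistic model m_phys serves as the GP mean and an RBF kernel absorbs the model discrepancy; the mechanistic model, kernel, and noise level are assumptions.

```python
# Minimal sketch: GP regression with a physics-based mean function and an RBF
# kernel for the discrepancy. Not the paper's hierarchical Bayesian method.
import numpy as np

def m_phys(x):
    # Hypothetical mechanistic model, e.g. a damped oscillator response.
    return np.exp(-0.3 * x) * np.cos(2.0 * x)

def rbf(a, b, ls=0.7, var=0.2):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

rng = np.random.default_rng(0)
x_train = np.linspace(0, 6, 40)
y_train = m_phys(x_train) + 0.1 * np.sin(3 * x_train) + 0.05 * rng.standard_normal(40)

noise = 0.05 ** 2
K = rbf(x_train, x_train) + noise * np.eye(40)
alpha = np.linalg.solve(K, y_train - m_phys(x_train))   # residuals around the physics mean

x_test = np.linspace(0, 8, 100)
# Posterior mean = physics prediction + GP correction for the model discrepancy.
mean_post = m_phys(x_test) + rbf(x_test, x_train) @ alpha
print(mean_post[:3])
```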
arXiv Detail & Related papers (2023-03-01T02:29:41Z) - Neural Operator: Is data all you need to model the world? An insight
into the impact of Physics Informed Machine Learning [13.050410285352605]
We provide an insight into how data-driven approaches can complement conventional techniques to solve engineering and physics problems.
We highlight a novel and fast machine learning-based approach to learning the solution operator of a PDE, known as operator learning.
arXiv Detail & Related papers (2023-01-30T23:29:33Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
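A minimal sketch of the spatial-encoding ingredient only (not the paper's full RNN-DCT network): a 2D field is compressed to its low-frequency DCT coefficients, the kind of compact representation a recurrent network could then evolve in time. The synthetic field, grid size, and truncation level are illustrative assumptions.

```python
# Minimal sketch: encode a 2D spatial field with a discrete cosine transform and
# keep only an 8x8 block of low-frequency coefficients.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 64)
field = np.sin(x)[:, None] * np.cos(x)[None, :] + 0.01 * rng.standard_normal((64, 64))

coeffs = dctn(field, norm="ortho")        # full DCT spectrum
k = 8                                     # keep an 8x8 low-frequency block
compressed = np.zeros_like(coeffs)
compressed[:k, :k] = coeffs[:k, :k]
recon = idctn(compressed, norm="ortho")   # decode back to the spatial domain

print(np.abs(field - recon).max())        # small reconstruction error for smooth fields
```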
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate models for science and engineering applications.
arXiv Detail & Related papers (2022-02-17T07:56:46Z) - Compositional Modeling of Nonlinear Dynamical Systems with ODE-based
Random Features [0.0]
We present a novel, domain-agnostic approach to tackling this problem.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
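A hedged sketch of the general recipe such features plug into, random-feature ridge regression: sinusoids (solutions of a simple harmonic oscillator) stand in for the paper's ODE-derived features, and the frequency distribution, phases, and ridge penalty are assumptions.

```python
# Minimal sketch: ridge regression on random sinusoidal features, a stand-in for
# the paper's ODE-derived random features.
import numpy as np

rng = np.random.default_rng(0)
t_train = np.sort(rng.uniform(0, 10, 80))
y_train = np.sin(1.5 * t_train) + 0.1 * rng.standard_normal(80)

n_feat = 200
omega = rng.normal(0.0, 2.0, n_feat)       # random frequencies
phase = rng.uniform(0, 2 * np.pi, n_feat)  # random phases

def features(t):
    return np.sqrt(2.0 / n_feat) * np.cos(np.outer(t, omega) + phase)

Phi = features(t_train)                                    # (80, n_feat) feature matrix
w = np.linalg.solve(Phi.T @ Phi + 1e-2 * np.eye(n_feat),   # ridge regression weights
                    Phi.T @ y_train)

t_test = np.linspace(0, 12, 50)
print(features(t_test) @ w)                                # predictions on new inputs
```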
arXiv Detail & Related papers (2021-06-10T17:55:13Z) - Hard Encoding of Physics for Learning Spatiotemporal Dynamics [8.546520029145853]
We propose a deep learning architecture that forcibly encodes known physics to facilitate learning in a data-driven manner.
The coercive encoding mechanism of physics, which is fundamentally different from penalty-based physics-informed learning, ensures that the network rigorously obeys the given physics.
arXiv Detail & Related papers (2021-05-02T21:40:39Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
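A minimal sketch of the underlying idea of a distribution over ODE dynamics: a hand-coded damped oscillator with randomly drawn latent parameters stands in for a learned neural vector field. The parameter prior and the forward-Euler integrator are assumptions, not the NDP construction.

```python
# Minimal sketch: sample trajectories from a distribution over ODE dynamics by
# drawing latent parameters of a damped oscillator and integrating with Euler.
import numpy as np

rng = np.random.default_rng(0)

def sample_trajectory(x0, t_grid, rng):
    damping = rng.normal(0.1, 0.02)   # hypothetical prior over latent dynamics
    freq = rng.normal(1.0, 0.1)
    xs = [np.asarray(x0, dtype=float)]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        x, v = xs[-1]
        dt = t1 - t0
        # Forward-Euler step of dx/dt = v, dv/dt = -freq**2 * x - damping * v.
        xs.append(np.array([x + dt * v, v + dt * (-freq**2 * x - damping * v)]))
    return np.stack(xs)

t = np.linspace(0.0, 10.0, 200)
samples = np.stack([sample_trajectory([1.0, 0.0], t, rng) for _ in range(5)])
print(samples.shape)  # (5, 200, 2): five trajectories under different latent dynamics
```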
arXiv Detail & Related papers (2021-03-23T09:32:06Z)