Symplectic Gaussian Process Regression of Hamiltonian Flow Maps
- URL: http://arxiv.org/abs/2009.05569v1
- Date: Fri, 11 Sep 2020 17:56:35 GMT
- Title: Symplectic Gaussian Process Regression of Hamiltonian Flow Maps
- Authors: Katharina Rath, Christopher G. Albert, Bernd Bischl, Udo von Toussaint
- Abstract summary: We present an approach to construct appropriate and efficient emulators for Hamiltonian flow maps.
Intended future applications are long-term tracing of fast charged particles in accelerators and magnetic plasma confinement.
- Score: 0.8029049649310213
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We present an approach to construct appropriate and efficient emulators for
Hamiltonian flow maps. Intended future applications are long-term tracing of
fast charged particles in accelerators and magnetic plasma confinement
configurations. The method is based on multi-output Gaussian process regression
on scattered training data. To obtain long-term stability the symplectic
property is enforced via the choice of the matrix-valued covariance function.
Based on earlier work on spline interpolation we observe derivatives of the
generating function of a canonical transformation. A product kernel produces an
accurate implicit method, whereas a sum kernel results in a fast explicit
method from this approach. Both correspond to a symplectic Euler method in
terms of numerical integration. These methods are applied to the pendulum and
the Hénon-Heiles system and results are compared to a spectral regression with
orthogonal polynomials. In the limit of small mapping times, the Hamiltonian
function can be identified with a part of the generating function and thereby
learned from observed time-series data of the system's evolution. Besides
comparable performance of implicit kernel and spectral regression for
symplectic maps, we demonstrate a substantial increase in performance for
learning the Hamiltonian function compared to existing approaches.
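The sum-kernel variant corresponds to the explicit symplectic Euler integrator mentioned above. As a minimal illustration of why that structure yields long-term stability (plain numerical integration of the pendulum, not the paper's GP-based emulator):

```python
import numpy as np

def symplectic_euler(q, p, dt, dHdq, dHdp):
    """One explicit symplectic Euler step: update p using the old q, then q using the new p."""
    p_new = p - dt * dHdq(q)
    q_new = q + dt * dHdp(p_new)
    return q_new, p_new

# Pendulum: H(q, p) = p**2 / 2 - cos(q)
dHdq = lambda q: np.sin(q)
dHdp = lambda p: p

q, p = 1.0, 0.0
H0 = p**2 / 2 - np.cos(q)
for _ in range(10_000):           # 100 time units at dt = 0.01
    q, p = symplectic_euler(q, p, 0.01, dHdq, dHdp)
H = p**2 / 2 - np.cos(q)
# The energy error stays bounded (no secular drift), the hallmark of a symplectic method.
```

A non-symplectic explicit Euler step with the same step size would instead show the energy drifting monotonically over this horizon.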
Related papers
- Learning dissipative Hamiltonian dynamics with reproducing kernel Hilbert spaces and random Fourier features [0.7510165488300369]
This paper presents a new method for learning dissipative Hamiltonian dynamics from a limited and noisy dataset.
The performance of the method is validated in simulations for two dissipative Hamiltonian systems.
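The random Fourier feature construction this paper builds on can be sketched generically. The following is a standard RBF-kernel approximation (not the paper's dissipative-dynamics method), with the lengthscale and feature count chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features=2000, lengthscale=1.0):
    """Random Fourier features z(x) with E[z(x) @ z(y)] = exp(-|x - y|^2 / (2 l^2))."""
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)              # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(5, 2))
Z = rff_features(X)
K_approx = Z @ Z.T                                  # inner products of features
sq = ((X[:, None, :] - X[None, :, :])**2).sum(-1)   # squared pairwise distances
K_exact = np.exp(-sq / 2.0)                         # exact RBF kernel matrix
max_err = np.abs(K_approx - K_exact).max()          # shrinks like O(1/sqrt(n_features))
```

Feature maps like this turn kernel regression into linear regression in the feature space, which is what makes the approach tractable on limited, noisy data.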
arXiv Detail & Related papers (2024-10-24T11:35:39Z)
- A Structure-Preserving Kernel Method for Learning Hamiltonian Systems [3.594638299627404]
A structure-preserving kernel ridge regression method is presented that allows the recovery of potentially high-dimensional and nonlinear Hamiltonian functions.
The paper extends kernel regression methods to problems in which loss functions involving linear functions of gradients are required.
A full error analysis is conducted that provides convergence rates using fixed and adaptive regularization parameters.
arXiv Detail & Related papers (2024-03-15T07:20:21Z)
- Stochastic Gradient Descent for Gaussian Processes Done Right [86.83678041846971]
We show that when done right -- by which we mean using specific insights from the optimisation and kernel communities -- gradient descent is highly effective.
We introduce a stochastic dual descent algorithm, explain its design in an intuitive manner and illustrate the design choices.
Our method places Gaussian process regression on par with state-of-the-art graph neural networks for molecular binding affinity prediction.
arXiv Detail & Related papers (2023-10-31T16:15:13Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Adjoint-aided inference of Gaussian process driven differential equations [0.8257490175399691]
We show how the adjoint of a linear system can be used to efficiently infer forcing functions modelled as GPs.
We demonstrate the approach on systems of both ordinary and partial differential equations.
arXiv Detail & Related papers (2022-02-09T17:35:14Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce variational integrators, which allow us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of invariance under coarse-graining.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
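For context, GP regression with derivative observations (the problem SLEIPNIR accelerates) can be written exactly with the joint value/derivative kernel. A minimal 1-D sketch with a unit-lengthscale squared-exponential kernel follows; the quadrature Fourier feature approximation itself is not shown, and the target function is an arbitrary choice:

```python
import numpy as np

# Squared-exponential kernel (unit lengthscale) and its derivatives.
def D(x, y):     return np.subtract.outer(x, y)
def k(x, y):     return np.exp(-0.5 * D(x, y)**2)                     # Cov(f(x), f(y))
def kdy(x, y):   return D(x, y) * np.exp(-0.5 * D(x, y)**2)           # Cov(f(x), f'(y))
def kdxdy(x, y): return (1 - D(x, y)**2) * np.exp(-0.5 * D(x, y)**2)  # Cov(f'(x), f'(y))

X = np.linspace(0.0, np.pi, 5)
y = np.concatenate([np.sin(X), np.cos(X)])   # observed values and derivatives of sin

# Joint covariance over [values; derivatives] at the training inputs (plus jitter).
K = np.block([[k(X, X),    kdy(X, X)],
              [-kdy(X, X), kdxdy(X, X)]]) + 1e-10 * np.eye(2 * len(X))

xs = np.array([1.0])
ks = np.hstack([k(xs, X), kdy(xs, X)])       # cross-covariance for a value prediction
pred = float(ks @ np.linalg.solve(K, y))     # posterior mean, close to sin(1.0)
```

The exact approach costs O(n^3) in the number of observations, which is what motivates deterministic feature expansions with provable error bounds.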
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
- Analysis of Bayesian Inference Algorithms by the Dynamical Functional Approach [2.8021833233819486]
We analyze an algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario.
For the case of perfect data-model matching, the knowledge of static order parameters derived from the replica method allows us to obtain efficient algorithmic updates.
arXiv Detail & Related papers (2020-01-14T17:22:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.