Learning Hamiltonian Dynamics with Reproducing Kernel Hilbert Spaces and Random Features
- URL: http://arxiv.org/abs/2404.07703v1
- Date: Thu, 11 Apr 2024 12:49:30 GMT
- Title: Learning Hamiltonian Dynamics with Reproducing Kernel Hilbert Spaces and Random Features
- Authors: Torbjørn Smith, Olav Egeland
- Abstract summary: A method for learning Hamiltonian dynamics from a limited and noisy dataset is proposed.
The method learns a Hamiltonian vector field on a reproducing kernel Hilbert space (RKHS) of inherently Hamiltonian vector fields.
It is demonstrated that the use of an odd symplectic kernel improves prediction accuracy.
- Score: 0.7510165488300369
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A method for learning Hamiltonian dynamics from a limited and noisy dataset is proposed. The method learns a Hamiltonian vector field on a reproducing kernel Hilbert space (RKHS) of inherently Hamiltonian vector fields, and in particular, odd Hamiltonian vector fields. This is done with a symplectic kernel, and it is shown how the kernel can be modified to an odd symplectic kernel to impose the odd symmetry. A random feature approximation is developed for the proposed kernel to reduce the problem size. This includes random feature approximations for odd kernels. The performance of the method is validated in simulations for three Hamiltonian systems. It is demonstrated that the use of an odd symplectic kernel improves prediction accuracy, and that the learned vector fields are Hamiltonian and exhibit the imposed odd symmetry characteristics.
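The construction described in the abstract (a vector field that is Hamiltonian by design, with the Hamiltonian represented through random features and the weights fit by regularized least squares) can be sketched as follows. This is a minimal illustration only: it uses plain random Fourier features for a Gaussian kernel and a toy harmonic-oscillator dataset, it does not reproduce the paper's symplectic or odd symplectic kernels, and the bandwidth, feature count, and regularization values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Phase space (q, p) in 2D with the canonical symplectic matrix J.
d = 2
J = np.array([[0.0, 1.0], [-1.0, 0.0]])

# Random Fourier features approximating a Gaussian kernel
# (bandwidth sigma and feature count D are illustrative choices).
D = 200
sigma = 1.0
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(x):
    # phi_i(x) = sqrt(2/D) * cos(w_i . x + b_i); the learned Hamiltonian
    # is H(x) = features(x) @ alpha.
    return np.sqrt(2.0 / D) * np.cos(x @ W.T + b)

def feature_grad(x):
    # Gradient of each feature with respect to x: shape (n, D, d).
    s = -np.sqrt(2.0 / D) * np.sin(x @ W.T + b)   # (n, D)
    return s[:, :, None] * W[None, :, :]          # (n, D, d)

# Toy data: noisy samples of the harmonic-oscillator field
# f(x) = J @ grad H(x) with H(x) = 0.5 * |x|^2.
X = rng.normal(size=(100, d))
F = X @ J.T + 0.01 * rng.normal(size=(100, d))

# The model field f_hat(x) = J @ grad(features(x) @ alpha) is linear in
# alpha, so stack one row per (sample, component) and solve a ridge problem.
M = np.einsum('aj,nij->nai', J, feature_grad(X))  # (n, d, D)
A = M.reshape(-1, D)
y = F.reshape(-1)
lam = 1e-3
alpha = np.linalg.solve(A.T @ A + lam * np.eye(D), A.T @ y)

def f_hat(x):
    # The learned field is exactly J @ grad H by construction, hence Hamiltonian.
    grad_H = np.einsum('i,nij->nj', alpha, feature_grad(x))
    return grad_H @ J.T
```

Because `f_hat` is J applied to an exact gradient, the learned field is Hamiltonian for any weight vector `alpha`; the paper's symplectic kernel achieves the analogous guarantee directly at the kernel level, and its odd variant additionally enforces the odd symmetry f(-x) = -f(x), which this sketch does not.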
Related papers
- Learning of Hamiltonian Dynamics with Reproducing Kernel Hilbert Spaces [0.844067337858849]
This paper presents a method for learning Hamiltonian dynamics from a limited set of data points.
It is shown that the learned dynamics are Hamiltonian, and that the learned Hamiltonian vector field can be prescribed to be odd or even.
arXiv Detail & Related papers (2023-12-15T12:19:48Z)
- Wiener Chaos in Kernel Regression: Towards Untangling Aleatoric and Epistemic Uncertainty [0.0]
We consider kernel ridge regression with additive i.i.d. non-Gaussian measurement noise.
Using a system as a numerical example, we show that our approach allows us to untangle the effects of epistemic and aleatoric uncertainties.
arXiv Detail & Related papers (2023-12-12T16:02:35Z)
- Vectorization of the density matrix and quantum simulation of the von Neumann equation of time-dependent Hamiltonians [65.268245109828]
We develop a general framework to linearize the von Neumann equation, rendering it in a form suitable for quantum simulation.
We show that one of these linearizations of the von Neumann equation corresponds to the standard case in which the state vector becomes the column-stacked elements of the density matrix.
A quantum algorithm to simulate the dynamics of the density matrix is proposed.
arXiv Detail & Related papers (2023-06-14T23:08:51Z)
- Topological spin excitations in non-Hermitian spin chains with a generalized kernel polynomial algorithm [1.054316838380053]
We show a numerical approach to compute spectral functions of a non-Hermitian many-body system.
We show that the local spectral functions reveal topological spin excitations in a non-Hermitian spin model.
Our method offers an efficient way to compute local spectral functions in non-Hermitian quantum many-body models.
arXiv Detail & Related papers (2022-08-12T18:00:07Z)
- NeuralEF: Deconstructing Kernels by Deep Neural Networks [47.54733625351363]
Traditional nonparametric solutions based on the Nyström formula suffer from scalability issues.
Recent work has resorted to a parametric approach, i.e., training neural networks to approximate the eigenfunctions.
We show that these problems can be fixed by using a new series of objective functions that generalize to the space of supervised and unsupervised learning problems.
arXiv Detail & Related papers (2022-04-30T05:31:07Z)
- Hamiltonian simulation with random inputs [74.82351543483588]
We develop a theory of the average-case performance of Hamiltonian simulation with random initial states.
Numerical evidence suggests that this theory accurately characterizes the average error for concrete models.
arXiv Detail & Related papers (2021-11-08T19:08:42Z)
- Nuclei with up to $\boldsymbol{A=6}$ nucleons with artificial neural network wave functions [52.77024349608834]
We use artificial neural networks to compactly represent the wave functions of nuclei.
We benchmark their binding energies, point-nucleon densities, and radii with the highly accurate hyperspherical harmonics method.
arXiv Detail & Related papers (2021-08-15T23:02:39Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- High-Dimensional Gaussian Process Inference with Derivatives [90.8033626920884]
We show that in the low-data regime $N<D$, the Gram matrix can be decomposed in a manner that reduces the cost of inference to $\mathcal{O}(N^2D + (N^2)^3)$.
We demonstrate this potential in a variety of tasks relevant for machine learning, such as optimization and Hamiltonian Monte Carlo with predictive gradients.
arXiv Detail & Related papers (2021-02-15T13:24:41Z)
- Learning interaction kernels in mean-field equations of 1st-order systems of interacting particles [1.776746672434207]
We introduce a nonparametric algorithm to learn interaction kernels of mean-field equations for 1st-order systems of interacting particles.
By least squares with regularization, the algorithm learns the kernel efficiently on data-adaptive hypothesis spaces.
arXiv Detail & Related papers (2020-10-29T15:37:17Z)
- Symplectic Gaussian Process Regression of Hamiltonian Flow Maps [0.8029049649310213]
We present an approach to construct appropriate and efficient emulators for Hamiltonian flow maps.
Intended future applications are long-term tracing of fast charged particles in accelerators and magnetic plasma confinement.
arXiv Detail & Related papers (2020-09-11T17:56:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.