Deep Learning based discovery of Integrable Systems
- URL: http://arxiv.org/abs/2503.10469v2
- Date: Sun, 16 Mar 2025 15:48:53 GMT
- Title: Deep Learning based discovery of Integrable Systems
- Authors: Shailesh Lal, Suvajit Majumder, Evgeny Sobko
- Abstract summary: We introduce a novel machine-learning-based framework for discovering integrable models. Our approach first employs a synchronized ensemble of neural networks to find a high-precision numerical solution to the Yang-Baxter equation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a novel machine-learning-based framework for discovering integrable models. Our approach first employs a synchronized ensemble of neural networks to find a high-precision numerical solution to the Yang-Baxter equation within a specified class. Then, using an auxiliary system of algebraic equations, [Q_2, Q_3] = 0, and the numerical value of the Hamiltonian obtained via deep learning as a seed, we reconstruct the entire Hamiltonian family, forming an algebraic variety. We illustrate our approach with three- and four-dimensional spin chains of difference form with local interactions. Remarkably, all discovered Hamiltonian families form rational varieties.
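The abstract describes two computational stages: minimizing the residual of the Yang-Baxter equation, then imposing the commuting-charges condition [Q_2, Q_3] = 0. As a minimal sketch of both conditions (using the well-known two-dimensional XXX chain with the rational difference-form solution R(u) = u·1 + P and local density h = P as a stand-in, not the paper's three- and four-dimensional ansatz), they can be checked numerically:

```python
import numpy as np

# --- Stage 1: the Yang-Baxter residual the network ensemble minimizes ---

# Permutation (SWAP) operator on C^2 ⊗ C^2.
P = np.zeros((4, 4))
for i in range(2):
    for j in range(2):
        P[2 * i + j, 2 * j + i] = 1.0

def R(u):
    # Known rational, difference-form solution (XXX chain): R(u) = u*1 + P.
    return u * np.eye(4) + P

I2 = np.eye(2)
def R12(u): return np.kron(R(u), I2)   # acts on sites (1,2) of C^2⊗C^2⊗C^2
def R23(u): return np.kron(I2, R(u))   # acts on sites (2,3)
P23 = np.kron(I2, P)                   # swaps tensor factors 2 and 3
def R13(u): return P23 @ R12(u) @ P23  # conjugation moves R12 to sites (1,3)

def ybe_residual(u, v):
    # || R12(u-v) R13(u) R23(v) - R23(v) R13(u) R12(u-v) ||
    lhs = R12(u - v) @ R13(u) @ R23(v)
    rhs = R23(v) @ R13(u) @ R12(u - v)
    return np.linalg.norm(lhs - rhs)

# --- Stage 2: the commuting-charges condition [Q2, Q3] = 0 ---

def swap_sites(a, b, L):
    """Permutation operator exchanging sites a and b on a chain of L qubits."""
    dim = 2 ** L
    M = np.zeros((dim, dim))
    for s in range(dim):
        ba, bb = (s >> a) & 1, (s >> b) & 1
        t = (s & ~(1 << a) & ~(1 << b)) | (bb << a) | (ba << b)
        M[t, s] = 1.0
    return M

L = 5  # periodic chain of 5 sites
h = [swap_sites(j, (j + 1) % L, L) for j in range(L)]  # densities h_{j,j+1} = P
Q2 = sum(h)
Q3 = sum(h[j] @ h[(j + 1) % L] - h[(j + 1) % L] @ h[j] for j in range(L))

print(ybe_residual(0.7, 0.3), np.linalg.norm(Q2 @ Q3 - Q3 @ Q2))  # both ~0
```

Here Q_3 = Σ_j [h_{j,j+1}, h_{j+1,j+2}] follows the standard boost construction for the higher charge; in the paper's framework the Hamiltonian density seeding the algebraic system comes from the trained networks rather than a known solution.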
Related papers
- Novel approach of exploring ASEP-like models through the Yang-Baxter Equation [49.1574468325115]
An ansatz for the Yang-Baxter equation inspired by the Bethe-ansatz treatment of the ASEP spin model.
Various classes of Hamiltonian densities arising from two types of R-matrices are found, which also appear as solutions of the constant YBE.
A summary of finalised results reveals general non-Hermitian spin-1/2 chain models.
arXiv Detail & Related papers (2024-03-05T17:52:20Z) - Separable Hamiltonian Neural Networks [1.8674308456443722]
Hamiltonian neural networks (HNNs) are state-of-the-art models that regress the vector field of a dynamical system.
We propose separable HNNs that embed additive separability within HNNs using observational, learning, and inductive biases.
arXiv Detail & Related papers (2023-09-03T03:54:43Z) - The R-mAtrIx Net [0.0]
We provide a novel Neural Network architecture that can output the R-matrix for a given quantum integrable spin chain.
We also explore the space of Hamiltonians around already learned models and reconstruct the family of integrable spin chains which they belong to.
arXiv Detail & Related papers (2023-04-14T16:50:42Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Solving quantum dynamics with a Lie algebra decoupling method [0.0]
We present a pedagogical introduction to solving the dynamics of quantum systems by the use of a Lie algebra decoupling theorem.
As background, we include an overview of Lie groups and Lie algebras aimed at a general physicist audience.
We prove the theorem and apply it to three well-known examples of linear and quadratic Hamiltonians that frequently appear in quantum optics and related fields.
arXiv Detail & Related papers (2022-10-21T11:44:24Z) - Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates and noisy, irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z) - Proofs of network quantum nonlocality aided by machine learning [68.8204255655161]
We show that the family of quantum triangle distributions of [DOI40103/PhysRevLett.123.140] does not admit triangle-local models over a larger parameter range than the original proof established.
We produce a large collection of network Bell inequalities for the triangle scenario with binary outcomes, which are of independent interest.
arXiv Detail & Related papers (2022-03-30T18:00:00Z) - Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z) - Learning Hamiltonians of constrained mechanical systems [0.0]
Hamiltonian systems are an elegant and compact formalism in classical mechanics.
We propose new approaches for the accurate approximation of the Hamiltonian function of constrained mechanical systems.
arXiv Detail & Related papers (2022-01-31T14:03:17Z) - Symplectic Learning for Hamiltonian Neural Networks [0.0]
Hamiltonian Neural Networks (HNNs) took a first step towards a unified "gray box" approach.
We exploit the symplectic structure of Hamiltonian systems with a different loss function.
We mathematically guarantee the existence of an exact Hamiltonian function which the HNN can learn.
arXiv Detail & Related papers (2021-06-22T13:33:12Z) - Stochastic Hamiltonian Gradient Methods for Smooth Games [51.47367707388402]
We focus on the class of Hamiltonian methods and provide the first convergence guarantees for certain classes of smooth games.
Using tools from the optimization literature we show that SHGD converges linearly to a neighbourhood of a stationary point.
Our results provide the first global non-asymptotic last-iterate convergence guarantees for the class of general games.
arXiv Detail & Related papers (2020-07-08T15:42:13Z) - Hamiltonian neural networks for solving equations of motion [3.1498833540989413]
We present a Hamiltonian neural network that solves the differential equations that govern dynamical systems.
A symplectic Euler integrator requires two orders of magnitude more evaluation points than the Hamiltonian network to achieve the same order of numerical error.
arXiv Detail & Related papers (2020-01-29T21:48:35Z)
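For context on the symplectic-Euler baseline cited in the last entry, a minimal sketch (on the unit-frequency harmonic oscillator H = (p^2 + q^2)/2, a textbook stand-in chosen here, not necessarily the paper's test system) shows the integrator's first-order accuracy and characteristic bounded energy error:

```python
import numpy as np

def symplectic_euler(q, p, h, n_steps):
    """Symplectic Euler for H = (p^2 + q^2)/2: kick in p, then drift in q."""
    for _ in range(n_steps):
        p = p - h * q  # p_{n+1} = p_n - h * dH/dq
        q = q + h * p  # q_{n+1} = q_n + h * p_{n+1}
    return q, p

h = 0.01
T = 2 * np.pi  # one full period of the oscillator
q, p = symplectic_euler(1.0, 0.0, h, int(round(T / h)))

# After one period the trajectory error scales like h, while the energy
# (p^2 + q^2)/2 stays close to its initial value 0.5 (no secular drift).
print(abs(q - 1.0), abs(0.5 * (p**2 + q**2) - 0.5))
```

The bounded energy oscillation (rather than drift) is the hallmark of symplectic integrators; the quoted comparison is about how many such evaluation points are needed to match the Hamiltonian network's error.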
This list is automatically generated from the titles and abstracts of the papers in this site.