Investigating Topological Order using Recurrent Neural Networks
- URL: http://arxiv.org/abs/2303.11207v3
- Date: Wed, 25 Oct 2023 23:32:02 GMT
- Title: Investigating Topological Order using Recurrent Neural Networks
- Authors: Mohamed Hibat-Allah, Roger G. Melko, Juan Carrasquilla
- Abstract summary: We employ 2D RNNs to investigate two prototypical quantum many-body Hamiltonians exhibiting topological order.
Specifically, we demonstrate that RNN wave functions can effectively capture the topological order of the toric code and a Bose-Hubbard spin liquid on the kagome lattice.
- Score: 0.7234862895932991
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recurrent neural networks (RNNs), originally developed for natural language
processing, hold great promise for accurately describing strongly correlated
quantum many-body systems. Here, we employ 2D RNNs to investigate two
prototypical quantum many-body Hamiltonians exhibiting topological order.
Specifically, we demonstrate that RNN wave functions can effectively capture
the topological order of the toric code and a Bose-Hubbard spin liquid on the
kagome lattice by estimating their topological entanglement entropies. We also
find that RNNs favor coherent superpositions of minimally-entangled states over
minimally-entangled states themselves. Overall, our findings demonstrate that
RNN wave functions constitute a powerful tool to study phases of matter beyond
Landau's symmetry-breaking paradigm.
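To make the ansatz concrete, below is a minimal sketch of a positive RNN wave function, assuming a plain 1D tanh cell instead of the 2D cells used in the paper; all names (rnn_step, conditional, sample_amplitude) and hyperparameters are illustrative, not taken from the paper. The amplitude is built autoregressively as $\psi(s) = \prod_n \sqrt{P(s_n \mid s_{<n})}$, so configurations can be sampled exactly.
```python
# Minimal sketch of a positive 1D RNN wave function (NumPy). The paper uses
# 2D RNN cells; this 1D tanh cell and all names here are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

N, D, H = 16, 2, 32                 # sites, local dim (spin-1/2), hidden units
W = rng.normal(0, 0.1, (H, H))      # hidden-to-hidden weights
U = rng.normal(0, 0.1, (H, D))      # input-to-hidden weights
b = np.zeros(H)
V = rng.normal(0, 0.1, (D, H))      # hidden-to-logits weights
c = np.zeros(D)

def rnn_step(h, x):
    """One recurrence: h_n = tanh(W h_{n-1} + U x_{n-1} + b)."""
    return np.tanh(W @ h + U @ x + b)

def conditional(h):
    """Softmax conditional P(s_n | s_<n) computed from the hidden state."""
    logits = V @ h + c
    e = np.exp(logits - logits.max())
    return e / e.sum()

def sample_amplitude():
    """Sample s autoregressively; return (s, psi(s)) with
    psi(s) = prod_n sqrt(P(s_n | s_<n)) for a positive wave function."""
    h, x = np.zeros(H), np.zeros(D)
    s, amp = [], 1.0
    for _ in range(N):
        h = rnn_step(h, x)
        p = conditional(h)
        sn = rng.choice(D, p=p)     # draw the next spin from the conditional
        s.append(sn)
        amp *= np.sqrt(p[sn])
        x = np.eye(D)[sn]           # feed the sampled spin back in
    return np.array(s), amp

s, psi = sample_amplitude()
print(s, psi)                        # one exact sample and its amplitude
```
Because sampling is exact, Rényi entanglement entropies of subregions can be estimated by Monte Carlo, and a topological contribution extracted through a Kitaev-Preskill-style combination such as $S_{\mathrm{topo}} = S_A + S_B + S_C - S_{AB} - S_{BC} - S_{AC} + S_{ABC}$; the precise subregion scheme and entropy estimator used in the paper may differ.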
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction (a minimal spectral-layer sketch follows this list).
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Neural network approach to quasiparticle dispersions in doped antiferromagnets [0.0]
We study the ability of neural quantum states to represent the bosonic and fermionic $t$-$J$ model on different 1D and 2D lattices.
We present a method to calculate dispersion relations from the neural network state representation.
arXiv Detail & Related papers (2023-10-12T17:59:33Z) - Universal Approximation and the Topological Neural Network [0.0]
A topological neural network (TNN) takes data from a Tychonoff topological space instead of the usual finite dimensional space.
A distributional neural network (DNN) that takes Borel measures as data is also introduced.
arXiv Detail & Related papers (2023-05-26T05:28:10Z) - Wavelet Neural Networks versus Wavelet-based Neural Networks [0.0]
We introduce a new type of neural networks (NNs) -- wavelet-based neural networks (WBNNs) -- and study their properties and potential for applications.
We show that WBNNs vastly outperform the existing type of wavelet neural networks (WNNs)
arXiv Detail & Related papers (2022-11-01T11:41:19Z) - Supplementing Recurrent Neural Network Wave Functions with Symmetry and Annealing to Improve Accuracy [0.7234862895932991]
Recurrent neural networks (RNNs) are a class of neural networks that have emerged from the paradigm of artificial intelligence.
We show that our method is superior to Density Matrix Renormalisation Group (DMRG) for system sizes larger than or equal to $14 \times 14$ on the triangular lattice.
arXiv Detail & Related papers (2022-07-28T18:00:03Z) - The Spectral Bias of Polynomial Neural Networks [63.27903166253743]
Polynomial neural networks (PNNs) have been shown to be particularly effective at image generation and face recognition, where high-frequency information is critical.
Previous studies have revealed that neural networks demonstrate a $\textit{spectral bias}$ towards low-frequency functions, which yields faster learning of low-frequency components during training.
Inspired by such studies, we conduct a spectral analysis of the Neural Tangent Kernel (NTK) of PNNs.
We find that the $\Pi$-Net family, i.e., a recently proposed parametrization of PNNs, speeds up the learning of higher frequencies.
arXiv Detail & Related papers (2022-02-27T23:12:43Z) - Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We apply QNNs with tree tensor and step controlled structures to binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z) - Variational Monte Carlo calculations of $\mathbf{A\leq 4}$ nuclei with an artificial neural-network correlator ansatz [62.997667081978825]
We introduce a neural-network quantum state ansatz to model the ground-state wave function of light nuclei.
We compute the binding energies and point-nucleon densities of $A \leq 4$ nuclei as emerging from a leading-order pionless effective field theory Hamiltonian.
arXiv Detail & Related papers (2020-07-28T14:52:28Z) - Deep Neural Networks as the Semi-classical Limit of Quantum Neural Networks [0.0]
Quantum Neural Networks (QNN) can be mapped onto spin networks.
Deep Neural Networks (DNN) are a subcase of QNN.
A number of Machine Learning (ML) key concepts can be rephrased using the terminology of Topological Quantum Field Theories (TQFT)
arXiv Detail & Related papers (2020-06-30T22:47:26Z) - Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT)
Experiments with critical sampling, a pendulum and a six DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
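As referenced in the Fourier Neural Operator entry above, the sketch below shows the core building block of an FNO: a spectral convolution that transforms the input to Fourier space, keeps a truncated set of modes, multiplies them by learned complex weights, and transforms back. Names and shapes (spectral_conv, n_modes) are illustrative assumptions, not the cited paper's implementation.
```python
# Minimal sketch of a 1D FNO spectral convolution (NumPy); names and shapes
# are illustrative, not the cited paper's implementation.
import numpy as np

rng = np.random.default_rng(1)

def spectral_conv(u, weights, n_modes):
    """FFT -> truncate to n_modes -> multiply by learned complex weights -> inverse FFT.
    u: real signal of shape (n_grid,); weights: complex array of shape (n_modes,)."""
    u_hat = np.fft.rfft(u)                          # Fourier coefficients
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]   # keep only the low modes
    return np.fft.irfft(out_hat, n=u.shape[0])

n_grid, n_modes = 64, 8
weights = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)
u = np.sin(np.linspace(0, 2 * np.pi, n_grid, endpoint=False))
v = spectral_conv(u, weights, n_modes)              # one spectral layer applied to u
print(v.shape)                                      # (64,)
```
A full FNO layer adds a pointwise linear term and a nonlinearity on top of this spectral convolution; applying the learned operator to a compact set of observables rather than the full $2^n$-dimensional state is how the cited work sidesteps the exponential cost.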