Two mass-imbalanced atoms in a hard-wall trap: Deep learning
integrability of many-body systems
- URL: http://arxiv.org/abs/2402.16244v1
- Date: Mon, 26 Feb 2024 02:09:00 GMT
- Title: Two mass-imbalanced atoms in a hard-wall trap: Deep learning
integrability of many-body systems
- Authors: Liheng Lang and Qichen Lu and C. M. Dai and Xingbo Wei and Yanxia Liu
and Yunbo Zhang
- Abstract summary: We build a convolutional neural network to identify the transition points between integrable and non-integrable systems.
A striking example of the network's ability is the identification of a new integrable mass ratio $1/3$ by learning from the known integrable case of equal mass.
The robustness of our neural networks is further enhanced by adversarial learning, where samples are generated by standard and quantum perturbations mixed into the probability density images and the wavefunctions, respectively.
- Score: 2.988602253341921
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The study of integrable systems has led to significant advancements in our
understanding of many-body physics. We design a series of numerical experiments
to analyze the integrability of a mass-imbalanced two-body system through
energy level statistics and deep learning of wavefunctions. The level spacing
distributions are fitted by a Brody distribution and the fitting parameter
$\omega$ is found to separate the integrable and non-integrable mass ratios by
a critical line $\omega=0$. The convolutional neural network built from the
probability density images could identify the transition points between
integrable and non-integrable systems with high accuracy, and in much shorter
computation time. A striking example of the network's ability is its
identification of a new integrable mass ratio, $1/3$, learned from the known
integrable case of equal mass, with a network confidence of $98.78\%$. The robustness
of our neural networks is further enhanced by adversarial learning, where
samples are generated by standard and quantum perturbations mixed in the
probability density images and the wavefunctions, respectively.
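The Brody distribution mentioned in the abstract interpolates between Poisson statistics ($\omega = 0$, integrable) and the Wigner surmise ($\omega = 1$, chaotic). As a minimal stdlib-Python sketch (not the authors' code), the distribution one would fit to the unfolded level spacings is:

```python
import math

def brody_pdf(s, omega):
    """Brody level-spacing distribution with unit mean spacing:
        P(s) = (omega + 1) * b * s**omega * exp(-b * s**(omega + 1)),
        b = Gamma((omega + 2) / (omega + 1)) ** (omega + 1).
    omega = 0 recovers Poisson, exp(-s) (integrable);
    omega = 1 recovers the Wigner surmise, (pi/2) s exp(-pi s^2 / 4) (chaotic)."""
    b = math.gamma((omega + 2.0) / (omega + 1.0)) ** (omega + 1.0)
    return (omega + 1.0) * b * s**omega * math.exp(-b * s ** (omega + 1.0))
```

Fitting $\omega$ to a histogram of spacings (e.g. by least squares or maximum likelihood) then locates the critical line $\omega = 0$ separating integrable from non-integrable mass ratios.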
Related papers
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variates functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
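The control-variate idea behind this line of work can be illustrated with plain Monte Carlo (a stdlib-Python sketch, not the paper's neural method): subtract a correlated function whose integral is known in closed form, then add that integral back. The paper's twist is to represent the anti-derivative with a network so the control variate's integral is exact by construction.

```python
import math
import random
import statistics

random.seed(0)

def f(x):
    return math.exp(x)      # integrand; true integral on [0, 1] is e - 1

def g(x):
    return 1.0 + x          # control variate correlated with f; known integral 3/2

def estimate(n=20000):
    xs = [random.random() for _ in range(n)]
    plain = [f(x) for x in xs]                 # vanilla Monte Carlo samples
    cv = [f(x) - g(x) + 1.5 for x in xs]       # control-variate samples
    return (statistics.fmean(plain), statistics.fmean(cv),
            statistics.variance(plain), statistics.variance(cv))

plain_mean, cv_mean, plain_var, cv_var = estimate()
# Both estimators target e - 1; the control-variate samples have lower variance
# because g absorbs most of f's variation.
```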
arXiv Detail & Related papers (2024-09-23T06:04:28Z) - Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Deep Neural Networks as Variational Solutions for Correlated Open
Quantum Systems [0.0]
We show that parametrizing the density matrix directly with more powerful models can yield better variational ansatz functions.
We present results for the dissipative one-dimensional transverse-field Ising model and a two-dimensional dissipative Heisenberg model.
arXiv Detail & Related papers (2024-01-25T13:41:34Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
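As a toy illustration of the classical-shadow ingredient (stdlib Python, a single qubit prepared in $|0\rangle$; a deliberately simplified sketch, not the paper's protocol): each shot measures in a uniformly random Pauli basis, and the single-shot estimator for $\langle Z\rangle$ is obtained by analytically inverting the measurement channel.

```python
import random

random.seed(1)

def shadow_expectation_Z(n_shots=5000):
    """Estimate <Z> of the single-qubit state |0> from random-Pauli snapshots.
    For |0>, a Z-basis shot deterministically returns +1, while X/Y-basis
    shots carry no information about Z. Inverting the measurement channel
    gives a single-shot estimate of 3 * outcome when the basis is Z and 0
    otherwise; averaging snapshots converges to the true value <Z> = 1."""
    total = 0.0
    for _ in range(n_shots):
        basis = random.choice("XYZ")
        if basis == "Z":
            total += 3.0        # outcome +1, rescaled by the inversion factor 3
        # X/Y snapshots contribute 0 to the Z estimator
    return total / n_shots

est = shadow_expectation_Z()
```

The factor of 3 is the channel inversion for random single-qubit Pauli measurements; the data-centric paradigm above trains a network on such snapshot data rather than inverting shot by shot.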
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Uncovering Local Integrability in Quantum Many-Body Dynamics [0.0]
Using up to 124 qubits of a fully programmable quantum computer, we uncover local conservation laws and integrability in one- and two-dimensional periodically-driven spin lattices.
Our results demonstrate a versatile strategy for extracting hidden dynamical structure from noisy experiments on large-scale quantum computers.
arXiv Detail & Related papers (2023-07-14T18:00:05Z) - Finding the Dynamics of an Integrable Quantum Many-Body System via
Machine Learning [0.0]
We study the dynamics of the Gaudin magnet ("central-spin model") using machine-learning methods.
Motivated in part by this intuition, we use a neural-network representation for each variational eigenstate of the model Hamiltonian.
Having an efficient description of this susceptibility opens the door to improved characterization and quantum control procedures for qubits interacting with an environment of quantum two-level systems.
arXiv Detail & Related papers (2023-07-06T21:49:01Z) - Computational Complexity of Learning Neural Networks: Smoothness and
Degeneracy [52.40331776572531]
We show that learning depth-$3$ ReLU networks under the Gaussian input distribution is hard even in the smoothed-analysis framework.
Our results are under a well-studied assumption on the existence of local pseudorandom generators.
arXiv Detail & Related papers (2023-02-15T02:00:26Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Morphology of three-body quantum states from machine learning [18.56475227525833]
We show that a triangular quantum billiard can be integrable or non-integrable.
We use machine learning tools to analyze properties of probability distributions of individual quantum states.
We find that convolutional neural networks can correctly classify integrable and non-integrable states.
arXiv Detail & Related papers (2021-02-09T17:23:08Z) - Stable Recovery of Entangled Weights: Towards Robust Identification of
Deep Neural Networks from Minimal Samples [0.0]
We introduce the so-called entangled weights, which compose weights of successive layers intertwined with suitable diagonal and invertible matrices depending on the activation functions and their shifts.
We prove that entangled weights are completely and stably approximated by an efficient and robust algorithm.
In terms of practical impact, our study shows that we can relate input-output information uniquely and stably to network parameters, providing a form of explainability.
arXiv Detail & Related papers (2021-01-18T16:31:19Z) - Neural Control Variates [71.42768823631918]
We show that a set of neural networks can face the challenge of finding a good approximation of the integrand.
We derive a theoretically optimal, variance-minimizing loss function, and propose an alternative, composite loss for stable online training in practice.
Specifically, we show that the learned light-field approximation is of sufficient quality for high-order bounces, allowing us to omit the error correction and thereby dramatically reduce the noise at the cost of negligible visible bias.
arXiv Detail & Related papers (2020-06-02T11:17:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.