Transfer learning in predicting quantum many-body dynamics: from physical observables to entanglement entropy
- URL: http://arxiv.org/abs/2405.16254v1
- Date: Sat, 25 May 2024 14:32:21 GMT
- Title: Transfer learning in predicting quantum many-body dynamics: from physical observables to entanglement entropy
- Authors: Philipp Schmidt, Florian Marquardt, Naeimeh Mohseni
- Abstract summary: We show that a neural network trained on a subset of physical observables of a many-body system partially acquires an implicit representation of the wave function.
In particular, we focus on how the pre-trained neural network can enhance the learning of entanglement entropy.
- Score: 0.6581635937019595
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Deep neural networks have demonstrated remarkable efficacy in extracting meaningful representations from complex datasets. This has established representation learning as a compelling area of research across diverse fields. One interesting open question is how beneficial representation learning can be for quantum many-body physics, with its notoriously high-dimensional state space. In this work, we showcase the capacity of a neural network trained on a subset of physical observables of a many-body system to partially acquire an implicit representation of the wave function. We illustrate this by demonstrating the effectiveness of reusing the representation learned by the neural network to enhance the learning of another quantity derived from the quantum state. In particular, we focus on how the pre-trained neural network can enhance the learning of the entanglement entropy. This is of particular interest because directly measuring entanglement in a many-body system is very challenging, while a subset of physical observables can be easily measured in experiments. We show that the pre-trained neural network learns the dynamics of the entanglement entropy with fewer resources and higher precision than direct training on the entanglement entropy.
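The abstract describes a two-stage transfer-learning scheme: first pre-train a network on easily measured physical observables, then reuse the learned representation to predict the entanglement-entropy dynamics with less data. The following is a minimal sketch of that idea, assuming a recurrent encoder, PyTorch, and random placeholder data; the paper's actual architecture, inputs, and datasets are not reproduced here.

```python
# Hypothetical sketch of the transfer-learning idea in the abstract:
# pre-train a recurrent encoder on observable dynamics, then reuse it
# to learn the entanglement entropy from a much smaller dataset.
# All shapes, hyperparameters, and data below are illustrative placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)

SEQ_LEN, N_OBS, HIDDEN = 50, 8, 64   # time steps, number of observables, hidden size

class Encoder(nn.Module):
    """Shared recurrent encoder over the drive/parameter sequence."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=HIDDEN, batch_first=True)

    def forward(self, x):              # x: (batch, SEQ_LEN, 1)
        out, _ = self.rnn(x)           # out: (batch, SEQ_LEN, HIDDEN)
        return out

encoder = Encoder()
obs_head = nn.Linear(HIDDEN, N_OBS)    # predicts observables at each time step
ent_head = nn.Linear(HIDDEN, 1)        # predicts entanglement entropy

# Placeholder data: random drive sequences and random "targets".
drive = torch.randn(256, SEQ_LEN, 1)
observables = torch.randn(256, SEQ_LEN, N_OBS)   # stand-in for measured observables
entropy = torch.rand(256, SEQ_LEN, 1)            # stand-in for entanglement entropy

loss_fn = nn.MSELoss()

# Stage 1: pre-train encoder + observable head on the observable dynamics.
opt = torch.optim.Adam(list(encoder.parameters()) + list(obs_head.parameters()), lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = loss_fn(obs_head(encoder(drive)), observables)
    loss.backward()
    opt.step()

# Stage 2: transfer -- freeze the learned representation and train only a
# small head on a much smaller entanglement-entropy dataset.
for p in encoder.parameters():
    p.requires_grad = False
opt = torch.optim.Adam(ent_head.parameters(), lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = loss_fn(ent_head(encoder(drive[:32])), entropy[:32])
    loss.backward()
    opt.step()

print("final entropy-head loss:", float(loss))
```

In this sketch the encoder is frozen in the second stage; in practice one could instead fine-tune it with a small learning rate, trading some extra compute for accuracy.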
Related papers
- Deep Quantum Graph Dreaming: Deciphering Neural Network Insights into
Quantum Experiments [0.5242869847419834]
We use a technique called inception or deep dreaming to explore what neural networks learn about quantum optics experiments.
Our story begins by training deep neural networks on the properties of quantum systems.
We find that the network can shift the initial distribution of properties of the quantum system, and we can conceptualize the learned strategies of the neural network.
arXiv Detail & Related papers (2023-09-13T16:13:54Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Deep learning of many-body observables and quantum information scrambling [0.0]
We explore how the capacity of data-driven deep neural networks in learning the dynamics of physical observables is correlated with the scrambling of quantum information.
We train a neural network to find a mapping from the parameters of a model to the evolution of observables in random quantum circuits.
We show that a particular type of recurrent neural network is extremely powerful in generalizing its predictions within the system size and time window that it has been trained on, in both the localized and scrambled regimes.
arXiv Detail & Related papers (2023-02-09T13:14:10Z) - Scalable approach to many-body localization via quantum data [69.3939291118954]
Many-body localization is a notoriously difficult phenomenon in quantum many-body physics.
We propose a flexible neural network based learning approach that circumvents any computationally expensive step.
Our approach can be applied to large-scale quantum experiments to provide new insights into quantum many-body physics.
arXiv Detail & Related papers (2022-02-17T19:00:09Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Deep Learning of Quantum Many-Body Dynamics via Random Driving [0.0]
We show the power of deep learning to predict the dynamics of a quantum many-body system.
We show the network is able to extrapolate the dynamics to times longer than those it has been trained on.
arXiv Detail & Related papers (2021-05-01T22:46:42Z) - A neural anisotropic view of underspecification in deep learning [60.119023683371736]
We show that the way neural networks handle the underspecification of problems is highly dependent on the data representation.
Our results highlight that understanding the architectural inductive bias in deep learning is fundamental to address the fairness, robustness, and generalization of these systems.
arXiv Detail & Related papers (2021-04-29T14:31:09Z) - Analyzing non-equilibrium quantum states through snapshots with
artificial neural networks [0.0]
Current quantum simulation experiments are starting to explore non-equilibrium many-body dynamics in previously inaccessible regimes.
Using machine learning techniques, we investigate the dynamics and in particular the thermalization behavior of an interacting quantum system.
A neural network is trained to distinguish non-equilibrium from thermal equilibrium data, and the network performance serves as a probe for the thermalization behavior of the system.
arXiv Detail & Related papers (2020-12-21T18:59:21Z) - Complexity for deep neural networks and other characteristics of deep
feature representations [0.0]
We define a notion of complexity, which quantifies the nonlinearity of the computation of a neural network.
We investigate these observables for trained networks and also explore their dynamics during training.
arXiv Detail & Related papers (2020-06-08T17:59:30Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z) - Machine learning transfer efficiencies for noisy quantum walks [62.997667081978825]
We show that the process of finding requirements on both the graph type and the quantum-system coherence can be automated.
The automation uses a particular type of convolutional neural network that learns with which graph and under which coherence requirements a quantum advantage is possible.
Our results are of importance for demonstration of advantage in quantum experiments and pave the way towards automating scientific research and discoveries.
arXiv Detail & Related papers (2020-01-15T18:36:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.