Fluctuation based interpretable analysis scheme for quantum many-body
snapshots
- URL: http://arxiv.org/abs/2304.06029v2
- Date: Tue, 27 Jun 2023 11:30:50 GMT
- Title: Fluctuation based interpretable analysis scheme for quantum many-body
snapshots
- Authors: Henning Schlömer, Annabelle Bohrdt
- Abstract summary: Microscopically understanding and classifying phases of matter is at the heart of strongly-correlated quantum physics.
Here, we combine confusion learning with correlation convolutional neural networks, which yields fully interpretable phase detection.
Our work opens new directions in interpretable quantum image processing that is sensitive to long-range order.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Microscopically understanding and classifying phases of matter is at the
heart of strongly-correlated quantum physics. With quantum simulations, genuine
projective measurements (snapshots) of the many-body state can be taken, which
include the full information of correlations in the system. The rise of deep
neural networks has made it possible to routinely solve abstract processing and
classification tasks of large datasets, which can act as a guiding hand for
quantum data analysis. However, though proven to be successful in
differentiating between different phases of matter, conventional neural
networks mostly lack interpretability on a physical footing. Here, we combine
confusion learning with correlation convolutional neural networks, which yields
fully interpretable phase detection in terms of correlation functions. In
particular, we study thermodynamic properties of the 2D Heisenberg model,
whereby the trained network is shown to pick up qualitative changes in the
snapshots above and below a characteristic temperature where magnetic
correlations become significantly long-range. We identify the full counting
statistics of nearest-neighbor spin correlations as the most important
quantity in the network's decision process; these statistics go beyond averages of
local observables. With access to the fluctuations of second-order correlations
-- which indirectly include contributions from higher order, long-range
correlations -- the network is able to detect changes of the specific heat and
spin susceptibility, the latter analogous to magnetic properties of the
pseudogap phase in high-temperature superconductors. By combining the confusion
learning scheme with transformer neural networks, our work opens new directions
in interpretable quantum image processing that is sensitive to long-range order.
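The confusion learning scheme underlying the paper can be illustrated with a minimal toy model: for each trial critical point, samples are deliberately labeled by which side of the trial point they fall on, a classifier is retrained, and the accuracy curve takes a W shape whose central peak marks the true transition. Everything below is an illustrative stand-in and not the paper's pipeline: a scalar Gaussian feature replaces the spin snapshots, and a tiny logistic classifier replaces the correlation convolutional network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for snapshots: one scalar feature per sample whose
# distribution shifts at the hidden transition t* = 0.5.
ts = np.linspace(0.0, 1.0, 21)      # tuning parameter (e.g. temperature)
n_per_t = 200
X = np.concatenate([rng.normal(0.0 if t < 0.5 else 2.0, 1.0, n_per_t)
                    for t in ts])
t_of_sample = np.repeat(ts, n_per_t)

def fit_accuracy(x, y, epochs=500, lr=0.1):
    """Tiny logistic regression (weight + bias) trained by full-batch
    gradient descent; returns training accuracy (enough for a sketch)."""
    w = b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        g = p - y
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return np.mean(((w * x + b) > 0.0) == (y > 0.5))

# Confusion step: relabel by each trial critical point t_c and retrain.
trial = ts[1:-1]
acc = np.array([fit_accuracy(X, (t_of_sample >= t_c).astype(float))
                for t_c in trial])

# W shape: accuracy is high at the edges (majority labeling is trivial)
# and peaks again in the middle, at the true transition.
central = (trial >= 0.3) & (trial <= 0.7)
t_est = trial[central][np.argmax(acc[central])]
print(round(t_est, 2))   # central peak lies close to the true t* = 0.5
```

The key point the sketch captures is that no correct labels are ever supplied: the location of the transition is read off purely from where the deliberately confused training task becomes most learnable.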
Related papers
- Assessing Neural Network Representations During Training Using
Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Addressing caveats of neural persistence with deep graph persistence [54.424983583720675]
We find that the variance of network weights and spatial concentration of large weights are the main factors that impact neural persistence.
We propose an extension of the filtration underlying neural persistence to the whole neural network instead of single layers.
This yields our deep graph persistence measure, which implicitly incorporates persistent paths through the network and alleviates variance-related issues.
arXiv Detail & Related papers (2023-07-20T13:34:11Z)
- Deep learning of spatial densities in inhomogeneous correlated quantum systems [0.0]
We show that we can learn to predict densities using convolutional neural networks trained on random potentials.
We show that our approach handles the interplay of interference and interactions well, including the behaviour of models with phase transitions in inhomogeneous settings.
arXiv Detail & Related papers (2022-11-16T17:10:07Z)
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z)
- Learning topological defects formation with neural networks in a quantum phase transition [0.0]
We investigate the time evolutions, universal statistics, and correlations of topological defects in a one-dimensional transverse-field quantum Ising model.
We establish a universal power-law relationship between the first three cumulants of the kink numbers and the quench rate, indicating a binomial distribution of the kinks.
Finally, we investigate the normalized kink-kink correlations and find that the numerical values are consistent with the analytic formula.
arXiv Detail & Related papers (2022-04-14T06:00:19Z)
- Decomposing neural networks as mappings of correlation functions [57.52754806616669]
We study the mapping between probability distributions implemented by a deep feed-forward network.
We identify essential statistics in the data, as well as different information representations that can be used by neural networks.
arXiv Detail & Related papers (2022-02-10T09:30:31Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Neural network is heterogeneous: Phase matters more [10.812772606528172]
In complex-valued neural networks, we show that among different types of pruning, the weight matrix with only phase information preserved achieves the best accuracy.
The conclusion can be generalized to real-valued neural networks, where signs take the place of phases.
arXiv Detail & Related papers (2021-11-03T04:30:20Z)
- Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group [0.0]
We quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence.
For the neural networks, the behavior may have implications for various information methods in machine learning.
arXiv Detail & Related papers (2021-07-14T18:00:01Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Probing Criticality in Quantum Spin Chains with Neural Networks [0.0]
We show that even neural networks with no hidden layers can be effectively trained to distinguish between magnetically ordered and disordered phases.
Our results extend to a wide class of interacting quantum many-body systems and illustrate the wide applicability of neural networks to many-body quantum physics.
arXiv Detail & Related papers (2020-05-05T12:34:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.