Replacing neural networks by optimal analytical predictors for the
detection of phase transitions
- URL: http://arxiv.org/abs/2203.06084v2
- Date: Sat, 2 Jul 2022 20:01:46 GMT
- Title: Replacing neural networks by optimal analytical predictors for the
detection of phase transitions
- Authors: Julian Arnold and Frank Schäfer
- Abstract summary: We derive analytical expressions for the optimal output of three widely used NN-based methods for detecting phase transitions.
The inner workings of the considered methods are revealed through the explicit dependence of the optimal output on the input data.
Our theoretical results are supported by extensive numerical simulations covering, e.g., topological, quantum, and many-body localization phase transitions.
- Score: 0.10152838128195464
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Identifying phase transitions and classifying phases of matter is central to
understanding the properties and behavior of a broad range of material systems.
In recent years, machine-learning (ML) techniques have been successfully
applied to perform such tasks in a data-driven manner. However, the success of
this approach notwithstanding, we still lack a clear understanding of ML
methods for detecting phase transitions, particularly of those that utilize
neural networks (NNs). In this work, we derive analytical expressions for the
optimal output of three widely used NN-based methods for detecting phase
transitions. These optimal predictions correspond to the results obtained in
the limit of high model capacity. Therefore, in practice they can, for example,
be recovered using sufficiently large, well-trained NNs. The inner workings of
the considered methods are revealed through the explicit dependence of the
optimal output on the input data. By evaluating the analytical expressions, we
can identify phase transitions directly from experimentally accessible data
without training NNs, which makes this procedure favorable in terms of
computation time. Our theoretical results are supported by extensive numerical
simulations covering, e.g., topological, quantum, and many-body localization
phase transitions. We expect similar analyses to provide a deeper understanding
of other classification tasks in condensed matter physics.
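The paper's analytical expressions are not reproduced in this listing, but the general idea of an optimal predictor can be illustrated for a supervised two-phase classifier: in the high-capacity limit, a classifier trained with cross-entropy converges to the Bayes posterior, which for balanced classes is p_I(x) / (p_I(x) + p_II(x)). A minimal sketch estimating this directly from empirical sample frequencies, with no network training (the function name and toy data are illustrative, not taken from the paper):

```python
from collections import Counter

def optimal_phase_prediction(samples_I, samples_II):
    """Bayes-optimal probability that a sample belongs to phase I,
    estimated from empirical frequencies under balanced class priors.
    This is the output a sufficiently large, well-trained classifier
    would converge to -- no neural network required."""
    freq_I, freq_II = Counter(samples_I), Counter(samples_II)
    n_I, n_II = len(samples_I), len(samples_II)
    predictions = {}
    for x in set(freq_I) | set(freq_II):
        p_I = freq_I[x] / n_I      # empirical probability of x in phase I data
        p_II = freq_II[x] / n_II   # empirical probability of x in phase II data
        predictions[x] = p_I / (p_I + p_II)
    return predictions

# Toy data: a sample occurring mostly in one phase gets a prediction near 1 or 0.
preds = optimal_phase_prediction([0] * 90 + [1] * 10, [0] * 10 + [1] * 90)
```

Averaging such predictions over samples drawn across the tuning-parameter range and locating where the average crosses 1/2 is the kind of NN-free evaluation from experimentally accessible data that the abstract refers to.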
Related papers
- Detecting Quantum and Classical Phase Transitions via Unsupervised Machine Learning of the Fisher Information Metric [0.0]
We develop an unsupervised machine learning (ML) method called ClassiFIM.
We find that ClassiFIM reliably detects both topological (e.g., XXZ chain) and dynamical (e.g., metal-insulator transition in Hubbard model) quantum phase transitions.
arXiv Detail & Related papers (2024-08-06T19:34:04Z)
- Characterizing out-of-distribution generalization of neural networks: application to the disordered Su-Schrieffer-Heeger model [38.79241114146971]
We show how interpretability methods can increase trust in predictions of a neural network trained to classify quantum phases.
In particular, we show that we can ensure better out-of-distribution generalization in the complex classification problem.
This work is an example of how the systematic use of interpretability methods can improve the performance of NNs in scientific problems.
arXiv Detail & Related papers (2024-06-14T13:24:32Z) - Diffusion Generative Flow Samplers: Improving learning signals through
partial trajectory optimization [87.21285093582446]
Diffusion Generative Flow Samplers (DGFS) is a sampling-based framework where the learning process can be tractably broken down into short partial trajectory segments.
Our method takes inspiration from the theory developed for generative flow networks (GFlowNets).
arXiv Detail & Related papers (2023-10-04T09:39:05Z) - MARS: Meta-Learning as Score Matching in the Function Space [79.73213540203389]
We present a novel approach to extracting inductive biases from a set of related datasets.
We use functional Bayesian neural network inference, which views the prior as a process and performs inference in the function space.
Our approach can seamlessly acquire and represent complex prior knowledge by metalearning the score function of the data-generating process.
arXiv Detail & Related papers (2022-10-24T15:14:26Z) - Observing a topological phase transition with deep neural networks from
experimental images of ultracold atoms [0.0]
We report a successful identification of topological phase transitions using a deep convolutional neural network trained with low signal-to-noise-ratio (SNR) experimental data.
Our work highlights the potential of machine learning techniques to be used in various quantum systems.
arXiv Detail & Related papers (2022-09-21T01:12:21Z) - Quantum-tailored machine-learning characterization of a superconducting
qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z) - Statistical Approach to Quantum Phase Estimation [62.92678804023415]
We introduce a new statistical and variational approach to the phase estimation algorithm (PEA).
Unlike the traditional and iterative PEAs which return only an eigenphase estimate, the proposed method can determine any unknown eigenstate-eigenphase pair.
We show the simulation results of the method with the Qiskit package on the IBM Q platform and on a local computer.
arXiv Detail & Related papers (2021-04-21T00:02:00Z) - Efficient Data-Dependent Learnability [8.766022970635898]
The predictive normalized maximum likelihood (pNML) approach has recently been proposed as the min-max optimal solution to the batch learning problem.
We show that when applied to neural networks, this approximation can detect out-of-distribution examples effectively.
arXiv Detail & Related papers (2020-11-20T10:44:55Z) - Understanding Learning Dynamics for Neural Machine Translation [53.23463279153577]
We propose to understand the learning dynamics of NMT using Loss Change Allocation (LCA) (Lan et al., 2019).
As LCA requires computing the gradient over the entire dataset for each update, we instead present an approximation to make it practical in the NMT scenario.
Our simulated experiment shows that this approximate calculation is efficient and empirically delivers consistent results.
arXiv Detail & Related papers (2020-04-05T13:32:58Z)
- Unsupervised machine learning of quantum phase transitions using diffusion maps [77.34726150561087]
We show that the diffusion map method, which performs nonlinear dimensionality reduction and spectral clustering of the measurement data, has significant potential for learning complex phase transitions unsupervised.
This method works for measurements of local observables in a single basis and is thus readily applicable to many experimental quantum simulators.
arXiv Detail & Related papers (2020-03-16T18:40:13Z)
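The diffusion map construction mentioned in the entry above is standard spectral machinery: build a Gaussian kernel over the measurement snapshots, row-normalize it into a Markov transition matrix, and use the leading nontrivial eigenvectors as low-dimensional coordinates in which distinct phases separate into clusters. A minimal NumPy sketch (the bandwidth `epsilon` and function name are illustrative, not taken from the paper):

```python
import numpy as np

def diffusion_map(X, epsilon, n_components=2):
    """Nonlinear dimensionality reduction via the diffusion map.
    X: (n_samples, n_features) array of measurement snapshots."""
    # Pairwise squared Euclidean distances and Gaussian kernel.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / epsilon)
    # Row-normalize into a Markov transition matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # Eigenvalues of P are real (P is similar to a symmetric matrix);
    # the top eigenvector is the trivial constant mode, so skip it.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    return vecs.real[:, order[1:n_components + 1]]

# Two well-separated clusters: the first diffusion coordinate splits them.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.2, (20, 2)), rng.normal(10.0, 0.2, (20, 2))])
coords = diffusion_map(X, epsilon=1.0, n_components=1)
labels = coords[:, 0] > 0
```

Spectral clustering of these coordinates (e.g., by sign or with k-means) then assigns samples to phases without labels, which is what makes the method applicable to single-basis measurements from experimental quantum simulators.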
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.