Unsupervised and supervised learning of interacting topological phases
from single-particle correlation functions
- URL: http://arxiv.org/abs/2202.09281v1
- Date: Fri, 18 Feb 2022 16:02:29 GMT
- Authors: Simone Tibaldi, Giuseppe Magnifico, Davide Vodola, Elisa Ercolessi
- Abstract summary: We show that unsupervised and supervised machine learning techniques can predict the phases of a non-exactly solvable model when trained on data from a solvable model.
In particular, we employ a training set made of single-particle correlation functions of a non-interacting quantum wire.
We show that both principal component analysis and convolutional neural networks trained on data from the non-interacting model can identify the topological phases of the interacting model with a high degree of accuracy.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advances in machine learning algorithms have boosted the
application of these techniques to condensed matter physics, e.g., to classify
the phases of matter at equilibrium or to predict the real-time dynamics of a
large class of physical models. Typically, in these works a machine learning
algorithm is trained and tested on data coming from the same physical model.
Here we demonstrate that unsupervised and supervised machine learning
techniques can predict the phases of a non-exactly solvable model when trained
on data of a solvable model. In particular, we employ a training set made of
single-particle correlation functions of a non-interacting quantum wire and,
by using principal component analysis, k-means clustering, and convolutional
neural networks, we reconstruct the phase diagram of an interacting
superconductor. We show that both the principal component analysis and the
convolutional neural networks trained on data of the non-interacting model can
identify the topological phases of the interacting model with a high degree of
accuracy. Our findings indicate that non-trivial phases of matter emerging
from the presence of interactions can be identified by means of unsupervised
and supervised techniques applied to data from non-interacting systems.
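The unsupervised part of the pipeline described above (dimensionality reduction of correlation-function data followed by clustering) can be sketched as follows. This is a minimal illustration only: the correlation matrices here are fabricated with a simple exponential decay to mimic two hypothetical phases with short- and long-range correlations, not generated from the actual quantum-wire Hamiltonian used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical stand-in for the paper's training data: each sample is a
# flattened single-particle correlation matrix C_ij of an L-site wire.
rng = np.random.default_rng(0)
L = 32

def correlation_matrix(decay_length):
    """Toy correlation matrix with exponential decay plus small noise."""
    i, j = np.meshgrid(np.arange(L), np.arange(L), indexing="ij")
    C = np.exp(-np.abs(i - j) / decay_length)
    return (C + rng.normal(scale=0.01, size=C.shape)).ravel()

# 100 samples per mock "phase": short- vs long-ranged correlations.
X = np.array([correlation_matrix(d)
              for d in np.concatenate([rng.uniform(1, 2, 100),
                                       rng.uniform(8, 10, 100)])])

# Unsupervised step: project onto the leading principal components,
# then cluster the low-dimensional representation with k-means.
pca = PCA(n_components=2)
Z = pca.fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
```

In this toy setting the two mock phases separate cleanly along the first principal component, so k-means recovers the phase assignment without labels; the paper's point is that the same unsupervised machinery, trained on non-interacting correlation data, transfers to the interacting model.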
Related papers
- A Comparative Study of Machine Learning Models Predicting Energetics of Interacting Defects [5.574191640970887]
We present a comparative study of three different methods to predict the free energy change of systems with interacting defects.
Our findings indicate that the cluster expansion model can achieve precise energetics predictions even with this limited dataset.
This research provides a preliminary evaluation of applying machine learning techniques to imperfect surface systems.
arXiv Detail & Related papers (2024-03-20T02:15:48Z)
- In-Context Convergence of Transformers [63.04956160537308]
We study the learning dynamics of a one-layer transformer with softmax attention trained via gradient descent.
For data with imbalanced features, we show that the learning dynamics take a stage-wise convergence process.
arXiv Detail & Related papers (2023-10-08T17:55:33Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
A recently employed machine learning tool for studying dynamics is neural networks, which can be used for data-driven solution finding or discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Using scientific machine learning for experimental bifurcation analysis of dynamic systems [2.204918347869259]
This study focuses on training universal differential equation (UDE) models for physical nonlinear dynamical systems with limit cycles.
We consider examples where training data is generated by numerical simulations, and we also apply the proposed modelling concept to physical experiments.
We use both neural networks and Gaussian processes as universal approximators alongside the mechanistic models to give a critical assessment of the accuracy and robustness of the UDE modelling approach.
arXiv Detail & Related papers (2021-10-22T15:43:03Z)
- Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z)
- Unsupervised machine learning of topological phase transitions from experimental data [52.77024349608834]
We apply unsupervised machine learning techniques to experimental data from ultracold atoms.
We obtain the topological phase diagram of the Haldane model in a completely unbiased fashion.
Our work provides a benchmark for unsupervised detection of new exotic phases in complex many-body systems.
arXiv Detail & Related papers (2021-01-14T16:38:21Z)
- The Role of Isomorphism Classes in Multi-Relational Datasets [6.419762264544509]
We show that isomorphism leakage overestimates performance in multi-relational inference.
We propose isomorphism-aware synthetic benchmarks for model evaluation.
We also demonstrate that isomorphism classes can be utilised through a simple prioritisation scheme.
arXiv Detail & Related papers (2020-09-30T12:15:24Z)
- Watch and learn -- a generalized approach for transferrable learning in deep neural networks via physical principles [0.0]
We demonstrate an unsupervised learning approach that achieves fully transferrable learning for problems in statistical physics across different physical regimes.
By coupling a sequence model based on a recurrent neural network to an extensive deep neural network, we are able to learn the equilibrium probability distributions and inter-particle interaction models of classical statistical mechanical systems.
arXiv Detail & Related papers (2020-03-03T18:37:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.