Deep Machine Learning Reconstructing Lattice Topology with Strong Thermal Fluctuations
- URL: http://arxiv.org/abs/2208.04119v1
- Date: Mon, 8 Aug 2022 13:28:21 GMT
- Title: Deep Machine Learning Reconstructing Lattice Topology with Strong Thermal Fluctuations
- Authors: Xiao-Han Wang, Pei Shi, Bin Xi, Jie Hu, and Shi-Ju Ran
- Abstract summary: A deep convolutional neural network (CNN) reconstructs the lattice topology in the presence of strong thermal fluctuations and unbalanced data.
Our scheme differs from previous ones, which may require knowledge of the node dynamics.
We demonstrate that the CNN generalizes to instances evolved from unlearnt initial spin configurations.
- Score: 10.55785051508181
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Applying artificial intelligence to scientific problems (namely, AI for
science) is currently a subject of intense debate. However, scientific problems differ
greatly from conventional ones involving images, text, etc., and new
challenges emerge from the unbalanced scientific data and the complicated effects
of the physical setups. In this work, we demonstrate the validity of the deep
convolutional neural network (CNN) in reconstructing the lattice topology
(i.e., the spin connectivities) in the presence of strong thermal fluctuations and
unbalanced data. Taking the kinetic Ising model with Glauber dynamics as an
example, the CNN maps the time-dependent local magnetic momenta (a single-node
feature) evolved from a specific initial configuration (dubbed an evolution
instance) to the probabilities that the possible couplings are present. Our
scheme differs from previous ones, which may require knowledge of
the node dynamics, responses to perturbations, or the evaluation of
statistical quantities such as correlations or transfer entropy over many
evolution instances. Fine-tuning avoids the "barren plateau" caused by the
strong thermal fluctuations at high temperatures. Accurate reconstructions can
be made even where the thermal fluctuations dominate over the correlations and
statistical methods consequently fail in general. Meanwhile, we demonstrate the
generalization of the CNN to instances evolved from unlearnt
initial spin configurations and to unlearnt lattices. We raise an
open question on learning with unbalanced data in a nearly
"double-exponentially" large sample space.
Related papers
- Observation of anomalous information scrambling in a Rydberg atom array [5.591432092887684]
Quantum information scrambling describes the propagation and effective loss of local information.
Here, we report the experimental observation of anomalous information scrambling in an atomic tweezer array.
arXiv Detail & Related papers (2024-10-21T16:40:25Z)
- Causal Representation Learning in Temporal Data via Single-Parent Decoding [66.34294989334728]
Scientific research often seeks to understand the causal structure underlying high-level variables in a system.
Scientists typically collect low-level measurements, such as geographically distributed temperature readings.
We propose a differentiable method, Causal Discovery with Single-parent Decoding, that simultaneously learns the underlying latents and a causal graph over them.
arXiv Detail & Related papers (2024-10-09T15:57:50Z)
- Advection Augmented Convolutional Neural Networks [6.805997961535213]
We introduce a physically inspired architecture for the solution of such problems.
We show that the proposed operator allows for the non-local transformation of information.
We then complement it with Reaction and Diffusion neural components to form a network that mimics a Reaction-Advection-Diffusion system.
arXiv Detail & Related papers (2024-06-27T15:22:21Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
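One common way to make such geometry-based measures concrete is to take the Shannon entropy of the eigenvalue spectrum of a data-driven diffusion operator. The sketch below illustrates this general recipe, not the paper's exact definition; the kernel bandwidth and diffusion time are hypothetical parameters:

```python
import numpy as np

def diffusion_spectral_entropy(X, sigma=1.0, t=1):
    """Shannon entropy of the eigenvalue spectrum of a diffusion operator
    built from data X of shape (n_samples, n_features)."""
    # pairwise squared distances -> Gaussian affinity matrix
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    P = K / K.sum(axis=1, keepdims=True)      # row-stochastic diffusion matrix
    lam = np.abs(np.linalg.eigvals(P)) ** t   # spectrum after t diffusion steps
    p = lam / lam.sum()                       # normalize to a distribution
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())      # entropy in nats
```

A flatter spectrum (higher entropy) roughly corresponds to higher intrinsic dimensionality of the data manifold.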
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Liouvillian Dynamics of the Open Schwinger Model: String Breaking and Kinetic Dissipation in a Thermal Medium [0.0]
We consider the string-breaking dynamics within the Schwinger model and investigate its modification inside a thermal medium.
We analyze the Liouvillian gaps of a Lindblad equation and the time dependence of the system's von Neumann entropy.
We discuss how the Liouvillian dynamics of the open Schwinger model can be simulated on quantum computers.
arXiv Detail & Related papers (2023-08-07T19:15:55Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
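A minimal sketch of what an SGLD epoch with without-replacement minibatching could look like (an illustration of the general idea, not the paper's algorithm; `grad_loss` and all parameters are hypothetical):

```python
import numpy as np

def sgld_epoch(theta, X, y, grad_loss, batch_size, eta, beta, rng):
    """One epoch of stochastic gradient Langevin dynamics with
    WITHOUT-replacement minibatching: the data are shuffled once and
    partitioned, so every sample is visited exactly once per epoch.

    grad_loss(theta, Xb, yb) must return the minibatch loss gradient.
    """
    idx = rng.permutation(len(X))            # shuffle; no replacement
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        g = grad_loss(theta, X[b], y[b])
        noise = rng.normal(size=theta.shape)
        # Langevin update: gradient step + thermal noise at inverse temperature beta
        theta = theta - eta * g + np.sqrt(2.0 * eta / beta) * noise
    return theta
```

The contrast with conventional SGD is the injected Gaussian noise, whose scale is tied to the step size and an inverse temperature; sampling without replacement changes the gradient-noise statistics relative to the with-replacement case.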
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Emergent pair localization in a many-body quantum spin system [0.0]
Generically, non-integrable quantum systems are expected to thermalize as they comply with the Eigenstate Thermalization Hypothesis.
In the presence of strong disorder, the dynamics can possibly slow down to a degree that systems fail to thermalize on experimentally accessible timescales.
We study an ensemble of Heisenberg spins with a tunable distribution of random coupling strengths realized by a Rydberg quantum simulator.
arXiv Detail & Related papers (2022-07-28T16:31:18Z)
- Multi-scale Feature Learning Dynamics: Insights for Double Descent [71.91871020059857]
We study the phenomenon of "double descent" of the generalization error.
We find that double descent can be attributed to distinct features being learned at different scales.
arXiv Detail & Related papers (2021-12-06T18:17:08Z)
- Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting [0.0]
Top-down approaches that learn neural network (NN) potentials directly from experimental data have received less attention.
We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables.
We show the effectiveness of DiffTRe in learning NN potentials for an atomistic model of diamond and a coarse-grained model of water, based on diverse experimental observables.
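The reweighting idea behind such methods can be sketched as follows: expectation values of a time-independent observable under a new potential are estimated from a reference trajectory, so gradients need not flow through the simulation itself (illustrative code for the generic thermodynamic-reweighting estimator, not the DiffTRe implementation):

```python
import numpy as np

def reweighted_average(obs, U_new, U_ref, beta):
    """Estimate <O> under potential U_new from samples drawn with a
    reference potential U_ref, via thermodynamic reweighting:

        <O>_new ~= sum_i w_i O(x_i),  w_i proportional to exp(-beta * (U_new_i - U_ref_i))

    obs, U_new, U_ref : per-sample arrays over the reference trajectory.
    """
    logw = -beta * (U_new - U_ref)
    logw -= logw.max()          # subtract max for numerical stability
    w = np.exp(logw)
    w /= w.sum()                # normalized importance weights
    return float(np.sum(w * obs))
```

Because the estimator depends on the new potential only through the per-sample energies, it is differentiable with respect to the potential's parameters without backpropagating through the MD integrator.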
arXiv Detail & Related papers (2021-06-02T13:10:43Z)
- Probing eigenstate thermalization in quantum simulators via fluctuation-dissipation relations [77.34726150561087]
The eigenstate thermalization hypothesis (ETH) offers a universal mechanism for the approach to equilibrium of closed quantum many-body systems.
Here, we propose a theory-independent route to probe the full ETH in quantum simulators by observing the emergence of fluctuation-dissipation relations.
Our work presents a theory-independent way to characterize thermalization in quantum simulators and paves the way to quantum simulate condensed matter pump-probe experiments.
arXiv Detail & Related papers (2020-07-20T18:00:02Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.