Hierarchical autoregressive neural networks in three-dimensional statistical system
- URL: http://arxiv.org/abs/2503.08610v1
- Date: Tue, 11 Mar 2025 16:51:01 GMT
- Title: Hierarchical autoregressive neural networks in three-dimensional statistical system
- Authors: Piotr Białas, Vaibhav Chahar, Piotr Korcyl, Tomasz Stebel, Mateusz Winiarski, Dawid Zapolski
- Abstract summary: Autoregressive Neural Networks (ANN) have been recently proposed as a mechanism to improve the efficiency of Monte Carlo algorithms for several spin systems. In this paper, we describe a generalization of the hierarchical algorithm to three spatial dimensions and study its performance on the example of the Ising model.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autoregressive Neural Networks (ANN) have been recently proposed as a mechanism to improve the efficiency of Monte Carlo algorithms for several spin systems. The idea relies on the fact that the total probability of a configuration can be factorized into conditional probabilities of each spin, which in turn can be approximated by a neural network. Once trained, the ANNs can be used to sample configurations from the approximated probability distribution and to evaluate explicitly this probability for a given configuration. It has also been observed that such conditional probabilities give access to information-theoretic observables such as mutual information or entanglement entropy. So far, these methods have been applied to two-dimensional statistical systems or one-dimensional quantum systems. In this paper, we describe a generalization of the hierarchical algorithm to three spatial dimensions and study its performance on the example of the Ising model. We discuss the efficiency of the training and also describe the scaling with the system's dimensionality by comparing results for two- and three-dimensional Ising models with the same number of spins. Finally, we provide estimates of thermodynamical observables for the three-dimensional Ising model, such as the entropy and free energy in a range of temperatures across the phase transition.
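The factorization described in the abstract can be made concrete with a short sketch. The snippet below is a minimal, illustrative single-layer autoregressive model for a flattened Ising configuration, not the authors' hierarchical implementation; the names AutoregressiveIsing and ising_energy_3d, and the masked-linear parametrization, are assumptions introduced here for illustration. It samples configurations spin by spin from q(s) = prod_i q(s_i | s_1, ..., s_{i-1}) and evaluates log q(s) exactly, which is what permits both variational free-energy estimates and reweighting of observables.

```python
# Illustrative sketch only: one masked linear layer plays the role of the
# conditional model q(s_i = +1 | s_1, ..., s_{i-1}); the paper's hierarchical
# algorithm instead uses a set of smaller networks for nested sub-lattices.
import torch
import torch.nn as nn

class AutoregressiveIsing(nn.Module):          # hypothetical name
    def __init__(self, n_spins: int):
        super().__init__()
        self.n = n_spins
        self.weight = nn.Parameter(1e-3 * torch.randn(n_spins, n_spins))
        self.bias = nn.Parameter(torch.zeros(n_spins))
        # strictly lower-triangular mask enforces the autoregressive ordering
        self.register_buffer("mask", torch.tril(torch.ones(n_spins, n_spins), diagonal=-1))

    def conditionals(self, s):
        # s: (batch, n_spins) with entries +/-1; returns q(s_i = +1 | s_<i)
        return torch.sigmoid(s @ (self.mask * self.weight).t() + self.bias)

    def log_prob(self, s):
        # exact log q(s) = sum_i log q(s_i | s_<i)
        p = self.conditionals(s)
        return torch.log(torch.where(s > 0, p, 1.0 - p)).sum(dim=1)

    @torch.no_grad()
    def sample(self, batch: int):
        # draw configurations spin by spin in the autoregressive order
        s = torch.zeros(batch, self.n)
        for i in range(self.n):
            p_i = self.conditionals(s)[:, i]
            s[:, i] = 2.0 * (torch.rand(batch) < p_i).float() - 1.0
        return s

def ising_energy_3d(s, L: int, J: float = 1.0):
    # nearest-neighbour 3D Ising energy with periodic boundaries (hypothetical helper)
    c = s.view(-1, L, L, L)
    bonds = sum((c * torch.roll(c, 1, dims=d)).sum(dim=(1, 2, 3)) for d in (1, 2, 3))
    return -J * bonds

# usage: sample a batch, then evaluate its exact model probability and energy
model = AutoregressiveIsing(4 ** 3)
s = model.sample(16)
logq, E = model.log_prob(s), ising_energy_3d(s, L=4)
```

With the explicit log q(s) and the energy E(s), the variational free energy F_q = (1/beta) E_{s~q}[log q(s) + beta E(s)] upper-bounds the true free energy and is the quantity typically minimized during training in this class of methods; importance weights exp(-beta E(s))/q(s) can then correct for the residual difference between q and the Boltzmann distribution.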
Related papers
- Dynamics of disordered quantum systems with two- and three-dimensional tensor networks [0.0]
We show how two- and three-dimensional tensor networks can accurately and efficiently simulate the quantum annealing dynamics of Ising spin glasses on a range of lattices. Our results demonstrate that tensor networks are a viable approach for simulating large scale quantum dynamics in two and three dimensions on classical computers.
arXiv Detail & Related papers (2025-03-07T18:58:03Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Bayesian Circular Regression with von Mises Quasi-Processes [57.88921637944379]
In this work we explore a family of expressive and interpretable distributions over circle-valued random functions.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Gibbs sampling.
We present experiments applying this model to the prediction of wind directions and the percentage of the running gait cycle as a function of joint angles.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Rényi entanglement entropy of spin chain with Generative Neural Networks [0.0]
We describe a method to estimate the Rényi entanglement entropy of a spin system.
It is based on the replica trick and generative neural networks with explicit probability estimation.
We demonstrate our method on a one-dimensional quantum Ising spin chain.
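As a reminder of the underlying identity (the textbook form of the replica trick, not that paper's specific estimator), the n-th Rényi entropy of a subsystem A reduces to a ratio of partition functions, which a generative model with an explicit, normalized probability q(s) can estimate by importance sampling:

```latex
% Textbook replica identity (illustration only, not the paper's exact estimator)
S_n(A) = \frac{1}{1-n}\,\ln \operatorname{Tr}\rho_A^{\,n},
\qquad
\operatorname{Tr}\rho_A^{\,n} = \frac{Z_n(A)}{Z^{\,n}},
% where Z_n(A) is the partition function on an n-sheeted geometry glued along A.
% An explicit, normalised q(s) turns each partition function into an expectation value:
Z = \sum_{s} e^{-\beta E(s)}
  = \mathbb{E}_{s \sim q}\!\left[\frac{e^{-\beta E(s)}}{q(s)}\right].
```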
arXiv Detail & Related papers (2024-06-10T11:44:54Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z) - Sampling-Free Probabilistic Deep State-Space Models [28.221200872943825]
A Probabilistic Deep SSM generalizes to dynamical systems of unknown parametric form.
We propose the first deterministic inference algorithm for models of this type.
arXiv Detail & Related papers (2023-09-15T09:06:23Z) - Message-Passing Neural Quantum States for the Homogeneous Electron Gas [41.94295877935867]
We introduce a message-passing-neural-network-based wave function Ansatz to simulate extended, strongly interacting fermions in continuous space.
We demonstrate its accuracy by simulating the ground state of the homogeneous electron gas in three spatial dimensions.
arXiv Detail & Related papers (2023-05-12T04:12:04Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
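The workflow in that entry (train a differentiable surrogate once, then recover parameters by gradient descent through it) can be sketched generically; everything below, including the surrogate architecture, the placeholder target data, and the two-parameter fit, is a hypothetical illustration rather than that paper's code.

```python
# Generic sketch: fit unknown model parameters to data by differentiating
# through a pre-trained neural surrogate (placeholder data and architecture).
import torch

surrogate = torch.nn.Sequential(              # stands in for a surrogate already
    torch.nn.Linear(2, 64), torch.nn.Tanh(),  # trained on simulated spectra
    torch.nn.Linear(64, 100),
)
for p in surrogate.parameters():
    p.requires_grad_(False)                   # only the physics parameters are optimized

target = torch.randn(100)                     # placeholder for measured data
params = torch.zeros(2, requires_grad=True)   # unknown Hamiltonian parameters
opt = torch.optim.Adam([params], lr=1e-2)

for _ in range(500):
    opt.zero_grad()
    loss = torch.mean((surrogate(params) - target) ** 2)
    loss.backward()                           # gradients flow through the surrogate
    opt.step()
```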
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Deep learning of spatial densities in inhomogeneous correlated quantum systems [0.0]
We show that we can learn to predict densities using convolutional neural networks trained on random potentials.
We show that our approach handles well the interplay of interference and interactions, as well as the behaviour of models with phase transitions in inhomogeneous situations.
arXiv Detail & Related papers (2022-11-16T17:10:07Z) - Identification of quantum entanglement with Siamese convolutional neural networks and semi-supervised learning [0.0]
Quantum entanglement is a fundamental property commonly used in various quantum information protocols and algorithms.
In this study, we use deep convolutional NNs, a type of supervised machine learning, to identify quantum entanglement for any bipartition in a 3-qubit system.
arXiv Detail & Related papers (2022-10-13T23:17:55Z) - Neural-Network Quantum States for Periodic Systems in Continuous Space [66.03977113919439]
We introduce a family of neural quantum states for the simulation of strongly interacting systems in the presence of periodicity.
For one-dimensional systems we find very precise estimates of the ground-state energies and the radial distribution functions of the particles.
In two dimensions we obtain good estimates of the ground-state energies, comparable to results obtained from more conventional methods.
arXiv Detail & Related papers (2021-12-22T15:27:30Z) - Assessments of model-form uncertainty using Gaussian stochastic weight averaging for fluid-flow regression [0.0]
We use Gaussian stochastic weight averaging (SWAG) to assess the model-form uncertainty associated with neural-network-based function approximation relevant to fluid flows.
SWAG approximates a posterior Gaussian distribution for each weight, given the training data and a constant learning rate.
We demonstrate the applicability of the method for two types of neural networks.
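SWAG itself is compact enough to sketch. The class below implements the diagonal variant: first and second moments of SGD iterates are collected at a constant learning rate and then sampled as a Gaussian over weights. The name DiagonalSWAG and the moment-tracking details are illustrative assumptions, not code from that paper.

```python
# Minimal diagonal-SWAG sketch (illustrative; the paper may use the low-rank variant).
import torch

class DiagonalSWAG:
    def __init__(self, model: torch.nn.Module):
        self.model = model
        self.n = 0
        self.mean = [torch.zeros_like(p) for p in model.parameters()]
        self.sq_mean = [torch.zeros_like(p) for p in model.parameters()]

    @torch.no_grad()
    def collect(self):
        # call periodically while training with a constant learning rate
        self.n += 1
        for m, sq, p in zip(self.mean, self.sq_mean, self.model.parameters()):
            m.mul_((self.n - 1) / self.n).add_(p / self.n)
            sq.mul_((self.n - 1) / self.n).add_(p.pow(2) / self.n)

    @torch.no_grad()
    def sample(self, scale: float = 1.0):
        # load one weight sample from N(mean, diag(var)) into the model
        for m, sq, p in zip(self.mean, self.sq_mean, self.model.parameters()):
            var = (sq - m.pow(2)).clamp(min=1e-30)
            p.copy_(m + scale * var.sqrt() * torch.randn_like(m))
```

At test time one draws several weight samples, runs the forward pass for each, and uses the spread of predictions as the model-form uncertainty estimate.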
arXiv Detail & Related papers (2021-09-16T23:13:26Z) - State preparation and measurement in a quantum simulation of the O(3) sigma model [65.01359242860215]
We show that fixed points of the non-linear O(3) sigma model can be reproduced near a quantum phase transition of a spin model with just two qubits per lattice site.
We apply Trotter methods to obtain results for the complexity of adiabatic ground state preparation in both the weak-coupling and quantum-critical regimes.
We present and analyze a quantum algorithm based on non-unitary randomized simulation methods.
arXiv Detail & Related papers (2020-06-28T23:44:12Z)