Investigating Network Parameters in Neural-Network Quantum States
- URL: http://arxiv.org/abs/2202.01704v3
- Date: Mon, 11 Apr 2022 04:25:01 GMT
- Title: Investigating Network Parameters in Neural-Network Quantum States
- Authors: Yusuke Nomura
- Abstract summary: Quantum-state representation using artificial neural networks has started to be recognized as a powerful tool.
We apply one of the simplest neural networks, the restricted Boltzmann machine (RBM), to the ground-state representation of the one-dimensional (1D) transverse-field Ising (TFI) model.
We find that the quantum phase transition from the ordered to the disordered phase in the 1D TFI model, driven by increasing the transverse field, is clearly reflected in the behavior of the neural-network parameters.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, quantum-state representation using artificial neural networks has
started to be recognized as a powerful tool. However, due to the black-box
nature of machine learning, it is difficult to analyze what the machine learns or
why it is powerful. Here, by applying one of the simplest neural networks, the
restricted Boltzmann machine (RBM), to the ground-state representation of the
one-dimensional (1D) transverse-field Ising (TFI) model, we make an attempt to
directly analyze the optimized network parameters. In the RBM optimization, a
zero-temperature quantum state is mapped onto a finite-temperature classical
state of the extended Ising spins that constitute the RBM. We find that the
quantum phase transition from the ordered to the disordered phase in the
1D TFI model, driven by increasing the transverse field, is clearly reflected in the
behaviors of the optimized RBM parameters and hence in the finite-temperature
phase diagram of the classical RBM Ising system. The present finding of a
correspondence between the neural-network parameters and quantum phases
suggests that a careful investigation of the neural-network parameters may
provide a new route to extracting nontrivial physical insights from the
neural-network wave functions.
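To make the ansatz discussed above concrete, the sketch below evaluates an RBM wavefunction amplitude for a visible spin configuration after the hidden units are traced out. This is a minimal illustration of the standard RBM form, not the paper's implementation; the parameter values and system sizes are arbitrary assumptions for demonstration.

```python
import numpy as np

def rbm_amplitude(sigma, a, b, W):
    """Unnormalized RBM amplitude psi(sigma) for sigma in {-1,+1}^N,
    with hidden units summed out analytically:
        psi(sigma) = exp(sum_i a_i sigma_i)
                     * prod_j 2*cosh(b_j + sum_i W_ij sigma_i)
    """
    theta = b + W.T @ sigma  # effective field acting on each hidden unit
    return np.exp(a @ sigma) * np.prod(2.0 * np.cosh(theta))

# Toy example: N = 4 visible spins, M = 2 hidden units, small random real
# parameters (illustrative only).
rng = np.random.default_rng(0)
N, M = 4, 2
a = 0.1 * rng.standard_normal(N)
b = 0.1 * rng.standard_normal(M)
W = 0.1 * rng.standard_normal((N, M))
sigma = np.array([1, -1, 1, 1])
print(rbm_amplitude(sigma, a, b, W))  # positive real for real parameters
```

For real-valued parameters, as used in the paper's mapping to a classical Ising system, every amplitude is positive, so the RBM weights can be read as couplings of extended classical Ising spins at finite temperature.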
Related papers
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Residual resampling-based physics-informed neural network for neutron diffusion equations [7.105073499157097]
The neutron diffusion equation plays a pivotal role in the analysis of nuclear reactors.
Traditional PINN approaches often utilize fully connected network (FCN) architecture.
R2-PINN effectively overcomes the limitations inherent in current methods, providing more accurate and robust solutions for neutron diffusion equations.
arXiv Detail & Related papers (2024-06-23T13:49:31Z)
- Paths towards time evolution with larger neural-network quantum states [17.826631514127012]
We consider a quantum quench from the paramagnetic to the anti-ferromagnetic phase in the tilted Ising model.
We show that for both types of networks, the projected time-dependent variational Monte Carlo (p-tVMC) method performs better than the non-projected approach.
arXiv Detail & Related papers (2024-06-05T15:32:38Z)
- Neural network approach to quasiparticle dispersions in doped antiferromagnets [0.0]
We study the ability of neural quantum states to represent the bosonic and fermionic $t-J$ model on different 1D and 2D lattices.
We present a method to calculate dispersion relations from the neural network state representation.
arXiv Detail & Related papers (2023-10-12T17:59:33Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Neural network enhanced measurement efficiency for molecular groundstates [63.36515347329037]
We adapt common neural network models to learn complex groundstate wavefunctions for several molecular qubit Hamiltonians.
We find that using a neural network model provides a robust improvement over using single-copy measurement outcomes alone to reconstruct observables.
arXiv Detail & Related papers (2022-06-30T17:45:05Z)
- Variational learning of quantum ground states on spiking neuromorphic hardware [0.0]
High-dimensional sampling spaces and transient autocorrelations confront neural networks with a challenging computational bottleneck.
Compared to conventional neural networks, physical-model devices offer a fast, efficient and inherently parallel substrate.
We demonstrate the ability of a neuromorphic chip to represent the ground states of quantum spin models by variational energy minimization.
arXiv Detail & Related papers (2021-09-30T14:39:45Z)
- Conditionally Parameterized, Discretization-Aware Neural Networks for Mesh-Based Modeling of Physical Systems [0.0]
We generalize the idea of conditional parametrization -- using trainable functions of input parameters.
We show that conditionally parameterized networks provide superior performance compared to their traditional counterparts.
A network architecture named CP-GNet is also proposed as the first deep learning model capable of standalone prediction of reacting flows on meshes.
arXiv Detail & Related papers (2021-09-15T20:21:13Z)
- Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Phase Detection with Neural Networks: Interpreting the Black Box [58.720142291102135]
Neural networks (NNs) usually hinder any insight into the reasoning behind their predictions.
We demonstrate how influence functions can unravel the black box of NNs trained to predict the phases of the one-dimensional extended spinless Fermi-Hubbard model at half-filling.
arXiv Detail & Related papers (2020-04-09T17:45:45Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.