Correlation-Enhanced Neural Networks as Interpretable Variational
Quantum States
- URL: http://arxiv.org/abs/2103.05017v1
- Date: Mon, 8 Mar 2021 19:01:12 GMT
- Title: Correlation-Enhanced Neural Networks as Interpretable Variational
Quantum States
- Authors: Agnes Valenti, Eliska Greplova, Netanel H. Lindner and Sebastian D.
Huber
- Abstract summary: Variational methods have proven to be excellent tools to approximate ground states of complex many-body Hamiltonians.
We introduce a neural-network-based variational ansatz that retains the flexibility of these generic methods while allowing for tunability with respect to the relevant correlations governing the physics of the system.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variational methods have proven to be excellent tools to approximate ground
states of complex many-body Hamiltonians. Generic tools like neural networks
are extremely powerful, but their parameters are not necessarily physically
motivated. Thus, an efficient parametrization of the wave function can become
challenging. In this letter we introduce a neural-network-based variational
ansatz that retains the flexibility of these generic methods while allowing for
tunability with respect to the relevant correlations governing the physics of
the system. We illustrate the success of this approach on topological,
long-range correlated, and frustrated models. Additionally, we introduce
compatible variational optimization methods for the exploration of low-lying
excited states without symmetries that preserve the interpretability of the
ansatz.
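The abstract leaves the functional form of the ansatz to the paper itself, but a common way to realize "generic network times tunable physical correlations" is a product of an explicit correlator factor and a neural amplitude. The following is a minimal sketch under that assumption, in plain Python/NumPy; the RBM choice, the single nearest-neighbour correlator `nn_zz`, and the weights `lam` are illustrative, not the construction of Valenti et al.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 8 spins, 16 hidden units.
N, M = 8, 16

# Generic part: a restricted Boltzmann machine (RBM) amplitude.
a = 0.01 * rng.standard_normal(N)        # visible biases
b = 0.01 * rng.standard_normal(M)        # hidden biases
W = 0.01 * rng.standard_normal((M, N))   # couplings

def rbm_amplitude(s):
    """Unnormalized RBM amplitude for a spin configuration s in {-1, +1}^N."""
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

# Interpretable part: physically motivated correlators C_k(s) with tunable
# weights lam[k].  A single nearest-neighbour ZZ correlator on a periodic
# chain is used here purely for illustration.
def nn_zz(s):
    return float(np.sum(s * np.roll(s, 1)))

correlators = [nn_zz]
lam = np.zeros(len(correlators))   # variational correlation weights

def psi(s):
    """Correlation-enhanced amplitude: explicit correlator factor times RBM."""
    corr = np.exp(sum(l * c(s) for l, c in zip(lam, correlators)))
    return corr * rbm_amplitude(s)

s = rng.choice([-1, 1], size=N)
print(psi(s))
```

The appeal of such a split is interpretability: the optimized weights in `lam` directly indicate how strongly each physically motivated correlation is represented in the state, while the generic network absorbs whatever the hand-picked correlators miss.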
Related papers
- Efficiency of the hidden fermion determinant states Ansatz in the light of different complexity measures [0.0]
Ans"atze utilizes the expressivity of neural networks to tackle fundamentally challenging problems.
We study five different fermionic models displaying volume law scaling of the entanglement entropy.
We provide evidence that whenever one of the measures indicates proximity to a parameter region in which a conventional approach would work reliable, the neural network approach also works reliable and efficient.
arXiv Detail & Related papers (2024-11-07T08:36:37Z)
- Neural Quantum States in Variational Monte Carlo Method: A Brief Summary [0.0]
The variational Monte Carlo method based on neural quantum states for spin systems is reviewed (a minimal sketch of this sampling loop appears after this list).
Neural networks can represent relatively complex wave functions with small computational resources.
In quantum-state tomography, the neural-quantum-state representation has already achieved significant results.
arXiv Detail & Related papers (2024-06-03T05:55:55Z)
- Deep Neural Networks as Variational Solutions for Correlated Open Quantum Systems [0.0]
We show that parametrizing the density matrix directly with more powerful models can yield better variational ansatz functions.
We present results for the dissipative one-dimensional transverse-field Ising model and a two-dimensional dissipative Heisenberg model.
arXiv Detail & Related papers (2024-01-25T13:41:34Z)
- DiffHybrid-UQ: Uncertainty Quantification for Differentiable Hybrid Neural Modeling [4.76185521514135]
We introduce a novel method, DiffHybrid-UQ, for effective and efficient uncertainty propagation and estimation in hybrid neural differentiable models.
Specifically, our approach effectively discerns and quantifies both aleatoric uncertainties, arising from data noise, and epistemic uncertainties, resulting from model-form discrepancies and data sparsity.
arXiv Detail & Related papers (2023-12-30T07:40:47Z)
- Learning minimal representations of stochastic processes with variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing such processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws obey symmetries, which provide a vital inductive bias for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose the Graph Mechanics Network (GMN), which is efficient, equivariant, and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Post-mortem on a deep learning contest: a Simpson's paradox and the complementary roles of scale metrics versus shape metrics [61.49826776409194]
We analyze a corpus of models made publicly available for a contest to predict the generalization accuracy of neural network (NN) models.
We identify what amounts to a Simpson's paradox: "scale" metrics perform well overall but perform poorly on sub-partitions of the data.
We present two novel shape metrics, one data-independent, and the other data-dependent, which can predict trends in the test accuracy of a series of NNs.
arXiv Detail & Related papers (2021-06-01T19:19:49Z)
- Recurrent Neural Network Wave Functions [0.36748639131154304]
A core technology that has emerged from the artificial intelligence revolution is the recurrent neural network (RNN).
We demonstrate the effectiveness of RNN wave functions by calculating ground state energies, correlation functions, and entanglement entropies for several quantum spin models of interest to condensed matter physicists.
arXiv Detail & Related papers (2020-02-07T19:00:03Z)
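The entries above on neural quantum states in variational Monte Carlo and on RNN wave functions revolve around the same basic loop: sample configurations from |psi(s)|^2 with Markov-chain Monte Carlo and average the local energy. The sketch below is a minimal, self-contained Python/NumPy version of that loop for a periodic transverse-field Ising chain with a toy RBM amplitude; the couplings J and h, the sizes, and all names are illustrative assumptions, not code from any of the listed papers.

```python
import numpy as np

rng = np.random.default_rng(1)

N, M = 8, 16          # spins, hidden units (illustrative)
J, h = 1.0, 0.5       # TFIM couplings (illustrative)

# Toy positive neural amplitude (an RBM); any log psi(s) works in this loop.
b = 0.01 * rng.standard_normal(M)
W = 0.01 * rng.standard_normal((M, N))

def log_psi(s):
    return float(np.sum(np.log(2.0 * np.cosh(b + W @ s))))

def local_energy(s):
    """E_loc(s) = <s|H|psi> / <s|psi> for a periodic transverse-field Ising chain."""
    diag = -J * np.sum(s * np.roll(s, 1))       # ZZ part, diagonal in this basis
    offdiag = 0.0
    for i in range(N):                          # X part flips one spin at a time
        t = s.copy()
        t[i] *= -1
        offdiag += -h * np.exp(log_psi(t) - log_psi(s))
    return diag + offdiag

# Metropolis sampling of |psi(s)|^2 with single-spin-flip proposals.
s = rng.choice([-1, 1], size=N)
energies = []
for step in range(5000):
    i = rng.integers(N)
    t = s.copy()
    t[i] *= -1
    if rng.random() < np.exp(2.0 * (log_psi(t) - log_psi(s))):
        s = t
    if step >= 1000:                            # discard burn-in samples
        energies.append(local_energy(s))

print("variational energy estimate:", np.mean(energies))
```

In a full variational calculation one would also accumulate gradients of log psi over the same samples and feed them to gradient descent or stochastic reconfiguration; only the energy estimate is shown here.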
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.