Matrix Product State Fixed Points of Non-Hermitian Transfer Matrices
- URL: http://arxiv.org/abs/2311.18733v1
- Date: Thu, 30 Nov 2023 17:28:30 GMT
- Title: Matrix Product State Fixed Points of Non-Hermitian Transfer Matrices
- Authors: Wei Tang, Frank Verstraete, Jutho Haegeman
- Abstract summary: We investigate the impact of gauge degrees of freedom in the virtual indices of the tensor network on the contraction process.
We show that the gauge transformation can affect the entanglement structures of the eigenstates of the transfer matrix.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The contraction of tensor networks is a central task in the application of
tensor network methods to the study of quantum and classical many-body systems.
In this paper, we investigate the impact of gauge degrees of freedom in the
virtual indices of the tensor network on the contraction process, specifically
focusing on boundary matrix product state methods for contracting
two-dimensional tensor networks. We show that the gauge transformation can
affect the entanglement structures of the eigenstates of the transfer matrix
and change how the physical information is encoded in the eigenstates, which
can influence the accuracy of the numerical simulation. We demonstrate this
effect by looking at two different examples. First, we focus on the local gauge
transformation, and analyze its effect by viewing it as an imaginary-time
evolution governed by a diagonal Hamiltonian. As a specific example, we perform
a numerical analysis in the classical Ising model on the square lattice.
Second, we go beyond the scope of local gauge transformations and study the
antiferromagnetic Ising model on the triangular lattice. The partition function
of this model has two tensor network representations connected by a non-local
gauge transformation, resulting in distinct numerical performances in the
boundary MPS calculation.
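The central observation above can be illustrated with a toy NumPy sketch (not code from the paper; the dimensions and tensors are illustrative): a gauge transformation acts on the virtual indices of an MPS tensor as A[s] → G A[s] G⁻¹, and any closed contraction, such as the norm of a periodic MPS, is left unchanged, even though the transfer matrix itself, and hence its eigenvectors, are transformed.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 2, 3  # physical and virtual (bond) dimensions -- illustrative values

# A uniform MPS tensor: A[s] is a D x D matrix for each physical index s.
A = rng.standard_normal((d, D, D))

# An invertible gauge transformation acting on the virtual indices.
G = rng.standard_normal((D, D)) + D * np.eye(D)  # shifted to be well conditioned
A_gauged = np.einsum('ij,sjk,kl->sil', G, A, np.linalg.inv(G))

def periodic_norm(tensors, L):
    """Trace of the L-th power of the MPS transfer matrix
    E = sum_s A[s] (x) conj(A[s]), i.e. the norm of a periodic
    MPS on L sites -- a closed, gauge-invariant quantity."""
    D = tensors.shape[1]
    E = np.einsum('sij,skl->ikjl', tensors, tensors.conj()).reshape(D * D, D * D)
    return np.trace(np.linalg.matrix_power(E, L))

# The closed contraction is unchanged by the gauge transformation.
print(np.isclose(periodic_norm(A, 4), periodic_norm(A_gauged, 4)))
```

While the closed contraction is invariant, the transfer matrix is conjugated by G ⊗ Ḡ, so its leading eigenvectors, the boundary MPS fixed points, are not: their entanglement structure can change, which is precisely why the choice of gauge matters for the accuracy of a truncated boundary MPS simulation.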
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way toward practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Tensor Network Representation and Entanglement Spreading in Many-Body
Localized Systems: A Novel Approach [0.0]
A novel method has been devised to compute the Local Integrals of Motion for a one-dimensional many-body localized system.
A class of optimal unitary transformations is deduced in a tensor-network formalism to diagonalize the Hamiltonian of the specified system.
The method was assessed and found to be both fast and accurate to a good approximation.
arXiv Detail & Related papers (2023-12-13T14:28:45Z) - Efficient Simulation of Dynamics in Two-Dimensional Quantum Spin Systems
with Isometric Tensor Networks [0.0]
We investigate the computational power of the recently introduced class of isometric tensor network states (isoTNSs).
We discuss several technical details regarding the implementation of isoTNSs-based algorithms and compare different disentanglers.
We compute the dynamical spin structure factor of 2D quantum spin systems for two paradigmatic models.
arXiv Detail & Related papers (2021-12-15T19:00:05Z) - Boundary theories of critical matchgate tensor networks [59.433172590351234]
Key aspects of the AdS/CFT correspondence can be captured in terms of tensor network models on hyperbolic lattices.
For tensors fulfilling the matchgate constraint, these have previously been shown to produce disordered boundary states.
We show that these Hamiltonians exhibit multi-scale quasiperiodic symmetries captured by an analytical toy model.
arXiv Detail & Related papers (2021-10-06T18:00:03Z) - Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z) - Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We adapt a primal-dual framework from the graph-neural-network literature to triangle meshes.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights into our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z) - Numerical continuum tensor networks in two dimensions [0.0]
We numerically determine wave functions of interacting two-dimensional fermionic models in the continuum limit.
We use two different tensor network states: one based on the numerical continuum limit of fermionic projected entangled pair states obtained via a tensor network formulation of multi-grid.
We first benchmark our approach on the two-dimensional free Fermi gas then proceed to study the two-dimensional interacting Fermi gas with an attractive interaction in the unitary limit.
arXiv Detail & Related papers (2020-08-24T17:08:39Z) - Cylindrical Convolutional Networks for Joint Object Detection and
Viewpoint Estimation [76.21696417873311]
We introduce a learnable module, cylindrical convolutional networks (CCNs), that exploit cylindrical representation of a convolutional kernel defined in the 3D space.
CCNs extract a view-specific feature through a view-specific convolutional kernel to predict object category scores at each viewpoint.
Our experiments demonstrate the effectiveness of the cylindrical convolutional networks on joint object detection and viewpoint estimation.
arXiv Detail & Related papers (2020-03-25T10:24:58Z) - Efficient variational contraction of two-dimensional tensor networks
with a non-trivial unit cell [0.0]
Tensor network states provide an efficient class of states that faithfully capture strongly correlated quantum systems.
We generalize a recently proposed variational uniform matrix product state algorithm for capturing one-dimensional quantum lattices.
A key property of the algorithm is a computational effort that scales linearly rather than exponentially in the size of the unit cell.
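The linear-versus-exponential scaling claim can be made concrete with a toy matrix analogue of the boundary-MPS setting (a minimal sketch, not the variational algorithm from the paper; dimensions and names are illustrative): applying the column transfer matrices of the unit cell one at a time, a power-iteration sweep costs O(n D²) for an n-site cell, and the iteration converges to the fixed point of the full (generically non-Hermitian) cell operator.

```python
import numpy as np

rng = np.random.default_rng(1)
D, n = 4, 3  # boundary bond dimension and unit-cell width -- illustrative

# One column transfer matrix per site of the unit cell; positive entries
# mimic classical Boltzmann weights (Perron-Frobenius then guarantees a
# unique dominant eigenvector), but the matrices are not symmetric.
columns = [rng.random((D, D)) for _ in range(n)]

def cell_fixed_point(columns, iters=1000):
    """Power iteration for the dominant right eigenvector of the
    unit-cell operator T = T_{n-1} ... T_1 T_0, applied column by
    column so each sweep costs O(n D^2) -- linear, not exponential,
    in the unit-cell size n."""
    v = np.ones(len(columns[0]))
    for _ in range(iters):
        for T in columns:
            v = T @ v
        v /= np.linalg.norm(v)
    return v

v = cell_fixed_point(columns)
# The result is a fixed point of the full cell operator.
T_cell = columns[2] @ columns[1] @ columns[0]
lam = v @ T_cell @ v  # leading-eigenvalue estimate (v is normalized)
```

In the actual tensor-network algorithm the boundary is an MPS rather than a vector, but the same principle applies: sweeping through the unit cell site by site keeps the cost linear in the cell size.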
arXiv Detail & Related papers (2020-03-02T19:01:06Z) - Mean-field entanglement transitions in random tree tensor networks [0.0]
Entanglement phase transitions in quantum chaotic systems have emerged as a new class of critical points separating phases with different entanglement scaling.
We propose a mean-field theory of such transitions by studying the entanglement properties of random tree tensor networks.
arXiv Detail & Related papers (2020-03-02T19:00:19Z) - Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.