Reflected entropy in random tensor networks
- URL: http://arxiv.org/abs/2112.09122v3
- Date: Fri, 14 Oct 2022 18:09:40 GMT
- Title: Reflected entropy in random tensor networks
- Authors: Chris Akers, Thomas Faulkner, Simon Lin, Pratik Rath
- Abstract summary: In holographic theories, the reflected entropy has been shown to be dual to the area of the entanglement wedge cross section.
We analyze the important non-perturbative effects that smooth out the discontinuity in the reflected entropy across the Page phase transition.
By summing over all such effects, we obtain the reflected entanglement spectrum analytically, which agrees well with numerical studies.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In holographic theories, the reflected entropy has been shown to be dual to
the area of the entanglement wedge cross section. We study the same problem in
random tensor networks, demonstrating an equivalent duality. For a single random
tensor we analyze the important non-perturbative effects that smooth out the
discontinuity in the reflected entropy across the Page phase transition. By
summing over all such effects, we obtain the reflected entanglement spectrum
analytically, which agrees well with numerical studies. This motivates a
prescription for the analytic continuation required in computing the reflected
entropy and its Rényi generalization, which resolves an order-of-limits issue
previously identified in the literature. We apply this prescription to
hyperbolic tensor networks and find answers consistent with holographic
expectations. In particular, the random tensor network has the same non-trivial
tripartite entanglement structure expected from holographic states. We
furthermore show that the reflected Rényi spectrum is not flat, in sharp
contrast to the usual R\'enyi spectrum of these networks. We argue that the
various distinct contributions to the reflected entanglement spectrum can be
organized into approximate superselection sectors. We interpret this as
resulting from an effective description of the canonically purified state as a
superposition of distinct tensor network states. Each network is constructed by
doubling and gluing various candidate entanglement wedges of the original
network. The superselection sectors are labelled by the different
cross-sectional areas of these candidate entanglement wedges.
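For concreteness: the reflected entropy S_R(A:B) is the von Neumann entropy of the AA* subsystem in the canonical purification |√ρ_AB⟩ of ρ_AB, and the holographic duality discussed above states S_R(A:B) = 2 E_W(A:B), twice the entanglement wedge cross-sectional area. Below is a minimal NumPy sketch of this definition (the dimensions and test state are illustrative, not taken from the paper):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), natural log, via eigenvalues."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def reflected_entropy(rho_ab, dA, dB):
    """S_R(A:B) for a density matrix rho_ab on A (x) B with dims dA, dB.

    The canonical purification |sqrt(rho)> lives on AB (x) A*B*;
    S_R is the entropy of its AA* subsystem.
    """
    # Hermitian square root of rho via eigendecomposition.
    w, V = np.linalg.eigh(rho_ab)
    w = np.clip(w, 0.0, None)
    M = (V * np.sqrt(w)) @ V.conj().T       # M = sqrt(rho); rows ~ AB, cols ~ A*B*
    T = M.reshape(dA, dB, dA, dB)           # T[a, b, a*, b*]
    # Trace out B and B*: rho_{AA*}[(a, a*), (a', a'*)]
    rho_aa = np.einsum('abcd,ebfd->acef', T, T.conj()).reshape(dA * dA, dA * dA)
    return von_neumann_entropy(rho_aa)

# Sanity check: for a pure rho_AB, S_R(A:B) = 2 S(A).
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)          # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj())
print(reflected_entropy(rho, 2, 2))         # ≈ 1.386 = 2 ln 2
```

For a pure ρ_AB the canonical purification factorizes as |ψ⟩ ⊗ |ψ*⟩, so S_R(A:B) = 2 S(A); the Bell-state check above returns 2 ln 2, while a product state such as |00⟩ gives zero.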
Related papers
- Spectral complexity of deep neural networks [2.099922236065961]
We use the angular power spectrum of the limiting field to characterize the complexity of the network architecture.
On this basis, we classify neural networks as low-disorder, sparse, or high-disorder.
We show how this classification highlights a number of distinct features for standard activation functions, and in particular, sparsity properties of ReLU networks.
arXiv Detail & Related papers (2024-05-15T17:55:05Z) - HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z) - Exploring Invariant Representation for Visible-Infrared Person Re-Identification [77.06940947765406]
Cross-spectral person re-identification, which aims to associate identities to pedestrians across different spectra, faces a main challenge of the modality discrepancy.
In this paper, we address the problem at both the image level and the feature level in an end-to-end hybrid learning framework named the robust feature mining network (RFM).
Experiment results on two standard cross-spectral person re-identification datasets, RegDB and SYSU-MM01, have demonstrated state-of-the-art performance.
arXiv Detail & Related papers (2023-02-02T05:24:50Z) - Reflected entropy in random tensor networks II: a topological index from the canonical purification [0.0]
We show that the reflected entanglement spectrum is controlled by representation theory of the Temperley-Lieb algebra.
We provide a gravitational interpretation in terms of fixed-area, higher-genus multiboundary wormholes with genus $2k-1$ initial value slices.
arXiv Detail & Related papers (2022-10-26T20:03:29Z) - Random tensor networks with nontrivial links [1.9440833697222828]
We initiate a systematic study of the entanglement properties of random tensor networks.
We employ tools from free probability, random matrix theory, and one-shot quantum information theory.
We draw connections to previous work on split transfer protocols, entanglement negativity in random tensor networks, and Euclidean path integrals in quantum gravity.
arXiv Detail & Related papers (2022-06-21T15:49:29Z) - Entangled Residual Mappings [59.02488598557491]
We introduce entangled residual mappings to generalize the structure of the residual connections.
An entangled residual mapping replaces the identity skip connections with specialized entangled mappings.
We show that while entangled mappings can preserve the iterative refinement of features across various deep models, they influence the representation learning process in convolutional networks.
arXiv Detail & Related papers (2022-06-02T19:36:03Z) - Spectral embedding and the latent geometry of multipartite networks [67.56499794542228]
Many networks are multipartite, meaning their nodes can be divided into partitions, and nodes of the same partition are never connected.
This paper demonstrates that the node representations obtained via spectral embedding live near partition-specific low-dimensional subspaces of a higher-dimensional ambient space.
We propose a follow-on step after spectral embedding, to recover node representations in their intrinsic rather than ambient dimension.
arXiv Detail & Related papers (2022-02-08T15:52:03Z) - Mean-field Analysis of Piecewise Linear Solutions for Wide ReLU Networks [83.58049517083138]
We consider a two-layer ReLU network trained via gradient descent.
We show that SGD is biased towards a simple solution.
We also provide empirical evidence that knots at locations distinct from the data points might occur.
arXiv Detail & Related papers (2021-11-03T15:14:20Z) - Batch Normalization Orthogonalizes Representations in Deep Random Networks [3.109481609083199]
We establish a non-asymptotic characterization of the interplay between depth, width, and the orthogonality of deep representations.
We prove that the deviation of the representations from orthogonality rapidly decays with depth up to a term inversely proportional to the network width.
This result has two main implications: 1) Theoretically, as the depth grows, the distribution of the representation contracts to a Wasserstein-2 ball around an isotropic Gaussian distribution.
arXiv Detail & Related papers (2021-06-07T21:14:59Z) - Quantum particle across Grushin singularity [77.34726150561087]
We study the phenomenon of transmission across the singularity that separates the two half-cylinders.
All the local realisations of the free (Laplace-Beltrami) quantum Hamiltonian are examined as non-equivalent protocols of transmission/reflection.
This allows one to understand the distinguished status of the so-called 'bridging' transmission protocol previously identified in the literature.
arXiv Detail & Related papers (2020-11-27T12:53:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.