Holographic Tensor Networks as Tessellations of Geometry
- URL: http://arxiv.org/abs/2512.19452v2
- Date: Wed, 24 Dec 2025 13:46:49 GMT
- Title: Holographic Tensor Networks as Tessellations of Geometry
- Authors: Qiang Wen, Mingshuai Xu, Haocheng Zhong
- Abstract summary: Holographic tensor networks serve as toy models for the Anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence. We develop two holographic tensor network models: the factorized PEE tensor network and the random PEE tensor network. In both models we reproduce the exact Ryu-Takayanagi formula by showing that the minimal number of cuts along a surface in the network exactly computes the area of that surface.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Holographic tensor networks serve as toy models for the Anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence, capturing many of its essential features in a concrete manner. However, existing holographic tensor network models remain far from a complete theory of quantum gravity. A key obstacle is their discrete structure, which only approximates the semi-classical geometry of gravity in a qualitative sense. In \cite{Lin:2024dho}, it was shown that a network of partial-entanglement-entropy (PEE) threads, which are bulk geodesics with a specific density distribution, generates a perfect tessellation of AdS space. Moreover, such PEE-network tessellations can be constructed for more general geometries using the Crofton formula. In this paper, we assign a quantum state to each vertex in the PEE network and develop two holographic tensor network models: the factorized PEE tensor network, which takes the form of a tensor product of EPR pairs, and the random PEE tensor network. In both models we reproduce the exact Ryu-Takayanagi formula by showing that the minimal number of cuts along a homologous surface in the network exactly computes the area of this surface.
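For reference, the Ryu-Takayanagi formula that both models reproduce relates the entanglement entropy of a boundary region $A$ to the area of the minimal bulk surface $\gamma_A$ homologous to $A$:

```latex
S(A) = \min_{\gamma_A \sim A} \frac{\operatorname{Area}(\gamma_A)}{4 G_N}
```

In the tensor network models described above, the area on the right-hand side is computed discretely: the minimal number of PEE-network cuts along a surface homologous to $A$ plays the role of $\operatorname{Area}(\gamma_A)$.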
Related papers
- The Inductive Bias of Convolutional Neural Networks: Locality and Weight Sharing Reshape Implicit Regularization [57.37943479039033]
We study how architectural inductive bias reshapes the implicit regularization induced by the edge-of-stability phenomenon in gradient descent. We show that locality and weight sharing fundamentally change this picture.
arXiv Detail & Related papers (2026-03-05T04:50:51Z)
- Emergent statistical mechanics in holographic random tensor networks [41.99844472131922]
We show that RTN states equilibrate at large bond dimension and also in the scaling limit for three classes of geometries. We reproduce a holographic degree-of-freedom counting for the effective dimension of each system. These results demonstrate that RTN techniques can probe aspects of late-time dynamics of quantum many-body phases.
arXiv Detail & Related papers (2025-08-22T17:49:49Z)
- Angular $k$-uniformity and the Hyperinvariance of Holographic Codes [1.0878040851638]
Holographic quantum error-correcting codes have emerged as compelling toy models for exploring bulk-boundary duality in AdS/CFT. We introduce a geometric criterion called angular k-uniformity, which refines standard k-uniformity and its planar variants. This condition enables the systematic identification and construction of hyperinvariant holographic codes on regular hyperbolic honeycombs in arbitrary dimension.
arXiv Detail & Related papers (2025-06-06T23:08:13Z)
- From $SU(2)$ holonomies to holographic duality via tensor networks [0.0]
We construct a tensor network representation of the spin network states, which correspond to $SU(2)$ gauge-invariant discrete field theories.
The spin network states play a central role in the Loop Quantum Gravity (LQG) approach to Planck-scale physics.
arXiv Detail & Related papers (2024-10-24T14:59:35Z)
- Bulk-boundary correspondence from hyper-invariant tensor networks [0.0]
We introduce a tensor network designed to faithfully simulate the AdS/CFT correspondence, akin to the multi-scale entanglement renormalization ansatz (MERA).
This framework accurately reproduces the boundary conformal field theory's (CFT) two- and three-point correlation functions, while considering the image of any bulk operator.
arXiv Detail & Related papers (2024-09-03T16:24:18Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- Gradient Descent in Neural Networks as Sequential Learning in RKBS [63.011641517977644]
We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights.
We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
arXiv Detail & Related papers (2023-02-01T03:18:07Z)
- Holographic properties of superposed quantum geometries [0.0]
We study the holographic properties of a class of quantum geometry states characterized by a superposition of discrete geometric data.
This class includes spin networks, the kinematic states of lattice gauge theory and discrete quantum gravity.
arXiv Detail & Related papers (2022-07-15T17:37:47Z)
- Boundary theories of critical matchgate tensor networks [59.433172590351234]
Key aspects of the AdS/CFT correspondence can be captured in terms of tensor network models on hyperbolic lattices.
For tensors fulfilling the matchgate constraint, these have previously been shown to produce disordered boundary states.
We show that these Hamiltonians exhibit multi-scale quasiperiodic symmetries captured by an analytical toy model.
arXiv Detail & Related papers (2021-10-06T18:00:03Z)
- T-Basis: a Compact Representation for Neural Networks [89.86997385827055]
We introduce T-Basis, a concept for a compact representation of a set of tensors, each of an arbitrary shape, which is often seen in Neural Networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops.
arXiv Detail & Related papers (2020-07-13T19:03:22Z)
- Tensor network models of AdS/qCFT [69.6561021616688]
We introduce the notion of a quasiperiodic conformal field theory (qCFT).
We show that qCFT is best understood as belonging to a paradigm of discrete holography.
arXiv Detail & Related papers (2020-04-08T18:00:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.