Emergent Einstein Equation in p-adic CFT Tensor Networks
- URL: http://arxiv.org/abs/2102.12022v2
- Date: Mon, 8 Mar 2021 13:25:29 GMT
- Title: Emergent Einstein Equation in p-adic CFT Tensor Networks
- Authors: Lin Chen, Xirong Liu and Ling-Yan Hung
- Abstract summary: We show that a deformed Bruhat-Tits tree satisfies an emergent graph Einstein equation in a unique way.
This could provide new insights into the understanding of gravitational dynamics potentially encoded in more general tensor networks.
- Score: 6.127256542161883
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We take the tensor network describing explicit p-adic CFT partition functions
proposed in [1], and consider boundary conditions of the network describing a
deformed Bruhat-Tits (BT) tree geometry. We demonstrate that this geometry
satisfies an emergent graph Einstein equation in a unique way that is
consistent with the bulk effective matter action encoding the same correlation
function as the tensor network, at least in the perturbative limit, order by
order away from the pure BT tree. Moreover, the (perturbative) definition of
the graph curvature in the Mathematics literature naturally emerges from the
consistency requirements of the emergent Einstein equation. This could provide
new insights into the understanding of gravitational dynamics potentially
encoded in more general tensor networks.
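For orientation, the "graph curvature in the Mathematics literature" invoked above is plausibly of the Ollivier / Lin-Lu-Yau type; the following is an illustrative sketch of that standard definition, not necessarily the paper's exact construction:

```latex
% Ollivier-type curvature of an edge (x, y) in a graph (illustrative):
\[
  \kappa(x,y) \;=\; 1 - \frac{W_1(m_x, m_y)}{d(x,y)},
\]
% where m_x is a (lazy) random-walk probability measure centered at
% vertex x, W_1 is the Wasserstein (optimal-transport) distance between
% such measures, and d(x,y) is the graph distance.
%
% A graph Einstein equation then schematically equates a curvature
% built from kappa to an effective matter stress tensor on the
% deformed Bruhat-Tits tree,
\[
  \mathcal{R}_{xy}[g] \;\sim\; T_{xy}[\phi],
\]
% order by order in the deformation away from the pure tree.
```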
Related papers
- Graph theory and tunable slow dynamics in quantum East Hamiltonians [0.0]
We show how graph theory concepts can provide insight into the origin of slow dynamics in systems with kinetic constraints.
Slow dynamics is related to the presence of strong hierarchies between nodes on the Fock-space graph.
We numerically demonstrate how these detunings affect the degree of non-ergodicity on finite systems.
arXiv Detail & Related papers (2025-04-04T14:08:18Z) - Bulk-boundary correspondence from hyper-invariant tensor networks [0.0]
We introduce a tensor network designed to faithfully simulate the AdS/CFT correspondence, akin to the multi-scale entanglement renormalization ansatz (MERA).
This framework accurately reproduces the boundary conformal field theory's (CFT) two- and three-point correlation functions, while considering the image of any bulk operator.
arXiv Detail & Related papers (2024-09-03T16:24:18Z) - Theory on variational high-dimensional tensor networks [2.0307382542339485]
We investigate the emergent statistical properties of random high-dimensional tensor-network states and the trainability of variational tensor networks.
We prove that variational high-dimensional tensor networks suffer from barren plateaus for global loss functions.
Our results pave a way for their future theoretical studies and practical applications.
arXiv Detail & Related papers (2023-03-30T15:26:30Z) - Holographic properties of superposed quantum geometries [0.0]
We study the holographic properties of a class of quantum geometry states characterized by a superposition of discrete geometric data.
This class includes spin networks, the kinematic states of lattice gauge theory and discrete quantum gravity.
arXiv Detail & Related papers (2022-07-15T17:37:47Z) - Hyper-optimized approximate contraction of tensor networks with arbitrary geometry [0.0]
We describe how to approximate tensor network contraction through bond compression on arbitrary graphs.
In particular, we introduce a hyper-optimization over the compression and contraction strategy itself to minimize error and cost.
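The bond compression underlying such approximate contraction can be sketched with a minimal, self-contained example: contract two tensors across their shared bond, then truncate that bond via SVD. This is a generic illustration of the technique, not the paper's hyper-optimized implementation, and all names below are hypothetical.

```python
import numpy as np

def compress_bond(A, B, chi):
    """Truncate the bond shared by A (..., D) and B (D, ...) to
    dimension chi, using an SVD of their contraction."""
    a_shape, b_shape = A.shape[:-1], B.shape[1:]
    # Contract across the shared bond, viewed as a matrix product.
    M = A.reshape(-1, A.shape[-1]) @ B.reshape(B.shape[0], -1)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    k = min(chi, s.size)  # keep at most chi singular values
    sq = np.sqrt(s[:k])
    # Split the singular values symmetrically between the two tensors.
    A_new = (U[:, :k] * sq).reshape(*a_shape, k)
    B_new = (sq[:, None] * Vt[:k]).reshape(k, *b_shape)
    return A_new, B_new
```

When chi is at least the rank of the contraction, the compressed pair reproduces the original contraction exactly; smaller chi trades accuracy for cost.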
arXiv Detail & Related papers (2022-06-14T17:59:16Z) - Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents the forward kinematics information (positions and velocities) of a structured object in terms of generalized coordinates.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z) - Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z) - Boundary theories of critical matchgate tensor networks [59.433172590351234]
Key aspects of the AdS/CFT correspondence can be captured in terms of tensor network models on hyperbolic lattices.
For tensors fulfilling the matchgate constraint, these have previously been shown to produce disordered boundary states.
We show that these Hamiltonians exhibit multi-scale quasiperiodic symmetries captured by an analytical toy model.
arXiv Detail & Related papers (2021-10-06T18:00:03Z) - Bending the Bruhat-Tits Tree I: Tensor Network and Emergent Einstein Equations [6.127256542161883]
We show how a p-adic CFT encodes geometric information of a dual geometry even as we deform the CFT away from the fixed point.
This is perhaps a first quantitative demonstration that a concrete Einstein equation can be extracted directly from the tensor network.
arXiv Detail & Related papers (2021-02-24T02:03:54Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z) - Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to the state of the art solvers.
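The composition of integral operators and nonlinearities described above can be sketched as a single graph kernel layer: each node's features are updated by a pointwise linear map plus an average of kernel-weighted neighbor features, followed by an activation. This is a schematic illustration under assumed shapes, not the authors' actual architecture.

```python
import numpy as np

def graph_kernel_layer(v, edges, kappa, W):
    """One layer: v'(x) = relu(W v(x) + mean_{y in N(x)} kappa(x,y) v(y)).
    v: (n, d) node features; edges: directed (x, y) pairs;
    kappa(x, y): returns a (d, d) kernel matrix for the edge."""
    n, d = v.shape
    agg = np.zeros_like(v)
    deg = np.zeros(n)
    for x, y in edges:
        agg[x] += kappa(x, y) @ v[y]  # kernel-weighted message from y
        deg[x] += 1
    agg /= np.maximum(deg, 1)[:, None]  # mean over neighbors (Monte Carlo
                                        # estimate of the kernel integral)
    return np.maximum(v @ W.T + agg, 0.0)  # ReLU activation
```

Stacking several such layers, each with its own learned kernel, approximates the nonlinear operator mapping.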
arXiv Detail & Related papers (2020-03-07T01:56:20Z) - Understanding Graph Neural Networks with Generalized Geometric Scattering Transforms [67.88675386638043]
The scattering transform is a multilayered wavelet-based deep learning architecture that acts as a model of convolutional neural networks.
We introduce windowed and non-windowed geometric scattering transforms for graphs based upon a very general class of asymmetric wavelets.
We show that these asymmetric graph scattering transforms have many of the same theoretical guarantees as their symmetric counterparts.
arXiv Detail & Related papers (2019-11-14T17:23:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and accepts no responsibility for any consequences arising from its use.