Petz reconstruction in random tensor networks
- URL: http://arxiv.org/abs/2006.12601v2
- Date: Wed, 8 Jul 2020 05:51:35 GMT
- Title: Petz reconstruction in random tensor networks
- Authors: Hewei Frederic Jia, Mukund Rangamani
- Abstract summary: We show how the Petz reconstruction map works to obtain bulk operators from the boundary data by exploiting the replica trick.
We also take the opportunity to comment on the differences between coarse-graining and random projections.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We illustrate the ideas of bulk reconstruction in the context of random
tensor network toy models of holography. Specifically, we demonstrate how the
Petz reconstruction map works to obtain bulk operators from the boundary data
by exploiting the replica trick. We also take the opportunity to comment on the
differences between coarse-graining and random projections.
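For reference, the Petz recovery map used in this construction has the standard form: for a quantum channel $\mathcal{N}$ and a full-rank reference state $\sigma$,
$$
\mathcal{P}_{\sigma,\mathcal{N}}(X) = \sigma^{1/2}\,\mathcal{N}^{\dagger}\!\left(\mathcal{N}(\sigma)^{-1/2}\, X\, \mathcal{N}(\sigma)^{-1/2}\right)\sigma^{1/2},
$$
where $\mathcal{N}^{\dagger}$ denotes the adjoint channel. It satisfies $\mathcal{P}_{\sigma,\mathcal{N}}(\mathcal{N}(\sigma)) = \sigma$ exactly, which is the sense in which it undoes $\mathcal{N}$ on the reference state.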
Related papers
- Learning Gaussian Representation for Eye Fixation Prediction [54.88001757991433]
Existing eye fixation prediction methods learn a mapping from input images to dense fixation maps generated from raw fixation points.
We introduce Gaussian Representation for eye fixation modeling.
We design our framework upon some lightweight backbones to achieve real-time fixation prediction.
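As a rough illustration of the Gaussian modeling idea, a dense fixation map can be rendered as a sum of isotropic Gaussians centered at the raw fixation points. This is a minimal sketch, not the paper's architecture; the function name, isotropic covariance, and max-normalization are assumptions.

```python
import numpy as np

def render_fixation_map(points, height, width, sigma=15.0):
    """Render a dense fixation map as a sum of isotropic Gaussians.

    points: iterable of (x, y) raw fixation coordinates (pixels).
    sigma:  Gaussian standard deviation in pixels (assumed isotropic).
    """
    ys, xs = np.mgrid[0:height, 0:width]
    fmap = np.zeros((height, width))
    for px, py in points:
        fmap += np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2.0 * sigma**2))
    if fmap.max() > 0:
        fmap /= fmap.max()  # scale to [0, 1] like a saliency map
    return fmap

# Example: two fixations on a 240x320 image.
saliency = render_fixation_map([(80, 60), (200, 150)], height=240, width=320)
```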
arXiv Detail & Related papers (2024-03-21T20:28:22Z)
- Explicit reconstruction of the entanglement wedge via the Petz map [0.0]
We revisit entanglement wedge reconstruction in AdS/CFT using the Petz recovery channel.
In the case of a spherical region on the boundary, we show that the Petz map reproduces the AdS-Rindler HKLL reconstruction.
arXiv Detail & Related papers (2022-10-02T18:56:56Z)
- Bayesian Recurrent Units and the Forward-Backward Algorithm [91.39701446828144]
Using Bayes's theorem, we derive a unit-wise recurrence as well as a backward recursion similar to the forward-backward algorithm.
The resulting Bayesian recurrent units can be integrated as recurrent neural networks within deep learning frameworks.
Experiments on speech recognition indicate that adding the derived units at the end of state-of-the-art recurrent architectures can improve the performance at a very low cost in terms of trainable parameters.
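For context, the forward-backward recursion referred to here is the classical HMM smoothing algorithm; the sketch below is a minimal NumPy version in generic HMM notation, not the paper's Bayesian recurrent unit.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state marginals p(s_t | o_{1:T}) for a discrete HMM.

    pi:  (S,) initial state distribution
    A:   (S, S) transitions, A[i, j] = p(s_t = j | s_{t-1} = i)
    B:   (S, O) emissions,   B[j, k] = p(o_t = k | s_t = j)
    obs: length-T sequence of integer observations
    """
    T, S = len(obs), len(pi)
    alpha, beta = np.zeros((T, S)), np.ones((T, S))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()  # rescale at each step for numerical stability
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Example: two hidden states, two observation symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.9, 0.1], [0.3, 0.7]])
posteriors = forward_backward(pi, A, B, obs=[0, 1, 0])
```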
arXiv Detail & Related papers (2022-07-21T14:00:52Z)
- Entangled Residual Mappings [59.02488598557491]
We introduce entangled residual mappings to generalize the structure of the residual connections.
An entangled residual mapping replaces the identity skip connections with specialized entangled mappings.
We show that while entangled mappings can preserve the iterative refinement of features across various deep models, they influence the representation learning process in convolutional networks.
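A minimal sketch of the general idea of swapping the identity skip for a non-trivial mapping follows; the random orthogonal skip map used here is an illustrative assumption, not the paper's specific entangled construction.

```python
import torch
import torch.nn as nn

class NonIdentityResidualBlock(nn.Module):
    """Residual block y = M(x) + F(x), where the skip path M is a fixed
    mixing map rather than the identity. M is chosen here as a random
    orthogonal matrix purely for illustration."""

    def __init__(self, dim):
        super().__init__()
        self.body = nn.Sequential(  # F(x): the usual residual branch
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        skip = torch.linalg.qr(torch.randn(dim, dim)).Q  # orthogonal mixing
        self.register_buffer("skip", skip)

    def forward(self, x):
        return x @ self.skip.T + self.body(x)

block = NonIdentityResidualBlock(16)
y = block(torch.randn(4, 16))  # same shape as the input, (4, 16)
```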
arXiv Detail & Related papers (2022-06-02T19:36:03Z)
- Nonperturbative gravity corrections to bulk reconstruction [0.0]
We introduce a new framework for understanding nonperturbative gravitational aspects of bulk reconstruction with a finite or infinite-dimensional boundary Hilbert space.
We demonstrate that local operators in the reconstruction wedge of a given boundary region can be recovered in a state-independent way for arbitrarily large code subspaces.
arXiv Detail & Related papers (2021-12-23T18:59:59Z)
- Approximating Invertible Maps by Recovery Channels: Optimality and an Application to Non-Markovian Dynamics [68.8204255655161]
We investigate the problem of reversing quantum dynamics, specifically via optimal Petz recovery maps.
We focus on typical decoherence channels, such as dephasing, depolarizing and amplitude damping.
We extend this idea to explore the use of recovery maps as an approximation of inverse maps, and apply it in the context of non-Markovian dynamics.
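To make the recovery-map construction concrete, the sketch below builds the Petz map for a single-qubit dephasing channel and checks that it recovers the reference state exactly; the channel strength and reference state are illustrative assumptions.

```python
import numpy as np

def mpow(M, t):
    """Fractional power of a positive-definite Hermitian matrix."""
    w, V = np.linalg.eigh(M)
    return (V * w**t) @ V.conj().T

def channel(rho, kraus):
    """N(rho) = sum_k K rho K†."""
    return sum(K @ rho @ K.conj().T for K in kraus)

def adjoint(X, kraus):
    """N†(X) = sum_k K† X K."""
    return sum(K.conj().T @ X @ K for K in kraus)

def petz(X, sigma, kraus):
    """P(X) = sigma^1/2 N†(N(sigma)^-1/2 X N(sigma)^-1/2) sigma^1/2."""
    n_inv_half = mpow(channel(sigma, kraus), -0.5)
    inner = adjoint(n_inv_half @ X @ n_inv_half, kraus)
    return mpow(sigma, 0.5) @ inner @ mpow(sigma, 0.5)

# Single-qubit dephasing channel of strength p (illustrative choice).
p = 0.3
kraus = [np.sqrt(1 - p) * np.eye(2), np.sqrt(p) * np.diag([1.0, -1.0])]

# Full-rank reference state; the Petz map inverts the channel on it exactly.
sigma = np.array([[0.7, 0.2], [0.2, 0.3]])
assert np.allclose(petz(channel(sigma, kraus), sigma, kraus), sigma)
```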
arXiv Detail & Related papers (2021-11-04T16:16:45Z)
- Reconstructing group wavelet transform from feature maps with a reproducing kernel iteration [0.0]
We consider the problem of reconstructing an image that is downsampled in the space of its $SE(2)$ wavelet transform.
We prove that, whenever the problem is solvable, the reconstruction can be obtained by an elementary projection.
arXiv Detail & Related papers (2021-10-01T18:15:18Z)
- Temporally-Coherent Surface Reconstruction via Metric-Consistent Atlases [131.50372468579067]
We represent the reconstructed surface as an atlas, using a neural network.
We empirically show that our method exceeds the state of the art in the accuracy of both unsupervised correspondences and surface reconstruction.
arXiv Detail & Related papers (2021-04-14T16:21:22Z)
- Invertible Neural Networks versus MCMC for Posterior Reconstruction in Grazing Incidence X-Ray Fluorescence [0.3232625980782302]
We propose to reconstruct the posterior parameter distribution given a noisy measurement generated by the forward model by an appropriately learned invertible neural network.
We demonstrate by numerical comparisons that our method can compete with established Markov Chain Monte Carlo approaches, while being more efficient and flexible in applications.
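Invertible networks of this kind are typically assembled from coupling blocks; below is a minimal affine coupling layer in PyTorch, a generic sketch rather than the specific architecture used in the paper.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """y1 = x1;  y2 = x2 * exp(s(x1)) + t(x1), invertible in closed form."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        half = dim // 2
        self.net = nn.Sequential(nn.Linear(half, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - half)))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        s, t = self.net(x1).chunk(2, dim=-1)
        return torch.cat([x1, x2 * torch.exp(s) + t], dim=-1)

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        s, t = self.net(y1).chunk(2, dim=-1)
        return torch.cat([y1, (y2 - t) * torch.exp(-s)], dim=-1)

layer = AffineCoupling(8)
x = torch.randn(4, 8)
assert torch.allclose(layer.inverse(layer(x)), x, atol=1e-5)
```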
arXiv Detail & Related papers (2021-02-05T14:17:59Z)
- CR-Fill: Generative Image Inpainting with Auxiliary Contextual Reconstruction [143.7271816543372]
We propose to teach the patch-borrowing behavior of attention-based inpainting models to an attention-free generator by jointly training it with an auxiliary contextual reconstruction task.
The auxiliary branch can be seen as a learnable loss function, in which query-reference feature similarity and a reference-based reconstructor are jointly optimized with the inpainting generator.
Experimental results demonstrate that the proposed inpainting model compares favourably against the state-of-the-art in terms of quantitative and visual performance.
arXiv Detail & Related papers (2020-11-25T15:45:12Z)