Growth of a renormalized operator as a probe of chaos
- URL: http://arxiv.org/abs/2110.15306v2
- Date: Sat, 12 Nov 2022 05:51:54 GMT
- Title: Growth of a renormalized operator as a probe of chaos
- Authors: Xing Huang and Binchao Zhang
- Abstract summary: We propose that the size of an operator evolved under holographic renormalization group flow shall grow linearly with the scale.
To test this conjecture, we study the operator growth in two different toy models.
- Score: 0.2741266294612776
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose that the size of an operator evolved under holographic
renormalization group flow shall grow linearly with the scale and interpret
this behavior as a manifestation of the saturation of the chaos bound. To test
this conjecture, we study the operator growth in two different toy models. The
first is a MERA-like tensor network built from a random unitary circuit with
the operator size defined using the integrated out-of-time-ordered correlator
(OTOC). The second model is an error-correcting code of perfect tensors, and
the operator size is computed using the number of single-site physical
operators that realize the logical operator. In both cases, we observe linear
growth.
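The notion of operator size used in the first toy model can be illustrated on a few qubits, independently of the paper's tensor-network construction. The following is a minimal sketch (our own toy setup, not the authors' MERA-like model): a Pauli operator is Heisenberg-evolved through a brickwork circuit of Haar-random two-qubit gates, and its size is measured as the average number of non-identity sites in its Pauli-string expansion, which coincides with the integrated-OTOC definition of size at infinite temperature.

```python
import itertools
from functools import reduce

import numpy as np

rng = np.random.default_rng(0)
n = 4  # number of qubits (kept tiny so the 4**n Pauli expansion is cheap)

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)
paulis = [I, X, Y, Z]

def kron_all(ops):
    return reduce(np.kron, ops)

def haar_unitary(d):
    # Haar-random unitary from the QR decomposition of a complex Gaussian matrix
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def brick_layer(offset):
    # one brickwork layer of random 2-qubit gates on a 4-qubit chain
    if offset == 0:
        return kron_all([haar_unitary(4), haar_unitary(4)])  # gates on (0,1),(2,3)
    return kron_all([I, haar_unitary(4), I])                 # gate on (1,2)

def operator_size(O, n):
    # size = sum_P |c_P|^2 * weight(P), with c_P = Tr(P O)/2^n and
    # weight(P) the number of non-identity tensor factors in the string P
    size = 0.0
    for idx in itertools.product(range(4), repeat=n):
        P = kron_all([paulis[i] for i in idx])
        c = np.trace(P @ O) / 2**n  # Pauli strings are Hermitian
        size += abs(c) ** 2 * sum(1 for i in idx if i != 0)
    return size

O = kron_all([X, I, I, I])  # initially a single-site operator, size 1
for t in range(4):
    print(f"depth {t}: size = {operator_size(O, n):.3f}")
    U = brick_layer(t % 2)
    O = U.conj().T @ O @ U  # Heisenberg evolution
```

In a single circuit realization the size typically grows toward its maximally scrambled value of roughly 3n/4; averaging over realizations smooths the curve.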
Related papers
- Operator Space Entangling Power of Quantum Dynamics and Local Operator Entanglement Growth in Dual-Unitary Circuits [0.0]
We introduce a measure for the ability of a unitary channel to generate operator entanglement, representing an operator-level generalization of the state-space entangling power.
For dual-unitary circuits, a combination of analytical and numerical investigations demonstrates that the average growth of local operator entanglement exhibits two distinct regimes.
arXiv Detail & Related papers (2024-06-14T17:40:53Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way toward practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Operator Learning Renormalization Group [0.8192907805418583]
We present a general framework for quantum many-body simulations called the operator learning renormalization group (OLRG)
Inspired by machine learning perspectives, OLRG is a generalization of Wilson's numerical renormalization group and White's density matrix renormalization group.
We implement two versions of the operator maps for classical and quantum simulations.
arXiv Detail & Related papers (2024-03-05T18:37:37Z) - Operator dynamics in Lindbladian SYK: a Krylov complexity perspective [0.0]
We analytically establish the linear growth of two sets of coefficients for any generic jump operators.
We find that the Krylov complexity saturates inversely with the dissipation strength, while the dissipative timescale grows logarithmically.
arXiv Detail & Related papers (2023-11-01T18:00:06Z) - Emergence of Grid-like Representations by Training Recurrent Networks with Conformal Normalization [48.99772993899573]
We study the emergence of hexagon grid patterns of grid cells based on a general recurrent neural network model.
We propose a simple yet general conformal normalization of the input velocity of the RNN.
We conduct extensive experiments to verify that conformal normalization is crucial for the emergence of hexagon grid patterns.
arXiv Detail & Related papers (2023-10-29T23:12:56Z) - Operator growth in 2d CFT [0.0]
We investigate and characterize the dynamics of operator growth in irrational two-dimensional conformal field theories.
We implement the Lanczos algorithm and evaluate the Krylov complexity under a unitary evolution protocol.
arXiv Detail & Related papers (2021-10-20T12:12:48Z) - Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z) - Do Generative Models Know Disentanglement? Contrastive Learning is All You Need [59.033559925639075]
We propose an unsupervised and model-agnostic method: Disentanglement via Contrast (DisCo) in the Variation Space.
DisCo achieves the state-of-the-art disentanglement given pretrained non-disentangled generative models, including GAN, VAE, and Flow.
arXiv Detail & Related papers (2021-02-21T08:01:20Z) - Relevant OTOC operators: footprints of the classical dynamics [68.8204255655161]
The OTOC-RE theorem relates the OTOCs summed over a complete basis of operators to the second Renyi entropy.
We show that the sum over a small set of relevant operators is enough to obtain a very good approximation for the entropy.
In turn, this provides an alternative natural indicator of complexity: the scaling of the number of relevant operators with time.
arXiv Detail & Related papers (2020-07-31T19:23:26Z) - The Generalized Lasso with Nonlinear Observations and Generative Priors [63.541900026673055]
We make the assumption of sub-Gaussian measurements, which is satisfied by a wide range of measurement models.
We show that our result can be extended to the uniform recovery guarantee under the assumption of a so-called local embedding property.
arXiv Detail & Related papers (2020-06-22T16:43:35Z) - On operator growth and emergent Poincar\'e symmetries [0.0]
We consider operator growth for generic large-N gauge theories at finite temperature.
The algebra of these modes allows for a simple analysis of the operators with which the initial operator mixes over time.
We show all these approaches have a natural formulation in terms of the Gelfand-Naimark-Segal (GNS) construction.
arXiv Detail & Related papers (2020-02-10T15:29:50Z)
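Several of the entries above (the Lindbladian SYK and 2d CFT papers in particular) compute Krylov complexity via the Lanczos algorithm applied to the Liouvillian. A minimal self-contained sketch, with a toy random-matrix Hamiltonian of our own choosing rather than any of the models studied in these papers: for a Hermitian operator at infinite temperature the diagonal Lanczos coefficients vanish, so the recursion only produces the off-diagonal coefficients b_n.

```python
import numpy as np

rng = np.random.default_rng(1)

def lanczos_coefficients(H, O, nmax):
    """Lanczos coefficients b_n for an operator O evolving under the
    Liouvillian L(A) = [H, A], using the infinite-temperature inner
    product (A|B) = Tr(A^dag B) / d. For Hermitian H and O the diagonal
    coefficients a_n vanish identically, so only b_n is tracked."""
    d = H.shape[0]
    inner = lambda A, B: np.trace(A.conj().T @ B) / d
    norm = lambda A: np.sqrt(inner(A, A).real)
    L = lambda A: H @ A - A @ H
    On = O / norm(O)              # O_0
    Om1 = np.zeros_like(On)       # O_{-1}
    b_prev = 0.0
    bs = []
    for _ in range(nmax):
        A = L(On) - b_prev * Om1  # three-term recurrence (a_n = 0)
        b = norm(A)
        if b < 1e-12:             # Krylov space exhausted
            break
        bs.append(b)
        Om1, On, b_prev = On, A / b, b
    return np.array(bs)

# toy "chaotic" Hamiltonian: a GOE random matrix; generic diagonal probe operator
d = 32
M = rng.standard_normal((d, d))
H = (M + M.T) / np.sqrt(2 * d)
O = np.diag(rng.standard_normal(d)).astype(complex)

b = lanczos_coefficients(H, O, 10)
print(b)
```

Note that this bare recursion loses orthogonality numerically at large nmax; serious computations add full reorthogonalization against the stored Krylov basis.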
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.