Krylov complexity in the IP matrix model II
- URL: http://arxiv.org/abs/2308.07567v1
- Date: Tue, 15 Aug 2023 04:25:55 GMT
- Title: Krylov complexity in the IP matrix model II
- Authors: Norihiro Iizuka, Mitsuhiro Nishida
- Abstract summary: We study how the Krylov complexity changes from a zero-temperature oscillation to an infinite-temperature exponential growth.
The IP model at any nonzero temperature shows exponential growth of the Krylov complexity even though the Green function decays by a power law in time.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We continue the analysis of the Krylov complexity in the IP matrix model. In
a previous paper, for a fundamental operator, it was shown that at zero
temperature, the Krylov complexity oscillates and does not grow, but in the
infinite temperature limit, the Krylov complexity grows exponentially in time
as $\sim \exp\left( {\mathcal{O}\left( {\sqrt{t}}\right)} \right)$. We study
how the Krylov complexity changes from a zero-temperature oscillation to an
infinite-temperature exponential growth. At low temperatures, the spectral
density is approximated by an infinite collection of Wigner semicircles. We
show that this infinite collection of branch cuts yields linear growth of the
Lanczos coefficients and hence exponential growth of the Krylov complexity.
Thus the IP model at any nonzero temperature exhibits exponential growth of the
Krylov complexity even though the Green function decays by a power law in time.
We also study the Lanczos coefficients and the Krylov complexity in the IOP
matrix model, taking into account the $1/N^2$ corrections. There, the Lanczos
coefficients are constant, and the Krylov complexity does not grow
exponentially, as expected.
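The mechanism described above, linearly growing Lanczos coefficients $b_n$ driving exponential growth of the Krylov complexity while constant $b_n$ (the IOP-like case) does not, can be illustrated numerically. The sketch below is not the IP model itself: it simply integrates the standard Krylov-chain equation $\partial_t \varphi_n = b_n \varphi_{n-1} - b_{n+1} \varphi_{n+1}$ with $\varphi_n(0) = \delta_{n,0}$ for the two illustrative profiles $b_n = n$ and $b_n = 1$, and compares $K(t) = \sum_n n\,\varphi_n^2$.

```python
import math
import numpy as np

def krylov_complexity(b, t_max, dt=1e-3):
    """Integrate the Krylov-chain equation
        d(phi_n)/dt = b_n phi_{n-1} - b_{n+1} phi_{n+1}
    with phi_n(0) = delta_{n,0}, truncated at N sites, and return
    K(t_max) = sum_n n phi_n^2.  b[n] holds b_n (b[0] is unused)."""
    N = len(b)
    phi = np.zeros(N)
    phi[0] = 1.0

    def rhs(p):
        d = np.zeros_like(p)
        d[1:] += b[1:] * p[:-1]   # hopping in:   b_n phi_{n-1}
        d[:-1] -= b[1:] * p[1:]   # hopping out: -b_{n+1} phi_{n+1}
        return d

    for _ in range(int(round(t_max / dt))):   # classic 4th-order Runge-Kutta
        k1 = rhs(phi)
        k2 = rhs(phi + 0.5 * dt * k1)
        k3 = rhs(phi + 0.5 * dt * k2)
        k4 = rhs(phi + dt * k3)
        phi = phi + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return float(np.arange(N) @ phi**2)

N = 200
n = np.arange(N, dtype=float)
K_linear = krylov_complexity(n, 2.0)                      # b_n = n
K_const = krylov_complexity((n > 0).astype(float), 2.0)   # b_n = 1

# For b_n = n the chain is exactly solvable: phi_n = sech(t) tanh(t)^n,
# so K(t) = sinh(t)^2, i.e. exponential growth.
print(K_linear, math.sinh(2.0)**2)
# For constant b_n the complexity only grows linearly in t.
print(K_const)
```

For $b_n = n$ the integration reproduces the closed-form $K(t) = \sinh^2 t$; for constant $b_n$ the wavefront spreads ballistically and $K(t)$ grows only linearly, matching the contrast between the IP and IOP results quoted above.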
Related papers
- Krylov complexity of fermion chain in double-scaled SYK and power spectrum perspective
We investigate Krylov complexity of the fermion chain operator which consists of multiple Majorana fermions in the double-scaled SYK (DSSYK) model with finite temperature.
Using the fact that Krylov complexity is computable from two-point functions, the analysis is performed in the limit where the two-point function becomes simple.
We confirm the exponential growth of Krylov complexity in the very low temperature regime.
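The statement that Krylov complexity is computable from two-point functions can be made concrete: the Taylor coefficients (moments) of the autocorrelation function determine the Lanczos coefficients $b_n$. A minimal sketch, using the Hankel-determinant formula $b_n^2 = D_{n+1}D_{n-1}/D_n^2$ for a symmetric spectral density; the moment profiles below are textbook examples, not the DSSYK result:

```python
import numpy as np

def lanczos_from_moments(even_moments, n_max):
    """Given the even moments mu_0, mu_2, ... of a symmetric spectral
    density (odd moments vanish), return b_1, ..., b_{n_max} via
    b_n^2 = D_{n+1} D_{n-1} / D_n^2, where D_k is the k x k Hankel
    determinant det(mu_{i+j})."""
    m = []
    for mu in even_moments:       # interleave the vanishing odd moments
        m.extend([float(mu), 0.0])

    def D(size):
        if size == 0:
            return 1.0
        H = np.array([[m[i + j] for j in range(size)] for i in range(size)])
        return float(np.linalg.det(H))

    dets = [D(size) for size in range(n_max + 2)]
    return [np.sqrt(dets[k + 1] * dets[k - 1] / dets[k] ** 2)
            for k in range(1, n_max + 1)]

# Wigner semicircle of radius 2: even moments are the Catalan numbers,
# and the Lanczos coefficients come out constant, b_n = 1.
print(lanczos_from_moments([1, 1, 2, 5, 14, 42, 132], 5))

# Gaussian autocorrelation C(t) = exp(-t^2/2): mu_{2n} = (2n-1)!!,
# giving growing coefficients b_n = sqrt(n).
print(lanczos_from_moments([1, 1, 3, 15, 105, 945, 10395], 5))
```

Hankel determinants become ill-conditioned quickly, so for many coefficients one uses the recursive Lanczos algorithm instead; for a handful of $b_n$ this direct formula suffices.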
arXiv Detail & Related papers (2024-07-18T08:47:05Z)
- KPZ scaling from the Krylov space
Recently, a superdiffusion exhibiting the Kardar-Parisi-Zhang scaling in late-time correlators and autocorrelators has been reported.
Inspired by these results, we explore the KPZ scaling in correlation functions using their realization in the Krylov operator basis.
arXiv Detail & Related papers (2024-06-04T20:57:59Z)
- Inflationary complexity of thermal state
We investigate the inflationary complexity of the two-mode squeezed state with thermal effects for single-field inflation, a modified dispersion relation, and a non-trivial sound speed.
Our investigations show that the evolution of the Krylov complexity is enhanced, with peaks appearing once thermal effects are taken into account.
Our derivations of the Krylov complexity and Krylov entropy nicely recover the closed-system case.
arXiv Detail & Related papers (2024-05-02T16:22:59Z)
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z)
- Inflationary Krylov complexity
We investigate the Krylov complexity of curvature perturbation for the modified dispersion relation in inflation.
Our analysis can be applied to most inflationary models.
arXiv Detail & Related papers (2024-01-17T16:17:51Z)
- Krylov complexity as an order parameter for deconfinement phase transitions at large $N$
Krylov complexity is an order parameter of confinement/deconfinement transitions in large $N$ quantum field theories.
We show that Krylov complexity reflects the confinement/deconfinement phase transitions through the continuity of the mass spectrum.
arXiv Detail & Related papers (2024-01-09T07:04:17Z)
- Krylov complexity in quantum field theory, and beyond
We study Krylov complexity in various models of quantum field theory.
We find that the exponential growth of Krylov complexity satisfies the conjectural inequality, which generalizes the Maldacena-Shenker-Stanford bound on chaos.
arXiv Detail & Related papers (2022-12-29T19:00:00Z)
- Krylov complexity in large-$q$ and double-scaled SYK model
We compute Krylov complexity and the higher Krylov cumulants in subleading order, along with the $t/q$ effects.
The Krylov complexity naturally describes the "size" of the distribution, while the higher cumulants encode richer information.
The growth of Krylov complexity appears to be "hyperfast", which is previously conjectured to be associated with scrambling in de Sitter space.
arXiv Detail & Related papers (2022-10-05T18:00:11Z)
- Out-of-time-order correlations and the fine structure of eigenstate thermalisation
Out-of-time-order correlators (OTOCs) have become established as a tool to characterise quantum information dynamics and thermalisation.
We show explicitly that the OTOC is indeed a precise tool to explore the fine details of the Eigenstate Thermalisation Hypothesis (ETH).
We provide an estimate of the finite-size scaling of $\omega_{\textrm{GOE}}$ for the general class of observables composed of sums of local operators in the infinite-temperature regime.
arXiv Detail & Related papers (2021-03-01T17:51:46Z)
- On Function Approximation in Reinforcement Learning: Optimism in the Face of Large State Spaces
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z)
- Measuring Model Complexity of Neural Networks with Curve Activation Functions
We propose the linear approximation neural network (LANN) to approximate a given deep model with curve activation functions.
We experimentally explore the training process of neural networks and detect overfitting.
We find that the $L^1$ and $L^2$ regularizations suppress the increase of model complexity.
arXiv Detail & Related papers (2020-06-16T07:38:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.