Signatures of Chaos in Non-integrable Models of Quantum Field Theory
- URL: http://arxiv.org/abs/2012.08505v2
- Date: Thu, 1 Apr 2021 09:15:29 GMT
- Title: Signatures of Chaos in Non-integrable Models of Quantum Field Theory
- Authors: Miha Srdinšek, Tomaž Prosen, Spyros Sotiriadis
- Abstract summary: We study signatures of quantum chaos in (1+1)D Quantum Field Theory (QFT) models.
We focus on the double sine-Gordon, also studying the massive sine-Gordon and $\phi^4$ model.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study signatures of quantum chaos in (1+1)D Quantum Field Theory (QFT)
models. Our analysis is based on the method of Hamiltonian truncation, a
numerical approach for the construction of low-energy spectra and eigenstates
of QFTs that can be considered as perturbations of exactly solvable models. We
focus on the double sine-Gordon, also studying the massive sine-Gordon and
${\phi^4}$ model, all of which are non-integrable and can be studied by this
method with sufficiently high precision from small to intermediate perturbation
strength. We analyze the statistics of level spacings and of eigenvector
components, both of which are expected to follow Random Matrix Theory
predictions. While level spacing statistics are close to the Gaussian
Orthogonal Ensemble, as expected, the eigenvector components follow a
distribution markedly different from the expected Gaussian. Unlike in
the typical quantum chaos scenario, the transition of level spacing statistics
to chaotic behaviour takes place already in the perturbative regime. On the
other hand, the distribution of eigenvector components does not appear to
change or approach Gaussian behaviour, even for relatively large perturbations.
Moreover, our results suggest that these features are independent of the choice
of model and basis.
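To make the level-spacing diagnostic concrete, here is a minimal numerical sketch (not the authors' Hamiltonian-truncation code; the matrix size and the GOE/Poisson surrogates are arbitrary illustrative choices). The mean consecutive-gap ratio <r> requires no spectral unfolding and separates chaotic (GOE-like) from integrable (Poissonian) spectra:

```python
import numpy as np

def mean_gap_ratio(levels):
    """Mean consecutive-gap ratio <r> of a spectrum.

    r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1}), with s_n = E_{n+1} - E_n.
    Reference values: <r> ~ 0.5307 for GOE, <r> ~ 0.3863 for Poisson levels.
    """
    s = np.diff(np.sort(levels))
    s = s[s > 0]                      # drop exact degeneracies
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

rng = np.random.default_rng(0)

# GOE surrogate: eigenvalues of a random real symmetric matrix
A = rng.normal(size=(1000, 1000))
print(mean_gap_ratio(np.linalg.eigvalsh((A + A.T) / 2)))      # ~0.53 (chaotic)

# Poisson surrogate: levels with independent exponential spacings
print(mean_gap_ratio(np.cumsum(rng.exponential(size=1000))))  # ~0.39 (integrable)
```

Applied to truncated-QFT spectra, this is the kind of statistic whose drift toward the GOE value signals the transition reported in the abstract.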
Related papers
- Relaxation Fluctuations of Correlation Functions: Spin and Random Matrix Models [0.0]
We study the fluctuation average and variance of certain correlation functions as a diagnostic measure of quantum chaos.
We identify three distinct phases of these models: ergodic, fractal, and localized.
arXiv Detail & Related papers (2024-07-31T14:45:46Z)
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
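For orientation, here is a toy numpy sketch of the object being analyzed (not the paper's derivation; the dimensions, regularization strength lam, and noise level sigma are made-up values). It fits a ridge model in the high-dimensional regime d > n and measures the train/test gap:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, lam, sigma = 500, 1000, 1e-1, 0.5     # high-dimensional regime: d > n

# Linear teacher with isotropic Gaussian covariates and label noise
w_star = rng.normal(size=d) / np.sqrt(d)
X, X_test = rng.normal(size=(n, d)), rng.normal(size=(10 * n, d))
y = X @ w_star + sigma * rng.normal(size=n)
y_test = X_test @ w_star + sigma * rng.normal(size=10 * n)

# Ridge estimator: w = (X^T X + n*lam*I)^{-1} X^T y
w_hat = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

train_err = np.mean((X @ w_hat - y) ** 2)
test_err = np.mean((X_test @ w_hat - y_test) ** 2)
print(f"train MSE {train_err:.3f}  test MSE {test_err:.3f}")
```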
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Regularized Vector Quantization for Tokenized Image Synthesis [126.96880843754066]
Quantizing images into discrete representations has been a fundamental problem in unified generative modeling.
Deterministic quantization suffers from severe codebook collapse and misalignment with the inference stage, while stochastic quantization suffers from low codebook utilization and a perturbed reconstruction objective.
This paper presents a regularized vector quantization framework that mitigates the above issues effectively by applying regularization from two perspectives.
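As a rough illustration of the failure mode being regularized (a generic nearest-neighbour VQ step, not the paper's framework; codebook size and dimensions are arbitrary), codebook utilization can be measured directly:

```python
import numpy as np

rng = np.random.default_rng(2)
codebook = rng.normal(size=(64, 16))        # 64 code vectors of dimension 16
z = rng.normal(size=(1024, 16))             # encoder outputs to be quantized

# Deterministic VQ: assign each vector to its nearest code (Euclidean)
d2 = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)   # (1024, 64)
codes = d2.argmin(axis=1)
z_q = codebook[codes]

# Codebook utilization: fraction of codes actually used; low values
# are the codebook collapse that regularization aims to prevent
utilization = np.unique(codes).size / codebook.shape[0]
print(f"utilization {utilization:.2f}, mse {np.mean((z - z_q) ** 2):.3f}")
```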
arXiv Detail & Related papers (2023-03-11T15:20:54Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Leave-one-out Singular Subspace Perturbation Analysis for Spectral Clustering [7.342677574855651]
Singular subspace perturbation theory is of fundamental importance in probability and statistics.
We consider two arbitrary matrices where one is a leave-one-column-out submatrix of the other.
It is well-suited for mixture models and results in a sharper and finer statistical analysis than classical perturbation bounds such as Wedin's Theorem.
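A small numpy sketch of the quantity in question (illustrative only; the rank k, matrix sizes, and noise level are made-up): compare the rank-k left singular subspace of a matrix with that of its leave-one-column-out submatrix via the sin-theta distance:

```python
import numpy as np

rng = np.random.default_rng(3)
k = 5
signal = rng.normal(size=(200, k)) @ rng.normal(size=(k, 100))
M = signal + 0.1 * rng.normal(size=(200, 100))   # low rank plus noise
M_loo = np.delete(M, 0, axis=1)                  # leave the first column out

U = np.linalg.svd(M, full_matrices=False)[0][:, :k]
U_loo = np.linalg.svd(M_loo, full_matrices=False)[0][:, :k]

# sin of the largest principal angle between the two rank-k subspaces
cosines = np.linalg.svd(U.T @ U_loo, compute_uv=False)
print("sin(theta_max):", np.sqrt(max(0.0, 1.0 - cosines.min() ** 2)))
```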
arXiv Detail & Related papers (2022-05-30T05:07:09Z)
- A Random Matrix Perspective on Random Tensors [40.89521598604993]
We study the spectra of random matrices arising from contractions of a given random tensor.
Our technique yields a hitherto unknown characterization of the local maximum of the ML problem.
Our approach is versatile and can be extended to other models, such as asymmetric, non-Gaussian and higher-order ones.
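The basic object can be sketched in a few lines (illustrative only; the Gaussian tensor, its scaling, and the symmetrization are assumptions, not the paper's exact setting): contracting an order-3 random tensor with a unit vector yields a random matrix whose spectrum can be examined directly:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
T = rng.normal(size=(n, n, n)) / np.sqrt(n)   # order-3 Gaussian tensor

# Contract one index with a unit vector to get an n x n random matrix
x = rng.normal(size=n)
x /= np.linalg.norm(x)
M = np.einsum('ijk,k->ij', T, x)

# Spectrum of the symmetrized contraction: a semicircle-type bulk
eigs = np.linalg.eigvalsh((M + M.T) / 2)
print(f"spectral range [{eigs.min():.2f}, {eigs.max():.2f}]")
```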
arXiv Detail & Related papers (2021-08-02T10:42:22Z)
- Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
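A minimal sketch of this embedding on a two-block stochastic block model (block sizes and edge probabilities are made-up values; this is not the paper's estimator): the sign of the second eigenvector of the random walk Laplacian L_rw = I - D^{-1}A recovers the two communities:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p_in, p_out = 200, 0.10, 0.02
labels = np.repeat([0, 1], n // 2)

# Sample a two-block stochastic block model adjacency matrix
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T

deg = A.sum(1)
deg[deg == 0] = 1.0                   # guard against isolated nodes

# L_rw = I - D^{-1} A is similar to the symmetric normalized Laplacian,
# so compute eigenvectors of L_sym with eigh and back-transform
L_sym = np.eye(n) - A / np.sqrt(deg[:, None] * deg[None, :])
vals, vecs = np.linalg.eigh(L_sym)
embedding = vecs[:, 1] / np.sqrt(deg)  # second eigenvector of L_rw

# Community means should have opposite signs
print(embedding[labels == 0].mean() * embedding[labels == 1].mean() < 0)
```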
arXiv Detail & Related papers (2021-05-03T16:36:27Z)
- Long-range level correlations in quantum systems with finite Hilbert space dimension [0.0]
We study the spectral statistics of quantum systems with finite Hilbert spaces.
We derive a theorem showing that eigenlevels in such systems cannot be globally uncorrelated.
arXiv Detail & Related papers (2020-10-13T15:49:15Z)
- Non-asymptotic Optimal Prediction Error for Growing-dimensional Partially Functional Linear Models [0.951828574518325]
We show the rate-optimal upper and lower bounds of the prediction error.
An exact upper bound for the excess prediction risk is shown in a non-asymptotic form.
We derive the non-asymptotic minimax lower bound under the regularity assumption of the Kullback-Leibler divergence of the models.
arXiv Detail & Related papers (2020-09-10T08:49:32Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success remains unclear.
Modelling such algorithms as discrete random recurrence relations, we show that multiplicative noise, as it commonly arises from minibatch variance, produces heavy tails in the parameters.
A detailed analysis is conducted in which we describe how key factors, including step size and data properties, shape this behaviour, with similar results exhibited by state-of-the-art neural network models.
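The mechanism can be reproduced with a toy Kesten-type recurrence (a generic illustration, not the paper's model; the noise scales are arbitrary, chosen so the stationary variance is finite but the fourth moment diverges):

```python
import numpy as np

rng = np.random.default_rng(6)
steps, chains = 5000, 2000
x = np.zeros(chains)

# x <- a*x + b with multiplicative noise in a (e.g. minibatch variance):
# light-tailed inputs, heavy-tailed stationary output
for _ in range(steps):
    a = 0.9 + 0.3 * rng.normal(size=chains)   # E[a^2] < 1 < E[a^4]
    b = rng.normal(size=chains)
    x = a * x + b

z = (x - x.mean()) / x.std()
print("excess kurtosis:", (z ** 4).mean() - 3.0)   # >> 0, unlike a Gaussian
```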
arXiv Detail & Related papers (2020-06-11T09:58:01Z)