A class of 2 X 2 correlated random-matrix models with Brody spacing distribution
- URL: http://arxiv.org/abs/2308.01514v3
- Date: Fri, 6 Sep 2024 06:19:40 GMT
- Title: A class of 2 X 2 correlated random-matrix models with Brody spacing distribution
- Authors: Jamal Sakhr
- Abstract summary: A class of 2 X 2 random-matrix models is introduced for which the Brody distribution is the eigenvalue spacing distribution.
The random matrices introduced here differ from those of the Gaussian Orthogonal Ensemble (GOE) in three important ways.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A class of 2 X 2 random-matrix models is introduced for which the Brody distribution is the exact eigenvalue spacing distribution. The matrix elements consist of constrained sums of an exponential random variable raised to various powers that depend on the Brody parameter. The random matrices introduced here differ from those of the Gaussian Orthogonal Ensemble (GOE) in three important ways: the matrix elements are neither independent and identically distributed (i.e., not IID) nor Gaussian-distributed, and the matrices are not necessarily real and/or symmetric. The first two features arise from dropping the classical independence assumption, and the third feature stems from dropping the quantum-mechanical conditions that are imposed in the construction of the GOE. In particular, the hermiticity condition, which in the present model is a sufficient but not necessary condition for the eigenvalues to be real, is not imposed. Consequently, complex non-Hermitian 2 X 2 random matrices with real or complex eigenvalues can also have spacing distributions that are intermediate between those of the Poisson and Wigner classes. Numerical examples are provided for different types of random matrices, including complex-symmetric matrices with real or complex-conjugate eigenvalues.
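The abstract's target distribution can be made concrete. A minimal sketch, using only the standard (unit-mean) form of the Brody density P_q(s) = (q+1) b s^q exp(-b s^{q+1}) with b = Γ((q+2)/(q+1))^{q+1} and its closed-form inverse CDF; this illustrates the distribution the paper's matrix construction reproduces, not the matrix construction itself (the function names below are illustrative, not from the paper):

```python
# Sketch of the Brody spacing distribution (unit mean spacing),
# which interpolates between Poisson (q = 0) and Wigner (q = 1).
# This is the standard Brody form, not the paper's matrix model.
import math
import random

def brody_pdf(s: float, q: float) -> float:
    """Brody density P_q(s) = (q+1) b s^q exp(-b s^{q+1}),
    with b = Gamma((q+2)/(q+1))^{q+1} fixing the mean spacing to 1."""
    b = math.gamma((q + 2) / (q + 1)) ** (q + 1)
    return (q + 1) * b * s**q * math.exp(-b * s ** (q + 1))

def brody_sample(q: float, n: int, seed: int = 0) -> list[float]:
    """Draw n spacings via the closed-form inverse CDF
    F^{-1}(u) = (-ln(1 - u) / b)^{1/(q+1)}."""
    rng = random.Random(seed)
    b = math.gamma((q + 2) / (q + 1)) ** (q + 1)
    return [(-math.log(1.0 - rng.random()) / b) ** (1.0 / (q + 1))
            for _ in range(n)]
```

At q = 0 the density reduces to the Poisson (exponential) law exp(-s); at q = 1 it becomes the Wigner surmise (π/2) s exp(-π s²/4), the exact spacing law for the 2 X 2 GOE.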
Related papers
- Efficient conversion from fermionic Gaussian states to matrix product states [48.225436651971805]
We propose a highly efficient algorithm that converts fermionic Gaussian states to matrix product states.
It can be formulated for finite-size systems without translation invariance, but becomes particularly appealing when applied to infinite systems.
The potential of our method is demonstrated by numerical calculations in two chiral spin liquids.
arXiv Detail & Related papers (2024-08-02T10:15:26Z) - Entrywise error bounds for low-rank approximations of kernel matrices [55.524284152242096]
We derive entrywise error bounds for low-rank approximations of kernel matrices obtained using the truncated eigen-decomposition.
A key technical innovation is a delocalisation result for the eigenvectors of the kernel matrix corresponding to small eigenvalues.
We validate our theory with an empirical study of a collection of synthetic and real-world datasets.
arXiv Detail & Related papers (2024-05-23T12:26:25Z) - A Result About the Classification of Quantum Covariance Matrices Based on Their Eigenspectra [0.0]
We find a non-trivial class of eigenspectra with the property that the set of quantum covariance matrices corresponding to any eigenspectrum in this class are related by symplectic transformations.
We show that all non-degenerate eigenspectra with this property must belong to this class, and that the set of such eigenspectra coincides with the class of non-degenerate eigenspectra.
arXiv Detail & Related papers (2023-08-07T09:40:09Z) - $h(1) \oplus su(2)$ vector algebra eigenstates with eigenvalues in the matrix domain [0.0]
We find a subset of generalized vector coherent states in the matrix domain.
For a special choice of the matrix eigenvalue parameters we found the so-called vector coherent states with matrices associated to the Heisenberg-Weyl group.
arXiv Detail & Related papers (2023-01-25T18:10:01Z) - The Ordered Matrix Dirichlet for Modeling Ordinal Dynamics [54.96229007229786]
We propose the Ordered Matrix Dirichlet (OMD) to map latent states to observed action types.
Models built on the OMD recover interpretable latent states and show superior forecasting performance in few-shot settings.
arXiv Detail & Related papers (2022-12-08T08:04:26Z) - When Random Tensors meet Random Matrices [50.568841545067144]
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric *block-wise* random matrix.
arXiv Detail & Related papers (2021-12-23T04:05:01Z) - Test Set Sizing Via Random Matrix Theory [91.3755431537592]
This paper uses techniques from Random Matrix Theory to find the ideal training-testing data split for a simple linear regression.
It defines "ideal" as satisfying the integrity metric, i.e., the empirical model error equals the actual measurement noise.
This paper is the first to solve for the training and test size for any model in a way that is truly optimal.
arXiv Detail & Related papers (2021-12-11T13:18:33Z) - Single-particle eigenstate thermalization in quantum-chaotic quadratic Hamiltonians [4.557919434849493]
We study the matrix elements of local and nonlocal operators in the single-particle eigenstates of two paradigmatic quantum-chaotic quadratic Hamiltonians.
We show that the diagonal matrix elements exhibit vanishing eigenstate-to-eigenstate fluctuations, and a variance proportional to the inverse Hilbert space dimension.
arXiv Detail & Related papers (2021-09-14T18:00:13Z) - Learning with Density Matrices and Random Features [44.98964870180375]
A density matrix describes the statistical state of a quantum system.
It is a powerful formalism to represent both the quantum and classical uncertainty of quantum systems.
This paper explores how density matrices can be used as a building block for machine learning models.
arXiv Detail & Related papers (2021-02-08T17:54:59Z) - On Random Matrices Arising in Deep Neural Networks: General I.I.D. Case [0.0]
We study the distribution of singular values of product of random matrices pertinent to the analysis of deep neural networks.
We use another, more streamlined, version of the techniques of random matrix theory to generalize the results of [22] to the case where the entries of the synaptic weight matrices are just independent identically distributed random variables with zero mean and finite fourth moment.
arXiv Detail & Related papers (2020-11-20T14:39:24Z) - On Random Matrices Arising in Deep Neural Networks. Gaussian Case [1.6244541005112747]
The paper deals with distribution of singular values of product of random matrices arising in the analysis of deep neural networks.
The problem has been considered in recent work by using the techniques of free probability theory.
arXiv Detail & Related papers (2020-01-17T08:30:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.