Categorical relations and bipartite entanglement in tensor cones for
Toeplitz and Fej\'er-Riesz operator systems
- URL: http://arxiv.org/abs/2312.01462v1
- Date: Sun, 3 Dec 2023 17:15:41 GMT
- Title: Categorical relations and bipartite entanglement in tensor cones for
Toeplitz and Fej\'er-Riesz operator systems
- Authors: Douglas Farenick
- Abstract summary: This paper aims to understand separability and entanglement in tensor cones, in the sense of Namioka and Phelps.
Toeplitz and Fej\'er-Riesz operator systems are of particular interest.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The present paper aims to understand separability and entanglement in tensor
cones, in the sense of Namioka and Phelps, that arise from the base cones of
operator system tensor products. Of particular interest here are the Toeplitz
and Fej\'er-Riesz operator systems, which are, respectively, operator systems
of Toeplitz matrices and Laurent polynomials (that is, trigonometric
polynomials), and which are related in the operator system category through
duality. Some notable categorical relationships established in this paper are
the C$^*$-nuclearity of Toeplitz and Fej\'er-Riesz operator systems, as well as
their unique operator system structures when tensoring with injective operator
systems. Among the results of this study are two of independent interest: (i) a
matrix criterion, similar to the one involving the Choi matrix, for a linear
map of the Fej\'er-Riesz operator system to be completely positive; (ii) a
completely positive extension theorem for positive linear maps of $n\times n$
Toeplitz matrices into arbitrary von Neumann algebras, thereby showing that a
similar extension theorem of Haagerup for $2\times 2$ Toeplitz matrices holds
for Toeplitz matrices of higher dimension.
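Result (i) above is compared to the classical Choi-matrix criterion: a linear map $\Phi: M_n \to M_m$ is completely positive if and only if its Choi matrix $C_\Phi = \sum_{i,j} E_{ij} \otimes \Phi(E_{ij})$ is positive semidefinite. A minimal numerical sketch of that classical criterion (not the paper's new criterion for the Fej\'er-Riesz operator system), assuming NumPy:

```python
import numpy as np

def choi_matrix(phi, n):
    """Choi matrix of a linear map phi on n x n matrices:
    the block matrix whose (i, j) block is phi(E_ij)."""
    e = np.eye(n)
    blocks = [[phi(np.outer(e[i], e[j])) for j in range(n)]
              for i in range(n)]
    return np.block(blocks)

def is_completely_positive(phi, n, tol=1e-10):
    """Classical Choi test: phi is completely positive iff
    its Choi matrix is positive semidefinite."""
    C = choi_matrix(phi, n)
    C = (C + C.conj().T) / 2  # symmetrize against round-off
    return np.min(np.linalg.eigvalsh(C)) >= -tol

identity_map = lambda X: X    # completely positive
transpose_map = lambda X: X.T  # positive but not completely positive

print(is_completely_positive(identity_map, 3))   # True
print(is_completely_positive(transpose_map, 3))  # False
```

The transpose map is the standard example separating positivity from complete positivity: its Choi matrix is the swap operator, which has eigenvalue $-1$.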
Related papers
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry (2024-07-15)
  Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
- Tensor cumulants for statistical inference on invariant distributions (2024-04-29)
  We show that PCA becomes computationally hard at a critical value of the signal's magnitude. We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree. This also lets us analyze a new problem of distinguishing between different ensembles.
- Beyond Operator Systems (2023-12-21)
  Operator systems connect operator algebra, free semialgebraic geometry and quantum information theory. In this work we generalize operator systems and many of their theorems.
- Operator Systems Generated by Projections (2023-02-25)
  We construct a family of operator systems and $k$-AOU spaces generated by a finite number of projections satisfying a set of linear relations. By choosing the linear relations to be the nonsignalling relations from quantum correlation theory, we obtain a hierarchy of ordered vector spaces dual to the hierarchy of quantum correlation sets.
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles (2023-01-13)
  We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset. We use fully connected neural networks to model the symmetry transformations and the corresponding generators. Our study also opens the door to using a machine-learning approach in the mathematical study of Lie groups and their properties.
- An Equivalence Principle for the Spectrum of Random Inner-Product Kernel Matrices with Polynomial Scalings (2022-05-12)
  This study is motivated by applications in machine learning and statistics. We establish the weak limit of the empirical distribution of these random matrices in a scaling regime. This limit can be characterized as the free additive convolution between a Marchenko-Pastur law and a semicircle law.
- Matrix quantum groups as matrix product operator representations of Lie groups (2022-02-14)
  We show that the matrix quantum group $SL_q(2)$ gives rise to nontrivial matrix product operator representations of the Lie group $SL(2)$. We argue that the combination of this data with the well-known $q$-deformed Clebsch-Gordan coefficients and 6j-symbols is consistent with a description of this quantum group in terms of bimodule categories.
- Self-Adjointness of Toeplitz Operators on the Segal-Bargmann Space (2022-02-09)
  We prove a new criterion that guarantees self-adjointness of Toeplitz operators with unbounded operator-valued symbols. We extend the Berger-Coburn estimate to the case of vector-valued Segal-Bargmann spaces.
- Dualities in one-dimensional quantum lattice models: symmetric Hamiltonians and matrix product operator intertwiners (2021-12-16)
  We present a systematic recipe for generating and classifying duality transformations in one-dimensional quantum lattice systems. Our construction emphasizes the role of global symmetries, including those described by (non-)abelian groups. We illustrate this approach for known dualities such as Kramers-Wannier, Jordan-Wigner, Kennedy-Tasaki and the IRF-vertex correspondence.
- Finite-Function-Encoding Quantum States (2020-12-01)
  We introduce finite-function-encoding (FFE) states, which encode arbitrary $d$-valued logic functions, and investigate some of their structural properties.
- Random Matrix Theory Proves that Deep Learning Representations of GAN-data Behave as Gaussian Mixtures (2020-01-21)
  Deep learning representations of data produced by generative adversarial nets (GANs) are random vectors which fall within the class of so-called concentrated random vectors. Our theoretical findings are validated by generating images with the BigGAN model and across different popular deep representation networks.
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.