Dimension of Tensor Network varieties
- URL: http://arxiv.org/abs/2101.03148v2
- Date: Mon, 1 Aug 2022 18:34:31 GMT
- Title: Dimension of Tensor Network varieties
- Authors: Alessandra Bernardi, Claudia De Lazzari, Fulvio Gesmundo
- Abstract summary: We determine an upper bound on the dimension of the tensor network variety.
A refined upper bound is given in cases relevant for applications such as varieties of matrix product states and projected entangled pair states.
- Score: 68.8204255655161
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The tensor network variety is a variety of tensors associated to a graph and
a set of positive integer weights on its edges, called bond dimensions. We
determine an upper bound on the dimension of the tensor network variety. A
refined upper bound is given in cases relevant for applications such as
varieties of matrix product states and projected entangled pair states. We
provide a range (the "supercritical range") of the parameters where the upper
bound is sharp.
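For orientation, the setup described in the abstract can be written out schematically. The notation below (the variety TNS(G, m, n) and the contraction map) is ours, and the displayed bound is only the naive parameter count, not the paper's refined bound.

```latex
% Schematic setup, in our notation: a graph G = (V, E), a bond dimension m_e
% for each edge e, and a local dimension n_v for each vertex v. Every vertex
% carries a tensor T_v with one "physical" index of size n_v and one bond
% index of size m_e for each incident edge; contracting all bond indices
% yields a tensor with only the physical indices left.
\[
  \mathrm{TNS}(G, \mathbf{m}, \mathbf{n})
  \;=\;
  \overline{\Bigl\{\, \operatorname{contract}\bigl((T_v)_{v \in V}\bigr)
     \;:\; T_v \in \mathbb{C}^{n_v} \otimes \bigotimes_{e \ni v} \mathbb{C}^{m_e}
  \,\Bigr\}}
  \;\subseteq\; \bigotimes_{v \in V} \mathbb{C}^{n_v} .
\]
% The variety is the closure of the image of a polynomial map, so its
% dimension is at most the dimension of the parameter space:
\[
  \dim \mathrm{TNS}(G, \mathbf{m}, \mathbf{n})
  \;\le\; \sum_{v \in V} n_v \prod_{e \ni v} m_e .
\]
```

The refined bounds tighten this count using gauge redundancy: for an edge e = {u, v}, acting with an invertible matrix X_e on the bond index of T_u and with X_e^{-1} on that of T_v leaves the contracted tensor unchanged, so those directions contribute nothing to the dimension.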
Related papers
- Compressing multivariate functions with tree tensor networks [0.0]
One-dimensional tensor networks are increasingly being used as a numerical ansatz for continuum functions.
We show how more structured tree tensor networks offer a significantly more efficient ansatz than the commonly used tensor train (a minimal tensor-train sketch follows this entry).
arXiv Detail & Related papers (2024-10-04T16:20:52Z)
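To make the tensor-train ansatz above concrete, here is a minimal sketch of TT-SVD compression of a sampled one-dimensional function, assuming NumPy. The test function, maximum rank, and tolerance are illustrative choices of ours, not taken from the paper (which argues that tree-shaped networks can beat this baseline).

```python
import numpy as np

def tt_svd(tensor, max_rank=16, tol=1e-10):
    """Compress an n-way tensor into tensor-train cores by sequential SVDs."""
    dims = tensor.shape
    cores, rank = [], 1
    rest = tensor.reshape(1, -1)
    for d in dims[:-1]:
        mat = rest.reshape(rank * d, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = min(max_rank, int((s > tol * s[0]).sum()))
        cores.append(u[:, :keep].reshape(rank, d, keep))  # core: (rank, d, keep)
        rest = s[:keep, None] * vt[:keep]                 # remainder moves right
        rank = keep
    cores.append(rest.reshape(rank, dims[-1], 1))         # last core absorbs rest
    return cores

# View 2**n samples of a smooth function as an n-way tensor of binary digits.
n = 12
x = np.linspace(0.0, 1.0, 2**n)
values = np.exp(-x) * np.sin(8 * np.pi * x)
cores = tt_svd(values.reshape((2,) * n))

# Reconstruct and check the relative error of the compressed representation.
full = cores[0]
for c in cores[1:]:
    full = np.tensordot(full, c, axes=(-1, 0))
rel_err = np.linalg.norm(full.ravel() - values) / np.linalg.norm(values)
print("bond dimensions:", [c.shape[2] for c in cores], "rel. error:", rel_err)
```

For a smooth function the bond dimensions printed here stay far below the worst case, which is the efficiency claim such ansatz classes rest on.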
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, tensor cumulants, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z)
- One-step replica symmetry breaking in the language of tensor networks [0.913755431537592]
We develop an exact mapping between the one-step replica symmetry breaking cavity method and tensor networks.
The two schemes come with complementary mathematical and numerical toolboxes that could be leveraged to improve the respective states of the art.
arXiv Detail & Related papers (2023-06-26T18:42:51Z)
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z)
- Superposed Random Spin Tensor Networks and their Holographic Properties [0.0]
We study boundary-to-boundary holography in a class of spin network states defined by analogy to projected entangled pair states (PEPS).
We consider superpositions of states corresponding to well-defined, discrete geometries on a graph.
arXiv Detail & Related papers (2022-05-19T12:24:57Z)
- Boundary theories of critical matchgate tensor networks [59.433172590351234]
Key aspects of the AdS/CFT correspondence can be captured in terms of tensor network models on hyperbolic lattices.
For tensors fulfilling the matchgate constraint, these have previously been shown to produce disordered boundary states.
We show that the corresponding boundary Hamiltonians exhibit multi-scale quasiperiodic symmetries captured by an analytical toy model.
arXiv Detail & Related papers (2021-10-06T18:00:03Z)
- Lower and Upper Bounds on the VC-Dimension of Tensor Network Models [8.997952791113232]
Tensor network methods have been a key ingredient of advances in condensed matter physics.
They can be used to efficiently learn linear models in exponentially large feature spaces; a minimal sketch of such a model follows this entry.
In this work, we derive upper and lower bounds on the VC dimension and pseudo-dimension of a large class of tensor network models.
arXiv Detail & Related papers (2021-06-22T14:39:25Z)
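To illustrate the model class this summary refers to, here is a hedged sketch, assuming NumPy, of a tensor network model f(x) = <W, phi(x_1) (x) ... (x) phi(x_n)>: a linear model whose weight tensor W lives in a 2**n-dimensional feature space but is stored as a matrix product state. The sine/cosine local feature map is a common choice in this literature, not necessarily the one analyzed in the paper.

```python
import numpy as np

def local_features(xi):
    """Local feature map phi(x_i) in R^2 (a common choice in this literature)."""
    return np.array([np.cos(np.pi * xi / 2), np.sin(np.pi * xi / 2)])

def mps_model(cores, x):
    """Evaluate f(x) = <W, phi(x_1) (x) ... (x) phi(x_n)>, where the weight
    tensor W is stored as an MPS with cores of shape (left, 2, right).
    Cost is O(n * m**2) even though W lives in a 2**n-dimensional space."""
    v = np.ones(1)                                     # left boundary vector
    for core, xi in zip(cores, x):
        mat = np.einsum('adb,d->ab', core, local_features(xi))
        v = v @ mat                                    # carry only the bond index
    return v.item()                                    # right bond dimension is 1

# Random MPS weights: n sites, bond dimension m.
rng = np.random.default_rng(0)
n, m = 8, 4
shapes = [(1, 2, m)] + [(m, 2, m)] * (n - 2) + [(m, 2, 1)]
cores = [rng.standard_normal(size) / np.sqrt(m) for size in shapes]
print(mps_model(cores, rng.random(n)))
```

The bond dimension m controls the capacity of the class, which is why VC-dimension bounds for such models are naturally stated in terms of the network's ranks and size.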
- T-Basis: a Compact Representation for Neural Networks [89.86997385827055]
We introduce T-Basis, a concept for a compact representation of a set of tensors, each of an arbitrary shape, a setting that arises frequently in neural networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates with acceptable drops in performance.
arXiv Detail & Related papers (2020-07-13T19:03:22Z)
- Optimization at the boundary of the tensor network variety [2.1839191255085995]
Tensor network states form a variational ansatz class widely used in the study of quantum many-body systems.
Recent work has shown that states on the boundary of this variety can yield more efficient representations for states of physical interest.
We show how to optimize over this class in order to find ground states of local Hamiltonians.
arXiv Detail & Related papers (2020-06-30T16:58:55Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret the mesh as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.