The product structure of MPS-under-permutations
- URL: http://arxiv.org/abs/2410.19541v1
- Date: Fri, 25 Oct 2024 13:13:23 GMT
- Title: The product structure of MPS-under-permutations
- Authors: Marta Florido-Llinàs, Álvaro M. Alhambra, Rahul Trivedi, Norbert Schuch, David Pérez-García, J. Ignacio Cirac
- Abstract summary: We show that translationally-invariant (TI) matrix product states (MPS) whose entanglement behaves similarly across any bipartition are trivial, i.e., either product states or superpositions of a few of them.
The results also apply to non-TI generic MPS, as well as to further relevant examples of MPS, including the W state and the Dicke states, in an approximate sense.
- Score: 0.4837943644708207
- License:
- Abstract: Tensor network methods have proved to be highly effective in addressing a wide variety of physical scenarios, including those lacking an intrinsic one-dimensional geometry. In such contexts, it is possible for the problem to exhibit a weak form of permutational symmetry, in the sense that entanglement behaves similarly across any arbitrary bipartition. In this paper, we show that translationally-invariant (TI) matrix product states (MPS) with this property are trivial, meaning that they are either product states or superpositions of a few of them. The results also apply to non-TI generic MPS, as well as further relevant examples of MPS including the W state and the Dicke states in an approximate sense. Our findings motivate the usage of ansätze simpler than tensor networks in systems whose structure is invariant under permutations.
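To make the central property concrete, here is a minimal numerical sketch (illustrative only, not code from the paper): it computes the von Neumann entanglement entropy across every bipartition of a 6-qubit W state, one of the examples discussed above, and contrasts it with a random state. For the W state the entropy never exceeds one bit on any cut, which is the sense in which its entanglement "behaves similarly across any arbitrary bipartition"; the qubit count and function names are arbitrary choices.

```python
# Illustrative sketch: entanglement entropy across all bipartitions of the W state.
import itertools
import numpy as np

def w_state(n):
    """|W> = (|10...0> + |01...0> + ... + |00...1>) / sqrt(n)."""
    psi = np.zeros(2 ** n)
    for i in range(n):
        psi[1 << (n - 1 - i)] = 1.0
    return psi / np.sqrt(n)

def entanglement_entropy(psi, n, region):
    """Von Neumann entropy (in bits) across the cut region | complement."""
    rest = [q for q in range(n) if q not in region]
    tensor = psi.reshape((2,) * n).transpose(list(region) + rest)
    matrix = tensor.reshape(2 ** len(region), -1)
    p = np.linalg.svd(matrix, compute_uv=False) ** 2  # Schmidt coefficients squared
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

n = 6
psi_w = w_state(n)
rng = np.random.default_rng(0)
psi_rand = rng.normal(size=2 ** n)
psi_rand /= np.linalg.norm(psi_rand)

for name, psi in [("W state", psi_w), ("random state", psi_rand)]:
    entropies = [entanglement_entropy(psi, n, list(region))
                 for k in range(1, n)
                 for region in itertools.combinations(range(n), k)]
    print(f"{name}: max entropy over all bipartitions = {max(entropies):.3f} bits")
```

A generic random state, by contrast, carries near-maximal entanglement across balanced cuts, so the same check immediately separates it from the "trivial" states characterized in the paper.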
Related papers
- Internal structure of gauge-invariant Projected Entangled Pair States [0.0]
Projected entangled pair states (PEPS) allow encoding symmetries, either global or local (gauge), naturally.
PEPS with local symmetries have increasingly been used in the study of non-perturbative regimes of lattice gauge theories.
We study the internal structure of projected entangled pair states with a gauge symmetry.
arXiv Detail & Related papers (2024-10-24T17:37:37Z) - Classifying symmetric and symmetry-broken spin chain phases with anomalous group actions [0.0]
We consider the classification problem of quantum spin chains invariant under local decomposable group actions.
We derive invariants for our classification that naturally cover one-dimensional symmetry protected topological phases.
arXiv Detail & Related papers (2024-03-27T13:54:45Z) - A Characterization Theorem for Equivariant Networks with Point-wise
Activations [13.00676132572457]
We prove that rotation-equivariant networks can only be invariant, as is the case for any network that is equivariant with respect to connected compact groups.
We show that feature spaces of disentangled steerable convolutional neural networks are trivial representations.
arXiv Detail & Related papers (2024-01-17T14:30:46Z) - Equivariant Transduction through Invariant Alignment [71.45263447328374]
We introduce a novel group-equivariant architecture that incorporates a group-invariant hard alignment mechanism.
We find that our network's structure allows it to develop stronger equivariant properties than existing group-equivariant approaches.
We additionally find that it outperforms previous group-equivariant networks empirically on the SCAN task.
arXiv Detail & Related papers (2022-09-22T11:19:45Z) - Geometric and Physical Quantities improve E(3) Equivariant Message
Passing [59.98327062664975]
We introduce Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise equivariant graph networks.
This model, built from steerable features, is able to incorporate geometric and physical information in both the message and update functions.
We demonstrate the effectiveness of our method on several tasks in computational physics and chemistry.
arXiv Detail & Related papers (2021-10-06T16:34:26Z) - Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z) - Group Equivariant Subsampling [60.53371517247382]
Subsampling is used in convolutional neural networks (CNNs) in the form of pooling or strided convolutions.
We first introduce translation equivariant subsampling/upsampling layers that can be used to construct exact translation equivariant CNNs.
We then generalise these layers beyond translations to general groups, thus proposing group equivariant subsampling/upsampling.
arXiv Detail & Related papers (2021-06-10T16:14:00Z) - Efficient multi port-based teleportation schemes [0.10427337206896375]
The scheme allows for transmitting more than one unknown quantum state in one go.
The new scheme gives better performance than various variants of the optimal PBT protocol used for the same task.
It turns out that the introduced formalism, and the symmetries beneath it, appear in many aspects of theoretical physics and mathematics.
arXiv Detail & Related papers (2020-08-03T16:09:51Z) - MDP Homomorphic Networks: Group Symmetries in Reinforcement Learning [90.20563679417567]
This paper introduces MDP homomorphic networks for deep reinforcement learning.
MDP homomorphic networks are neural networks that are equivariant under symmetries in the joint state-action space of an MDP.
We show that such networks converge faster than unstructured networks on CartPole, a grid world and Pong.
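As an illustration of what this equivariance means (a toy sketch, not the MDP homomorphic construction from the paper): for CartPole, negating the state and swapping the left/right actions is a symmetry of the dynamics, and an equivariant Q-function must assign the mirrored state the swapped Q-values. The sketch below enforces this on an arbitrary network by simple output symmetrization; all shapes and names are illustrative.

```python
# Toy sketch of state-action equivariance for CartPole's mirror symmetry.
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(16, 4)), rng.normal(size=(2, 16))

def q_raw(s):
    """An arbitrary (non-equivariant) two-layer Q-network: s -> Q-values for (left, right)."""
    return W2 @ np.tanh(W1 @ s)

def q_equivariant(s):
    """Averaging over the mirror symmetry (negate state, swap actions) enforces equivariance."""
    return 0.5 * (q_raw(s) + q_raw(-s)[::-1])

s = rng.normal(size=4)  # (cart position, velocity, pole angle, angular velocity)
# Equivariance check: Q(-s) equals Q(s) with the actions swapped.
print(np.allclose(q_equivariant(-s), q_equivariant(s)[::-1]))  # True
```

Symmetrizing the outputs is only a quick way to exhibit the property; the paper instead builds the equivariance into the network itself.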
arXiv Detail & Related papers (2020-06-30T15:38:37Z) - Sum-Product-Transform Networks: Exploiting Symmetries using Invertible
Transformations [1.539942973115038]
Sum-Product-Transform Networks (SPTNs) are an extension of sum-product networks that use invertible transformations as additional internal nodes.
G-SPTNs achieve state-of-the-art results on the density estimation task and are competitive with state-of-the-art methods for anomaly detection.
arXiv Detail & Related papers (2020-05-04T07:05:51Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric
graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.