The Tensor as an Informational Resource
- URL: http://arxiv.org/abs/2311.02190v2
- Date: Tue, 17 Sep 2024 06:56:57 GMT
- Title: The Tensor as an Informational Resource
- Authors: Matthias Christandl
- Abstract summary: A tensor is a multidimensional array of numbers that can be used to store data, encode a computational relation and represent quantum entanglement.
We propose a family of information-theoretically constructed preorders on tensors, which can be used to compare tensors with each other and to assess the existence of transformations between them.
- Score: 1.3044677039636754
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A tensor is a multidimensional array of numbers that can be used to store data, encode a computational relation and represent quantum entanglement. In this sense a tensor can be viewed as a valuable resource whose transformation can lead to an understanding of structure in data, computational complexity and quantum information. In order to facilitate the understanding of this resource, we propose a family of information-theoretically constructed preorders on tensors, which can be used to compare tensors with each other and to assess the existence of transformations between them. The construction places copies of a given tensor at the edges of a hypergraph and allows transformations at the vertices. A preorder is then induced by the transformations possible in a given growing sequence of hypergraphs. The new family of preorders generalises the asymptotic restriction preorder which Strassen defined in order to study the computational complexity of matrix multiplication. We derive general properties of the preorders and their associated asymptotic notions of tensor rank and view recent results on tensor rank non-additivity, tensor networks and algebraic complexity in this unifying frame. We hope that this work will provide a useful vantage point for exploring tensors in applied mathematics, physics and computer science, but also from a purely mathematical point of view.
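For orientation, here is a minimal sketch of the classical notions that the new preorders generalise. These are Strassen's standard definitions from the literature, restated here, not the paper's hypergraph construction itself:

```latex
% Restriction: s is obtained from t by applying a linear map to each leg.
% For order-3 tensors s in C^{n_1 x n_2 x n_3} and t in C^{m_1 x m_2 x m_3}:
s \leq t \iff \exists\, A_i : \mathbb{C}^{m_i} \to \mathbb{C}^{n_i}
  \ \text{such that}\ \ s = (A_1 \otimes A_2 \otimes A_3)\, t .

% Asymptotic restriction compares large tensor powers:
s \lesssim t \iff s^{\otimes n} \leq t^{\otimes (n + o(n))} \ \text{as}\ n \to \infty .
```

The hypergraph preorders of the paper recover this asymptotic notion as a special case, with copies of the tensor placed on hyperedges and transformations applied at the vertices.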
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
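As a concrete example of such an invariant (a standard fact, not the paper's new basis): for $T \in \mathbb{R}^{n \times n \times n}$ acted on mode-wise by orthogonal matrices, $T \mapsto (Q_1 \otimes Q_2 \otimes Q_3) T$, the squared Frobenius norm is a degree-2 invariant:

```latex
% Unchanged under orthogonal transformations of each mode,
% since each Q_i preserves Euclidean norms.
I_2(T) = \sum_{i,j,k} T_{ijk}^2 = \|T\|_F^2 .
```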
arXiv Detail & Related papers (2024-04-29T14:33:24Z)
- An introduction to graphical tensor notation for mechanistic interpretability [0.0]
It is easy to get confused about which operations are being performed between tensors.
The first half of this document introduces the notation and applies it to some decompositions.
The second half applies it to some existing foundational approaches for mechanistically understanding language models.
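A toy sketch of the ambiguity the notation removes (illustrative, not taken from the paper): prose such as "multiply the tensor by the matrix" underdetermines which index is contracted, whereas naming every leg, as a tensor diagram or np.einsum does, fixes the operation:

```python
import numpy as np

A = np.random.rand(2, 3, 4)   # order-3 tensor with legs i, j, k
M = np.random.rand(4, 4)      # matrix with legs k, l

# Naming every index makes the intended contraction explicit:
C = np.einsum('ijk,kl->ijl', A, M)        # contract over A's third leg
outer = np.einsum('ijk,lm->ijklm', A, M)  # no contraction: outer product
print(C.shape, outer.shape)               # (2, 3, 4) and (2, 3, 4, 4, 4)
```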
arXiv Detail & Related papers (2024-02-02T02:56:01Z)
- Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
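A generic form of such an objective (an illustrative sketch; the paper's exact model, norm, and weighting may differ), coupling a tensor nuclear norm with Laplacian smoothness along each graph-equipped mode:

```latex
\min_{\mathcal{X}}\ \|\mathcal{X}\|_{*}
  + \lambda \sum_{k} \operatorname{tr}\!\left(\mathbf{X}_{(k)}^{\top} \mathbf{L}_k \mathbf{X}_{(k)}\right)
\quad \text{s.t.}\quad \mathcal{P}_{\Omega}(\mathcal{X}) = \mathcal{P}_{\Omega}(\mathcal{T}),
```

where $\mathbf{X}_{(k)}$ is the mode-$k$ unfolding, $\mathbf{L}_k$ the Laplacian of the graph on mode $k$, and $\Omega$ the set of observed entries.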
arXiv Detail & Related papers (2023-10-04T02:55:10Z)
- A Gap in the Subrank of Tensors [2.7992435001846827]
The subrank of a tensor is a measure of how much the tensor can be "diagonalized".
We prove that there is a gap in the subrank when taking large powers under the tensor product.
We also prove that there is a second gap in the possible rates of growth.
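For reference, the standard definitions (facts from the literature): writing $\langle r \rangle = \sum_{i=1}^{r} e_i \otimes e_i \otimes e_i$ for the diagonal unit tensor and $\leq$ for restriction,

```latex
% Subrank: the largest unit tensor obtainable from t by restriction.
\mathrm{Q}(t) = \max\{\, r : \langle r \rangle \leq t \,\} ,
\qquad
% Asymptotic subrank: its growth rate under large tensor powers.
\underset{\sim}{\mathrm{Q}}(t) = \lim_{n \to \infty} \mathrm{Q}\!\left(t^{\otimes n}\right)^{1/n} .
```

The gap results say that these asymptotic growth rates cannot take values in certain forbidden intervals.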
arXiv Detail & Related papers (2022-12-03T18:38:28Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
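One natural shape for such a factorization (a sketch by analogy with CP decomposition; the paper's actual parametrization may differ), modelling the data tensor as samples of a continuous function with separable low-rank structure:

```latex
f(x, y, z) = \sum_{r=1}^{R} g_r(x)\, h_r(y)\, w_r(z),
\qquad
\mathcal{T}_{ijk} = f(x_i, y_j, z_k) ,
```

so the factor functions $g_r, h_r, w_r$ can be evaluated off the original meshgrid, which is what "infinite resolution" refers to.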
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Near-Linear Time and Fixed-Parameter Tractable Algorithms for Tensor Decompositions [51.19236668224547]
We study low rank approximation of tensors, focusing on the tensor train and Tucker decompositions.
For tensor train decomposition, we give a bicriteria $(1 + \epsilon)$-approximation algorithm with a small bicriteria rank and $O(q \cdot \mathrm{nnz}(A))$ running time.
In addition, we extend our algorithm to tensor networks with arbitrary graphs.
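For contrast, a minimal sketch of the classical deterministic TT-SVD (not the paper's bicriteria algorithm; names and the fixed-rank truncation here are illustrative):

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way array into tensor-train cores via sequential truncated SVD."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))
        cores.append(U[:, :r].reshape(rank, dims[k], r))       # core k: rank x n_k x r
        mat = (S[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, dims[-1], 1))                # final core
    return cores

A = np.random.rand(4, 5, 6)
G = tt_svd(A, max_rank=30)
rec = np.einsum('aib,bjc,ckd->ijk', *G)  # boundary legs a, d have size 1
print(np.allclose(rec, A))               # exact here, since max_rank is not binding
```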
arXiv Detail & Related papers (2022-07-15T11:55:09Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
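As a toy instance of the kernel ingredient (a simple degree-histogram kernel on adjacency matrices; the paper's architecture uses richer, differentiable kernels):

```python
import numpy as np

def degree_histogram(adj, max_degree):
    """Explicit feature map phi(G): counts of node degrees up to max_degree."""
    degrees = adj.sum(axis=1).astype(int)
    return np.bincount(degrees, minlength=max_degree + 1)

def graph_kernel(adj1, adj2, max_degree=10):
    """Inner product of degree histograms: a valid (if crude) graph kernel."""
    return degree_histogram(adj1, max_degree) @ degree_histogram(adj2, max_degree)

# Two small undirected graphs given as adjacency matrices.
G1 = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]])  # a path on 3 nodes
G2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])  # also a path on 3 nodes
print(graph_kernel(G1, G2))  # 5
```

Because the kernel is an inner product of explicit features, it is positive semidefinite by construction.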
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Distributed Non-Negative Tensor Train Decomposition [3.2264685979617655]
High-dimensional data is often represented as multidimensional arrays, also known as tensors.
The presence of latent (not directly observable) structures in the tensor allows a unique representation and compression of the data.
We introduce a distributed non-negative tensor-train decomposition and demonstrate its scalability and compression performance on synthetic and real-world big datasets.
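In the standard TT format with non-negativity constraints (a generic statement of the problem, not the paper's distributed algorithm):

```latex
\mathcal{X}_{i_1 i_2 \cdots i_d}
  \approx
\mathbf{G}^{(1)}[i_1]\, \mathbf{G}^{(2)}[i_2] \cdots \mathbf{G}^{(d)}[i_d],
\qquad
\mathbf{G}^{(k)}[i_k] \in \mathbb{R}_{\ge 0}^{\,r_{k-1} \times r_k},
\quad r_0 = r_d = 1,
```

so every entry is a product of small non-negative matrices, fitted by minimizing the Frobenius-norm error subject to these constraints.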
arXiv Detail & Related papers (2020-08-04T05:35:57Z)
- Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections [11.580603875423408]
Sequential data such as time series, video, or text can be challenging to analyse.
At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning.
We use a classical mathematical object -- the tensor algebra -- to capture such dependencies.
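The underlying object is the free tensor algebra (standard definitions; the precise lift and projections used by Seq2Tens may differ in detail):

```latex
T(V) = \bigoplus_{m \ge 0} V^{\otimes m},
\qquad
x \otimes y \neq y \otimes x \ \ \text{in general},
```

so a sequence $(v_1, \dots, v_L)$ in $V$ can be lifted to order-aware features such as $\sum_{i_1 < \cdots < i_m} v_{i_1} \otimes \cdots \otimes v_{i_m}$, which change when the sequence is reordered; low-rank projections then make these features tractable to compute.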
arXiv Detail & Related papers (2020-06-12T09:24:35Z)
- Spectral Learning on Matrices and Tensors [74.88243719463053]
We show that tensor decomposition can pick up latent effects that are missed by matrix methods.
We also outline computational techniques to design efficient tensor decomposition methods.
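The classic example behind this claim (standard in the method-of-moments literature): for a mixture with component means $\mu_i$ and weights $w_i$,

```latex
M_2 = \sum_{i=1}^{K} w_i\, \mu_i \mu_i^{\top},
\qquad
M_3 = \sum_{i=1}^{K} w_i\, \mu_i \otimes \mu_i \otimes \mu_i ,
```

recovering the components from $M_2$ alone is underdetermined (any rotation of the whitened factors fits equally well), whereas the CP decomposition of the tensor $M_3$ is unique up to permutation when the $\mu_i$ are linearly independent.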
arXiv Detail & Related papers (2020-04-16T22:53:00Z)
- Quantum Machine Learning Algorithm for Knowledge Graphs [35.149125599812706]
Implicit knowledge can be inferred by modeling and reconstructing the tensor representations generated from knowledge graphs.
This paper investigates how quantum resources can be capitalized to accelerate the modeling of knowledge graphs.
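The tensor representation in question is the standard one for knowledge graphs (the quantum acceleration itself is beyond this sketch): triples index a binary third-order tensor, and a low-rank reconstruction scores unseen triples:

```latex
\mathcal{X} \in \{0,1\}^{N_e \times N_r \times N_e},
\quad
x_{spo} = 1 \iff (s, p, o) \ \text{is a known triple},
\qquad
\hat{x}_{spo} = \sum_{r=1}^{R} a_{sr}\, b_{pr}\, c_{or},
```

where entries with large $\hat{x}_{spo}$ but $x_{spo} = 0$ are candidates for implicit knowledge.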
arXiv Detail & Related papers (2020-01-04T13:26:29Z)