Very Basics of Tensors with Graphical Notations: Unfolding, Calculations, and Decompositions
- URL: http://arxiv.org/abs/2411.16094v1
- Date: Mon, 25 Nov 2024 05:02:35 GMT
- Title: Very Basics of Tensors with Graphical Notations: Unfolding, Calculations, and Decompositions
- Authors: Tatsuya Yokota
- Abstract summary: This lecture note explains the very basics of tensors and how to represent them in mathematical symbols and graphical notation.
- Score: 4.092862870428798
- Abstract: A tensor network diagram (graphical notation) is a useful tool that graphically represents multiplications between multiple tensors using nodes and edges. Using the graphical notation, complex multiplications between tensors can be described simply and intuitively, and it also helps to understand the essence of tensor products. In fact, most matrix/tensor products, including the inner product, outer product, Hadamard product, Kronecker product, and Khatri-Rao product, can be written in graphical notation. These matrix/tensor operations are essential building blocks for the use of matrix/tensor decompositions in signal processing and machine learning. The purpose of this lecture note is to learn the very basics of tensors and how to represent them in mathematical symbols and graphical notation. Many papers using tensors omit these detailed definitions and explanations, which can make them difficult for the reader to follow. I hope this note will be of help to such readers.
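As a concrete companion to the operations listed in the abstract, the sketch below shows how the products that graphical notation describes (inner, outer, Hadamard, Kronecker, Khatri-Rao) and a mode-n unfolding can be computed with NumPy. The variable names, sizes, and the unfolding convention are illustrative choices, not taken from the lecture note itself.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 3, 4, 2, 5
A = rng.standard_normal((I, R))      # I x R matrix
B = rng.standard_normal((J, R))      # J x R matrix
X = rng.standard_normal((I, J, K))   # third-order tensor
u, v = rng.standard_normal(I), rng.standard_normal(I)

inner = np.einsum('i,i->', u, v)            # inner product (a scalar)
outer = np.einsum('i,j->ij', u, v)          # outer product (an I x I matrix)
hadamard = A * rng.standard_normal((I, R))  # Hadamard (element-wise) product
kron = np.kron(A, B)                        # Kronecker product, (I*J) x (R*R)

# Khatri-Rao product: column-wise Kronecker product, (I*J) x R.
khatri_rao = np.einsum('ir,jr->ijr', A, B).reshape(I * J, R)
assert np.allclose(khatri_rao[:, 0], np.kron(A[:, 0], B[:, 0]))

def unfold(tensor, mode):
    """Mode-n unfolding: bring the chosen axis to the front and flatten the rest.
    (One common convention; papers differ in how the remaining axes are ordered.)"""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

print(inner, outer.shape, hadamard.shape, kron.shape,
      khatri_rao.shape, unfold(X, 1).shape)  # unfold(X, 1) has shape (J, I*K)
```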
Related papers
- An introduction to graphical tensor notation for mechanistic interpretability [0.0]
It's often easy to get confused about which operations are happening between tensors.
The first half of this document introduces the notation and applies it to some decompositions.
The second half applies it to some existing foundational approaches for mechanistically understanding language models.
arXiv Detail & Related papers (2024-02-02T02:56:01Z)
- The Tensor as an Informational Resource [1.3044677039636754]
A tensor is a multidimensional array of numbers that can be used to store data, encode a computational relation and represent quantum entanglement.
We propose a family of information-theoretically constructed preorders on tensors, which can be used to compare tensors with each other and to assess the existence of transformations between them.
arXiv Detail & Related papers (2023-11-03T18:47:39Z)
- Symbolically integrating tensor networks over various random tensors by the second version of Python RTNI [0.5439020425818999]
We are upgrading the Python-version of RTNI, which symbolically integrates tensor networks over the Haar-distributed unitary matrices.
PyRTNI2 can now also handle Haar-distributed matrices and real and complex normal Gaussian tensors.
In this paper, we explain the mathematics behind the program and show what kinds of tensor network calculations can be made with it.
arXiv Detail & Related papers (2023-09-03T13:14:46Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
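The core idea of a low-rank tensor function, as summarized above, can be illustrated with a toy sketch: a rank-R function f(x, y, z) = sum_r a_r(x) b_r(y) c_r(z) whose factor functions can be evaluated at arbitrary, off-meshgrid coordinates. The factor functions below are hand-picked for illustration only; in LRTFR they are learned from data.

```python
import numpy as np

R = 2  # assumed rank of the toy tensor function

# Hand-picked factor functions a_r, b_r, c_r (illustrative, not learned).
a = [np.sin, np.cos]
b = [lambda y: y, lambda y: y ** 2]
c = [np.exp, lambda z: 1.0 / (1.0 + z ** 2)]

def f(x, y, z):
    """Rank-R tensor function: f(x, y, z) = sum_r a_r(x) * b_r(y) * c_r(z)."""
    return sum(a[r](x) * b[r](y) * c[r](z) for r in range(R))

# The same function can be sampled on a meshgrid (giving a discrete tensor) ...
xs, ys, zs = np.linspace(0, 1, 4), np.linspace(0, 1, 5), np.linspace(0, 1, 6)
T = f(xs[:, None, None], ys[None, :, None], zs[None, None, :])  # shape (4, 5, 6)

# ... or queried at arbitrary continuous coordinates, "beyond meshgrid".
print(T.shape, f(0.123, 0.456, 0.789))
```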
- A Tutorial on the Spectral Theory of Markov Chains [0.0]
This tutorial provides an in-depth introduction to Markov chains.
We utilize tools from linear algebra and graph theory to describe the transition matrices of different types of Markov chains.
The results presented are relevant to a number of methods in machine learning and data mining.
arXiv Detail & Related papers (2022-07-05T20:43:40Z)
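To make the spectral viewpoint of the Markov-chain tutorial concrete, the snippet below computes the stationary distribution of a small transition matrix from the eigenvector for eigenvalue 1. The 3-state chain is an arbitrary example, not one from the tutorial.

```python
import numpy as np

# Row-stochastic transition matrix of a small 3-state Markov chain (arbitrary example).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7]])

# The stationary distribution pi satisfies pi P = pi, i.e. pi is the left
# eigenvector of P for eigenvalue 1 (a right eigenvector of P.T).
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                      # normalize to a probability vector

assert np.allclose(pi @ P, pi)
print("stationary distribution:", pi)
```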
- Sign and Basis Invariant Networks for Spectral Graph Representation Learning [75.18802152811539]
We introduce SignNet and BasisNet -- new neural architectures that are invariant to all requisite symmetries and hence process collections of eigenspaces in a principled manner.
Our networks are theoretically strong for graph representation learning -- they can approximate any spectral graph convolution.
Experiments show the strength of our networks for learning spectral graph filters and learning graph positional encodings.
arXiv Detail & Related papers (2022-02-25T23:11:59Z)
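The sign ambiguity addressed above is easy to state: if v is an eigenvector, so is -v, so a network applied to eigenvectors should give the same output for both. The sketch below shows a standard way to obtain sign invariance, phi(v) + phi(-v), with a tiny hypothetical feature map phi; the actual architectures are learned networks with further machinery for basis symmetries.

```python
import numpy as np

def phi(v):
    """A tiny hypothetical feature map standing in for a learned network."""
    return np.tanh(np.stack([v, v ** 2, np.cumsum(v)], axis=-1))

def sign_invariant(v):
    """Sign-invariant processing of an eigenvector: phi(v) + phi(-v)."""
    return phi(v) + phi(-v)

# Laplacian eigenvectors of a small path graph (arbitrary example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
_, V = np.linalg.eigh(L)
v = V[:, 1]                              # first non-trivial eigenvector

# Flipping the sign of the eigenvector leaves the output unchanged.
assert np.allclose(sign_invariant(v), sign_invariant(-v))
print(sign_invariant(v).shape)
```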
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
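As a minimal, generic illustration of a graph kernel, i.e. an inner product between graph feature maps, the sketch below computes a degree-histogram kernel between two small graphs. This toy kernel is only an assumption for illustration and is not one of the kernels used in that paper.

```python
import numpy as np

def degree_histogram(adj, max_degree=5):
    """Feature map: histogram of node degrees (a simple, generic graph feature)."""
    degrees = adj.sum(axis=1).astype(int)
    return np.bincount(degrees, minlength=max_degree + 1).astype(float)

def graph_kernel(adj1, adj2):
    """Graph kernel: inner product between the two graphs' feature maps."""
    return float(degree_histogram(adj1) @ degree_histogram(adj2))

# Two small example graphs: a triangle and a 4-node path.
triangle = np.array([[0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]])

print(graph_kernel(triangle, path))     # similarity score between the two graphs
```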
- Tensor Methods in Computer Vision and Deep Learning [120.3881619902096]
Tensors, or multidimensional arrays, are data structures that can naturally represent visual data of multiple dimensions.
With the advent of the deep learning paradigm shift in computer vision, tensors have become even more fundamental.
This article provides an in-depth and practical review of tensors and tensor methods in the context of representation learning and deep learning.
arXiv Detail & Related papers (2021-07-07T18:42:45Z)
- Named Tensor Notation [117.30373263410507]
We propose a notation for tensors with named axes.
It relieves the author, reader, and future implementers from the burden of keeping track of the order of axes.
It also makes it easy to extend operations on low-order tensors to higher order ones.
arXiv Detail & Related papers (2021-02-25T22:21:30Z)
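The idea of named axes can be illustrated with xarray, a Python library that labels tensor dimensions by name so that operations no longer depend on axis order. This is only an analogy for the idea above, not the notation proposed in the paper.

```python
import numpy as np
import xarray as xr

# A small "image batch" with named axes instead of purely positional ones.
x = xr.DataArray(np.random.rand(2, 3, 4), dims=("batch", "height", "width"))

# Reductions refer to axes by name, so nobody has to remember the axis order.
mean_over_pixels = x.mean(dim=("height", "width"))   # one value per batch element

# Reordering the underlying axes does not change what the named operation means.
y = x.transpose("width", "batch", "height")
assert np.allclose(mean_over_pixels.values, y.mean(dim=("height", "width")).values)

print(mean_over_pixels.dims, mean_over_pixels.shape)
```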
- A Simple and Efficient Tensor Calculus for Machine Learning [18.23338916563815]
A key concern is the efficiency of evaluating tensor expressions and their derivatives, which hinges on how these expressions are represented.
An algorithm for computing higher-order derivatives of tensor expressions, such as Jacobians or Hessians, has been introduced that is a few orders of magnitude faster than previous state-of-the-art approaches.
Here, we show that using Ricci notation is not necessary for an efficient tensor calculus and develop an equally efficient method for the simpler Einstein notation.
arXiv Detail & Related papers (2020-10-07T10:18:56Z)
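Einstein notation, in which repeated indices are summed, is available directly in NumPy via einsum; the snippet below writes a few common tensor expressions this way. It only illustrates the notation itself, not the derivative algorithm developed in the paper above.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
T = rng.standard_normal((3, 4, 5))
S = rng.standard_normal((4, 4))
v = rng.standard_normal(4)

# Matrix product C_ik = A_ij B_jk: the repeated index j is summed.
C = np.einsum('ij,jk->ik', A, B)
assert np.allclose(C, A @ B)

# Tensor-times-vector along the middle mode: M_ik = T_ijk v_j.
M = np.einsum('ijk,j->ik', T, v)

# Trace and Frobenius inner product written as index expressions.
tr = np.einsum('ii->', S)
frob = np.einsum('ij,ij->', A, A)

print(C.shape, M.shape, tr, frob)
```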
- Spectral Learning on Matrices and Tensors [74.88243719463053]
We show that tensor decomposition can pick up latent effects that are missed by matrix methods.
We also outline computational techniques to design efficient tensor decomposition methods.
arXiv Detail & Related papers (2020-04-16T22:53:00Z)
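A concrete instance of latent effects recovered by tensor decomposition is the CP (CANDECOMP/PARAFAC) model; the sketch below fits a rank-2 CP decomposition to a synthetic third-order tensor with a few steps of alternating least squares. The synthetic data, the plain ALS loop, and the unfolding/Khatri-Rao conventions are illustrative assumptions, not the algorithms of that paper.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 6, 5, 4, 2

# Synthetic rank-2 tensor X_ijk = sum_r A_ir B_jr C_kr (the "latent effects").
A_true, B_true, C_true = (rng.standard_normal((n, R)) for n in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)

def unfold(T, mode):
    """Mode-n unfolding (one common convention)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Kronecker product."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

# Alternating least squares: update one factor at a time, holding the others fixed.
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
for _ in range(50):
    A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
    B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
    C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B)).T

X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```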