Symbolically integrating tensor networks over various random tensors by
the second version of Python RTNI
- URL: http://arxiv.org/abs/2309.01167v3
- Date: Fri, 15 Sep 2023 09:04:42 GMT
- Title: Symbolically integrating tensor networks over various random tensors by
the second version of Python RTNI
- Authors: Motohisa Fukuda
- Abstract summary: We are upgrading the Python version of RTNI, which symbolically integrates tensor networks over Haar-distributed unitary matrices.
Now, PyRTNI2 can treat Haar-distributed orthogonal matrices and real and complex normal Gaussian tensors as well.
In this paper, we explain the maths behind the program and show what kind of tensor network calculations can be made with it.
- Score: 0.5439020425818999
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We are upgrading the Python version of RTNI, which symbolically integrates
tensor networks over the Haar-distributed unitary matrices. Now, PyRTNI2 can
treat the Haar-distributed orthogonal matrices and the real and complex normal
Gaussian tensors as well. Moreover, it can export tensor networks in the format
of TensorNetwork so that one can make further calculations with concrete
tensors, even for low dimensions, where the Weingarten functions differ from
the ones for high dimensions. The tutorial notebooks are found at GitHub:
https://github.com/MotohisaFukuda/PyRTNI2. In this paper, we explain the maths
behind the program and show what kind of tensor network calculations can be
made with it. For the former, we interpret the element-wise moment calculus of
the above random matrices and tensors in terms of tensor network diagrams, and
argue that the view is natural, relating delta functions in the calculus to
edges in tensor network diagrams.
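Since PyRTNI2 exports to the TensorNetwork format, the symbolic output can be contracted with concrete tensors there. As a minimal sketch of a contraction in the TensorNetwork library (generic library usage, not PyRTNI2's own export call):

```python
import numpy as np
import tensornetwork as tn

# Two nodes joined by one edge: the smallest concrete tensor network.
a = tn.Node(np.random.rand(2, 3))
b = tn.Node(np.random.rand(3, 4))
edge = a[1] ^ b[0]     # connect matching axes (a shared index)
c = tn.contract(edge)  # contract the shared edge
print(c.tensor.shape)  # (2, 4): an ordinary matrix product
```

The low-dimension point can also be checked numerically. For an n x n Haar unitary U, the Weingarten calculus gives E[|U_11|^2] = 1/n and E[|U_11|^4] = 2/(n(n+1)) exactly; the latter differs from the large-n Gaussian heuristic 2/n^2. A Monte Carlo sketch in plain NumPy, independent of PyRTNI2:

```python
import numpy as np

def haar_unitary(n, rng):
    # QR of a complex Ginibre matrix, with the phases of R's diagonal
    # absorbed so that the result is exactly Haar distributed (Mezzadri).
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))

rng = np.random.default_rng(0)
n, samples = 3, 50_000
m2 = m4 = 0.0
for _ in range(samples):
    a = abs(haar_unitary(n, rng)[0, 0]) ** 2
    m2 += a
    m4 += a * a
print(m2 / samples, "vs exact", 1 / n)              # E[|U_11|^2] = 1/n
print(m4 / samples, "vs exact", 2 / (n * (n + 1)))  # low-dimensional value
```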
Related papers
- An introduction to graphical tensor notation for mechanistic
interpretability [0.0]
It's often easy to get confused about which operations are happening between tensors.
The first half of this document introduces the notation and applies it to some decompositions.
The second half applies it to some existing foundational approaches for mechanistically understanding language models.
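In code, numpy.einsum makes the same bookkeeping explicit: a shared index letter plays the role of a connected edge in a diagram (a generic illustration, not taken from the paper):

```python
import numpy as np

a = np.random.rand(2, 3)
b = np.random.rand(3, 4)

# Shared label j = a connected edge in the diagram: contract over j.
mat_prod = np.einsum('ij,jk->ik', a, b)  # shape (2, 4), matrix product
# No shared label = no edge: a tensor (outer) product.
outer = np.einsum('ij,kl->ijkl', a, b)   # shape (2, 3, 3, 4)
```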
arXiv Detail & Related papers (2024-02-02T02:56:01Z)
- TensorKrowch: Smooth integration of tensor networks in machine learning [46.0920431279359]
We introduce TensorKrowch, an open source Python library built on top of PyTorch.
TensorKrowch allows users to construct any tensor network, train it, and integrate it as a layer in more intricate deep learning models.
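The underlying idea, a tensor network whose cores are trainable PyTorch parameters, can be sketched from scratch; the toy matrix-product-state layer below is written in plain PyTorch and does not use TensorKrowch's actual API:

```python
import torch
import torch.nn as nn

class ToyMPSLayer(nn.Module):
    """Matrix-product-state layer: each input coordinate x_i is embedded as
    the 2-vector (1, x_i) and contracted with one trainable core."""
    def __init__(self, n_sites: int, bond_dim: int, out_dim: int):
        super().__init__()
        s = bond_dim ** -0.5
        self.left = nn.Parameter(torch.randn(2, bond_dim) * s)            # (feat, bond)
        self.bulk = nn.Parameter(torch.randn(n_sites - 2, bond_dim, 2, bond_dim) * s)
        self.right = nn.Parameter(torch.randn(bond_dim, 2, out_dim) * s)  # (bond, feat, out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_sites) -> local feature maps phi: (batch, n_sites, 2)
        phi = torch.stack([torch.ones_like(x), x], dim=-1)
        v = phi[:, 0] @ self.left  # (batch, bond)
        for i in range(self.bulk.shape[0]):
            v = torch.einsum('bl,lfr,bf->br', v, self.bulk[i], phi[:, i + 1])
        return torch.einsum('bl,lfo,bf->bo', v, self.right, phi[:, -1])

layer = ToyMPSLayer(n_sites=8, bond_dim=4, out_dim=3)
print(layer(torch.rand(5, 8)).shape)  # torch.Size([5, 3])
```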
arXiv Detail & Related papers (2023-06-14T15:55:19Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method as compared with state-of-the-art methods.
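The central object is easy to sketch: a rank-R tensor function f(x, y) = sum_r a_r(x) * b_r(y), with each factor given by a small MLP, so that f is defined at arbitrary coordinates rather than only on the sampling grid. The snippet below is a generic illustration in the spirit of LRTFR, not the paper's exact model:

```python
import torch
import torch.nn as nn

class LowRankTensorFunction2D(nn.Module):
    """f(x, y) = sum_r a_r(x) * b_r(y): a rank-R function of two coordinates.
    Sampling f on a grid yields a rank-<=R matrix; off-grid queries work too."""
    def __init__(self, rank: int = 8, hidden: int = 32):
        super().__init__()
        def factor():
            return nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, rank))
        self.a, self.b = factor(), factor()

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # x, y: (batch, 1) coordinates -> (batch,) function values
        return (self.a(x) * self.b(y)).sum(dim=-1)

f = LowRankTensorFunction2D()
print(f(torch.rand(10, 1), torch.rand(10, 1)).shape)  # torch.Size([10])
```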
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Near-Linear Time and Fixed-Parameter Tractable Algorithms for Tensor Decompositions [51.19236668224547]
We study low rank approximation of tensors, focusing on the tensor train and Tucker decompositions.
For tensor train decomposition, we give a bicriteria $(1 + \varepsilon)$-approximation algorithm with a small bicriteria rank and $O(q \cdot \mathrm{nnz}(A))$ running time.
In addition, we extend our algorithm to tensor networks with arbitrary graphs.
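The bicriteria algorithm itself is more involved, but the decomposition it targets can be illustrated with the classical TT-SVD baseline (repeated truncated SVDs in an Oseledets-style left-to-right sweep; this is not the paper's near-linear-time algorithm):

```python
import numpy as np

def tt_svd(A, max_rank):
    """Decompose A into tensor-train cores G_k of shape (r_{k-1}, n_k, r_k)
    via a left-to-right sweep of truncated SVDs."""
    dims, cores, r_prev = A.shape, [], 1
    M = A.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        M = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))
    return cores

A = np.random.rand(4, 5, 6, 7)
cores = tt_svd(A, max_rank=3)
B = cores[0]
for G in cores[1:]:
    B = np.tensordot(B, G, axes=([B.ndim - 1], [0]))  # chain the bond indices
err = np.linalg.norm(B.reshape(A.shape) - A) / np.linalg.norm(A)
print([G.shape for G in cores], f"relative error {err:.3f}")
```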
arXiv Detail & Related papers (2022-07-15T11:55:09Z)
- Tensor networks in machine learning [0.0]
A tensor network is a decomposition used to express and approximate large arrays of data.
A merger of tensor networks with machine learning is natural.
Herein the network parameters are adjusted to learn or classify a data-set.
arXiv Detail & Related papers (2022-07-06T18:00:00Z)
- Stack operation of tensor networks [10.86105335102537]
We propose a mathematically rigorous definition for the tensor network stack approach.
We illustrate the main ideas with matrix product state based machine learning as an example.
arXiv Detail & Related papers (2022-03-28T12:45:13Z)
- Cherry-Picking Gradients: Learning Low-Rank Embeddings of Visual Data via Differentiable Cross-Approximation [53.95297550117153]
We propose an end-to-end trainable framework that processes large-scale visual data tensors by looking at a fraction of their entries only.
The proposed approach is particularly useful for large-scale multidimensional grid data, and for tasks that require context over a large receptive field.
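Reconstructing a tensor from a small fraction of its entries is easiest to see in the matrix case, via the skeleton (cross) decomposition A ≈ C pinv(W) R built from a few sampled rows and columns. The sketch below is that minimal matrix version, not the paper's differentiable, tensorized framework:

```python
import numpy as np

def cross_approx(A, rows, cols):
    """Skeleton / cross approximation: only the chosen rows, columns, and
    their intersection W are ever read; exact for rank(A) <= len(rows)
    when W has that full rank."""
    C = A[:, cols]             # sampled columns
    R = A[rows, :]             # sampled rows
    W = A[np.ix_(rows, cols)]  # intersection block
    return C @ np.linalg.pinv(W) @ R

rng = np.random.default_rng(0)
A = rng.random((100, 3)) @ rng.random((3, 80))  # exactly rank 3
A_hat = cross_approx(A, rows=[0, 1, 2], cols=[0, 1, 2])
print(np.linalg.norm(A - A_hat) / np.linalg.norm(A))  # ~1e-15
```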
arXiv Detail & Related papers (2021-05-29T08:39:57Z)
- Multi-version Tensor Completion for Time-delayed Spatio-temporal Data [50.762087239885936]
Real-world spatio-temporal data is often incomplete or inaccurate due to various data loading delays.
We propose a low-rank tensor model to predict the updates over time.
We obtain up to 27.2% lower root-mean-squared error compared to the best baseline method.
arXiv Detail & Related papers (2021-05-11T19:55:56Z)
- Beyond Lazy Training for Over-parameterized Tensor Decomposition [69.4699995828506]
We show that gradient descent on an over-parameterized objective can go beyond the lazy training regime and utilize certain low-rank structure in the data.
arXiv Detail & Related papers (2020-10-22T00:32:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.