Generalized Visual Information Analysis via Tensorial Algebra
- URL: http://arxiv.org/abs/2001.11708v2
- Date: Sun, 17 Jan 2021 10:58:56 GMT
- Title: Generalized Visual Information Analysis via Tensorial Algebra
- Authors: Liang Liao and Stephen John Maybank
- Abstract summary: Higher order data is modeled using matrices whose entries are numerical arrays of a fixed size.
Matrices with elements in the ring of t-scalars are referred to as t-matrices.
With the t-matrix model, it is possible to generalize many well-known matrix algorithms.
- Score: 7.028302194243312
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Higher order data is modeled using matrices whose entries are numerical
arrays of a fixed size. These arrays, called t-scalars, form a commutative ring
under the convolution product. Matrices with elements in the ring of t-scalars
are referred to as t-matrices. The t-matrices can be scaled, added and
multiplied in the usual way. There are t-matrix generalizations of positive
matrices, orthogonal matrices and Hermitian symmetric matrices. With the
t-matrix model, it is possible to generalize many well-known matrix algorithms.
In particular, the t-matrices are used to generalize the SVD (Singular Value
Decomposition), HOSVD (High Order SVD), PCA (Principal Component Analysis),
2DPCA (Two Dimensional PCA) and GCA (Grassmannian Component Analysis). The
generalized t-matrix algorithms, namely TSVD, THOSVD, TPCA, T2DPCA and TGCA, are
applied to low-rank approximation, reconstruction, and supervised classification
of images. Experiments show that the t-matrix algorithms compare favorably with
standard matrix algorithms.
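To make the construction concrete, below is a minimal numerical sketch (not the authors' implementation): t-scalars are taken here to be length-d complex arrays multiplied by circular convolution, so a discrete Fourier transform turns t-matrix algebra into ordinary matrix algebra on each Fourier slice and TSVD reduces to a slice-wise SVD. The (m, n, d) array layout and the helper names t_product, t_conj_transpose and tsvd are illustrative assumptions.

```python
import numpy as np

def t_product(A, B):
    # t-matrix product of A (m, k, d) and B (k, n, d); the last axis holds the
    # length-d t-scalar entries, multiplied by circular convolution via the FFT.
    Ah = np.fft.fft(A, axis=-1)
    Bh = np.fft.fft(B, axis=-1)
    Ch = np.einsum('ikd,knd->ind', Ah, Bh)   # ordinary matrix product per slice
    return np.fft.ifft(Ch, axis=-1)          # imaginary part ~0 for real data

def t_conj_transpose(T):
    # Conjugate t-transpose: swap the matrix axes, conjugate each Fourier slice.
    Th = np.conj(np.transpose(np.fft.fft(T, axis=-1), (1, 0, 2)))
    return np.fft.ifft(Th, axis=-1)

def tsvd(T):
    # TSVD-style factorization T = U * S * V^H (products taken in the t-algebra),
    # obtained from an ordinary complex SVD of every Fourier slice.
    m, n, d = T.shape
    Th = np.fft.fft(T, axis=-1)
    U = np.zeros((m, m, d), dtype=complex)
    S = np.zeros((m, n, d), dtype=complex)
    V = np.zeros((n, n, d), dtype=complex)
    for i in range(d):
        u, s, vh = np.linalg.svd(Th[:, :, i])
        U[:, :, i] = u
        S[:s.size, :s.size, i] = np.diag(s)
        V[:, :, i] = vh.conj().T
    return (np.fft.ifft(U, axis=-1),
            np.fft.ifft(S, axis=-1),
            np.fft.ifft(V, axis=-1))

# Sanity check: the three factors reproduce a random real t-matrix.
T = np.random.rand(5, 4, 8)
U, S, V = tsvd(T)
R = t_product(t_product(U, S), t_conj_transpose(V))
print(np.allclose(T, R))   # True, up to floating-point rounding
```

Low-rank approximation of images, as in the experiments, then amounts to keeping only the leading t-scalar singular values in S before multiplying the factors back together.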
Related papers
- Large-scale gradient-based training of Mixtures of Factor Analyzers [67.21722742907981]
This article contributes both a theoretical analysis and a new method for efficient high-dimensional training by gradient descent.
We prove that MFA training and inference/sampling can be performed based on precision matrices, which does not require matrix inversions after training is completed.
Besides the theoretical analysis, we apply MFA to typical image datasets such as SVHN and MNIST, and demonstrate its ability to perform sample generation and outlier detection.
arXiv Detail & Related papers (2023-08-26T06:12:33Z) - Color Image Recovery Using Generalized Matrix Completion over
Higher-Order Finite Dimensional Algebra [10.10849889917933]
We extend the traditional second-order matrix model to a more comprehensive higher-order matrix equivalent, called the "t-matrix" model.
This "t-matrix" model is then used to extend some commonly used matrix and tensor completion algorithms to their higher-order versions.
arXiv Detail & Related papers (2023-08-04T15:06:53Z) - Sufficient dimension reduction for feature matrices [3.04585143845864]
We propose a method called the principal support matrix machine (PSMM) for matrix sufficient dimension reduction.
Our numerical analysis demonstrates that the PSMM outperforms existing methods and has strong interpretability in real data applications.
arXiv Detail & Related papers (2023-03-07T23:16:46Z) - Multiresolution kernel matrix algebra [0.0]
We show that the compression of kernel matrices by means of samplets produces optimally sparse matrices in a certain S-format.
The inverse of a kernel matrix (if it exists) is compressible in the S-format as well.
The matrix algebra is justified mathematically by pseudodifferential calculus.
arXiv Detail & Related papers (2022-11-21T17:50:22Z) - Quantum algorithms for matrix operations and linear systems of equations [65.62256987706128]
We propose quantum algorithms for matrix operations using the "Sender-Receiver" model.
These quantum protocols can be used as subroutines in other quantum schemes.
arXiv Detail & Related papers (2022-02-10T08:12:20Z) - Approximation of Images via Generalized Higher Order Singular Value
Decomposition over Finite-dimensional Commutative Semisimple Algebra [18.144916444749473]
We consider the problem of generalizing HOSVD over a finite dimensional commutative algebra.
Experiments on publicly available images show that the generalized algorithm over t-scalars, namely THOSVD, compares favorably with its canonical counterparts.
arXiv Detail & Related papers (2022-02-01T15:01:12Z) - Sublinear Time Approximation of Text Similarity Matrices [50.73398637380375]
We introduce a generalization of the popular Nyström method to the indefinite setting.
Our algorithm can be applied to any similarity matrix and runs in sublinear time in the size of the matrix.
We show that our method, along with a simple variant of CUR decomposition, performs very well in approximating a variety of similarity matrices.
arXiv Detail & Related papers (2021-12-17T17:04:34Z) - Robust 1-bit Compressive Sensing with Partial Gaussian Circulant
Matrices and Generative Priors [54.936314353063494]
We provide recovery guarantees for a correlation-based optimization algorithm for robust 1-bit compressive sensing.
We make use of a practical iterative algorithm, and perform numerical experiments on image datasets to corroborate our results.
arXiv Detail & Related papers (2021-08-08T05:28:06Z) - Non-PSD Matrix Sketching with Applications to Regression and
Optimization [56.730993511802865]
We present dimensionality reduction methods for non-PSD and "square-root" matrices.
We show how these techniques can be used for multiple downstream tasks.
arXiv Detail & Related papers (2021-06-16T04:07:48Z) - Optimal Iterative Sketching with the Subsampled Randomized Hadamard
Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and asymptotically improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction (see the toy sketch below).
arXiv Detail & Related papers (2020-02-03T16:17:50Z)
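As a companion to the last entry, here is a toy sketch of iterative sketching for overdetermined least squares with a subsampled randomized Hadamard transform. It is an illustration under stated assumptions (power-of-two row count, numpy/scipy only, arbitrary sketch size and iteration count), not the cited paper's exact algorithm or analysis.

```python
import numpy as np
from scipy.linalg import hadamard

def srht(n, m, rng):
    # m x n subsampled randomized Hadamard transform (n must be a power of two):
    # random signs, orthonormal Hadamard transform, uniform row subsampling.
    D = rng.choice([-1.0, 1.0], size=n)
    H = hadamard(n) / np.sqrt(n)
    rows = rng.choice(n, size=m, replace=False)
    return np.sqrt(n / m) * H[rows] * D

def iterative_sketching_lsq(A, b, m, iters, rng):
    # Iterative (Hessian-)sketching refinement for min ||Ax - b||_2:
    # each step solves a small sketched system but uses the exact gradient.
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        SA = srht(n, m, rng) @ A
        grad = A.T @ (b - A @ x)
        x = x + np.linalg.solve(SA.T @ SA, grad)
    return x

rng = np.random.default_rng(0)
n, d = 1024, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)
x_hat = iterative_sketching_lsq(A, b, m=200, iters=5, rng=rng)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_hat - x_ls))   # shrinks geometrically with each iteration
```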
This list is automatically generated from the titles and abstracts of the papers in this site.