Approximation of Images via Generalized Higher Order Singular Value
Decomposition over Finite-dimensional Commutative Semisimple Algebra
- URL: http://arxiv.org/abs/2202.00450v2
- Date: Thu, 3 Feb 2022 02:48:54 GMT
- Title: Approximation of Images via Generalized Higher Order Singular Value
Decomposition over Finite-dimensional Commutative Semisimple Algebra
- Authors: Liang Liao, Sen Lin, Lun Li, Xiuwei Zhang, Song Zhao, Yan Wang,
Xinqiang Wang, Qi Gao, Jingyu Wang
- Abstract summary: We consider the problem of generalizing HOSVD over a finite dimensional commutative algebra.
Experiments on publicly available images show that the generalized algorithm over t-scalars, namely THOSVD, compares favorably with its canonical counterparts.
- Score: 18.144916444749473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low-rank approximation of images via singular value decomposition is
well-received in the era of big data. However, singular value decomposition
(SVD) is only for order-two data, i.e., matrices. It is necessary to flatten a
higher order input into a matrix or break it into a series of order-two slices
to tackle higher order data such as multispectral images and videos with the
SVD. Higher order singular value decomposition (HOSVD) extends the SVD and can
approximate higher order data using sums of a few rank-one components. We
consider the problem of generalizing HOSVD over a finite dimensional
commutative algebra. This algebra, referred to as a t-algebra, generalizes the
field of complex numbers. The elements of the algebra, called t-scalars, are
fixed-size arrays of complex numbers. One can generalize matrices and tensors
over t-scalars and then extend many canonical matrix and tensor algorithms,
including HOSVD, to obtain higher-performance versions. The generalization of
HOSVD is called THOSVD. Its performance in approximating multi-way data can be
further improved by an alternating algorithm. THOSVD also unifies a wide range
of principal component analysis algorithms. To exploit the potential of
generalized algorithms using t-scalars for approximating images, we use a pixel
neighborhood strategy to convert each pixel into a "deeper-order" t-scalar.
Experiments on publicly available images show that the generalized algorithm
over t-scalars, namely THOSVD, compares favorably with its canonical
counterparts.
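
As a rough illustration of the two ingredients in the abstract, the sketch below (a) builds the pixel-neighborhood "deeper-order" t-scalar image and (b) runs a canonical truncated HOSVD, the baseline the paper generalizes. This is a minimal NumPy sketch under stated assumptions: the patch size, the function names, and the plain-tensor HOSVD baseline are illustrative choices, not the authors' THOSVD implementation.

```python
# Minimal sketch (not the authors' THOSVD code): pixel-neighborhood t-scalar
# construction plus a canonical truncated HOSVD baseline.
# Assumptions: patch size k=3, reflect padding, and a plain NumPy 3-way tensor
# for the HOSVD step; all names are illustrative.
import numpy as np


def pixels_to_tscalars(image, k=3):
    """Map each pixel of a 2-D image to its k x k neighborhood, giving an
    (H, W, k, k) array whose last two axes act as a "deeper-order" t-scalar."""
    pad = k // 2
    padded = np.pad(image, pad, mode="reflect")
    H, W = image.shape
    out = np.empty((H, W, k, k), dtype=image.dtype)
    for i in range(H):
        for j in range(W):
            out[i, j] = padded[i:i + k, j:j + k]
    return out


def unfold(tensor, mode):
    """Mode-n unfolding: bring axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)


def mode_product(tensor, matrix, mode):
    """Multiply `tensor` by `matrix` along the given mode."""
    moved = np.moveaxis(tensor, mode, 0)
    return np.moveaxis(np.tensordot(matrix, moved, axes=1), 0, mode)


def truncated_hosvd(tensor, ranks):
    """Canonical truncated HOSVD: leading left singular vectors per mode,
    then the core tensor obtained by projecting onto them."""
    factors = [np.linalg.svd(unfold(tensor, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = tensor
    for m, U in enumerate(factors):
        core = mode_product(core, U.T, m)
    return core, factors


def reconstruct(core, factors):
    """Map the core back to the original space along every mode."""
    out = core
    for m, U in enumerate(factors):
        out = mode_product(out, U, m)
    return out


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    tscalar_img = pixels_to_tscalars(img, k=3)     # (64, 64, 3, 3)
    X = rng.random((32, 32, 8))                    # e.g. a small spectral cube
    core, factors = truncated_hosvd(X, (10, 10, 4))
    X_hat = reconstruct(core, factors)
    rel_err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
    print(tscalar_img.shape, X_hat.shape, round(rel_err, 3))
```

Roughly speaking, the paper's THOSVD replaces the complex scalars in these steps with t-scalars (the k x k neighborhoods above), so that the per-mode SVD itself is carried out over the t-algebra rather than over the complex field.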
Related papers
- An Efficient Algorithm for Clustered Multi-Task Compressive Sensing [60.70532293880842]
Clustered multi-task compressive sensing is a hierarchical model that solves multiple compressive sensing tasks.
The existing inference algorithm for this model is computationally expensive and does not scale well in high dimensions.
We propose a new algorithm that substantially accelerates model inference by avoiding the need to explicitly compute these covariance matrices.
arXiv Detail & Related papers (2023-09-30T15:57:14Z)
- Color Image Recovery Using Generalized Matrix Completion over Higher-Order Finite Dimensional Algebra [10.10849889917933]
We extend the traditional second-order matrix model to a more comprehensive higher-order matrix equivalent, called the "t-matrix" model.
This "t-matrix" model is then used to extend some commonly used matrix and tensor completion algorithms to their higher-order versions.
arXiv Detail & Related papers (2023-08-04T15:06:53Z)
- High-Dimensional Sparse Bayesian Learning without Covariance Matrices [66.60078365202867]
We introduce a new inference scheme that avoids explicit construction of the covariance matrix.
Our approach couples a little-known diagonal estimation result from numerical linear algebra with the conjugate gradient algorithm.
On several simulations, our method scales better than existing approaches in computation time and memory.
arXiv Detail & Related papers (2022-02-25T16:35:26Z)
- Sublinear Time Approximation of Text Similarity Matrices [50.73398637380375]
We introduce a generalization of the popular Nyström method to the indefinite setting.
Our algorithm can be applied to any similarity matrix and runs in sublinear time in the size of the matrix.
We show that our method, along with a simple variant of CUR decomposition, performs very well in approximating a variety of similarity matrices.
arXiv Detail & Related papers (2021-12-17T17:04:34Z)
- Non-PSD Matrix Sketching with Applications to Regression and Optimization [56.730993511802865]
We present dimensionality reduction methods for non-PSD and "square-root" matrices.
We show how these techniques can be used for multiple downstream tasks.
arXiv Detail & Related papers (2021-06-16T04:07:48Z)
- A generalization of the randomized singular value decomposition [2.538209532048867]
We generalize the theory of randomized SVD to multivariable Gaussian vectors, allowing one to incorporate prior knowledge of $A$ into the algorithm.
We construct a new covariance kernel for GPs, based on weighted Jacobi algorithms, which allows us to rapidly sample the GP and control the smoothness of the randomly generated functions.
arXiv Detail & Related papers (2021-05-27T10:39:37Z)
- Robust Differentiable SVD [117.35644933471401]
Eigendecomposition of symmetric matrices is at the heart of many computer vision algorithms.
Instability arises in the presence of eigenvalues that are close to each other.
We show that the Taylor expansion of the SVD gradient is theoretically equivalent to the gradient obtained using PI without relying on an iterative process.
arXiv Detail & Related papers (2021-04-08T15:04:15Z)
- Fast differentiable evolution of quantum states under Gaussian transformations [6.737752058029072]
We present a faster algorithm that computes the final state without having to generate the full transformation matrix first.
We benchmark our algorithm by optimizing circuits to produce single photons, Gottesman-Kitaev-Preskill states and NOON states.
arXiv Detail & Related papers (2021-02-10T21:22:19Z)
- Ranky : An Approach to Solve Distributed SVD on Large Sparse Matrices [0.0]
Singular Value Decomposition (SVD) is a well studied research topic in many fields and applications.
We propose Ranky, set of methods to solve rank problem on large and sparse matrices in a distributed manner.
arXiv Detail & Related papers (2020-09-21T11:36:28Z)
- Optimal Iterative Sketching with the Subsampled Randomized Hadamard Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and asymptotically improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction.
arXiv Detail & Related papers (2020-02-03T16:17:50Z)
- Generalized Visual Information Analysis via Tensorial Algebra [7.028302194243312]
Higher order data is modeled using matrices whose entries are numerical arrays of a fixed size.
Matrices with elements in the ring of t-scalars are referred to as t-matrices.
With the t-matrix model, it is possible to generalize many well-known matrix algorithms.
arXiv Detail & Related papers (2020-01-31T08:47:35Z)
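
The t-matrix entry above, like the main abstract, builds on t-scalars: fixed-size arrays that multiply like scalars. The sketch below shows one common realization of such a commutative algebra, assuming the t-scalar product is 2-D circular convolution (equivalently, element-wise multiplication in the 2-D Fourier domain); the function names and this specific choice of algebra are illustrative assumptions, not taken from the papers.

```python
# Illustrative sketch only (assumption: t-scalar product = 2-D circular
# convolution, i.e. an element-wise product in the 2-D Fourier domain).
import numpy as np

def tscalar_mul(a, b):
    """Multiply two t-scalars (k x k complex arrays) by circular convolution,
    computed as an element-wise product after a 2-D FFT."""
    return np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b))

def tmatrix_mul(A, B):
    """Multiply two t-matrices A (m x n) and B (n x p) whose entries are
    k x k t-scalars, stored as arrays of shape (m, n, k, k) and (n, p, k, k)."""
    assert A.shape[1] == B.shape[0]
    Af = np.fft.fft2(A)               # FFT over the last two (t-scalar) axes
    Bf = np.fft.fft2(B)
    # Ordinary matrix product, carried out independently at each Fourier bin.
    Cf = np.einsum('inkl,njkl->ijkl', Af, Bf)
    return np.fft.ifft2(Cf)

# Example: a 4x3 t-matrix times a 3x2 t-matrix with 3x3 t-scalar entries.
rng = np.random.default_rng(0)
A = rng.random((4, 3, 3, 3))
B = rng.random((3, 2, 3, 3))
C = tmatrix_mul(A, B)                 # shape (4, 2, 3, 3)
print(C.shape)
```

In this construction the 2-D Fourier transform splits the algebra into independent complex "bins", which is what makes it commutative and semisimple and lets ordinary matrix algorithms (SVD, HOSVD, completion) run bin-by-bin over a t-matrix.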