On the Relationships between Transform-Learning NMF and
Joint-Diagonalization
- URL: http://arxiv.org/abs/2112.05664v1
- Date: Fri, 10 Dec 2021 16:52:15 GMT
- Title: On the Relationships between Transform-Learning NMF and
Joint-Diagonalization
- Authors: Sixin Zhang, Emmanuel Soubies, and Cédric Févotte
- Abstract summary: Non-negative matrix factorization with transform learning (TL-NMF) is a recent idea that aims at learning data representations suited to NMF.
We show that, when the number of data realizations is sufficiently large, TL-NMF can be replaced by a two-step approach.
- Score: 5.155159655787271
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Non-negative matrix factorization with transform learning (TL-NMF) is a
recent idea that aims at learning data representations suited to NMF. In this
work, we relate TL-NMF to the classical matrix joint-diagonalization (JD)
problem. We show that, when the number of data realizations is sufficiently
large, TL-NMF can be replaced by a two-step approach -- termed JD+NMF --
that estimates the transform through JD, prior to NMF computation. In contrast,
we found that when the number of data realizations is limited, not only is
JD+NMF no longer equivalent to TL-NMF, but the inherent low-rank constraint of
TL-NMF turns out to be an essential ingredient to learn meaningful transforms
for NMF.
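As a rough illustration of the first step of the JD+NMF pipeline described above, the sketch below approximates joint diagonalization by diagonalizing the average of the per-realization covariance matrices. This is a simplification: it is exact only when all covariances share a common eigenbasis (as in the synthetic data here), whereas practical JD solvers iterate (e.g. with Jacobi rotations). All variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
M, R = 8, 20

# Covariances sharing a common orthogonal eigenbasis Q (the exact-JD case).
Q, _ = np.linalg.qr(rng.standard_normal((M, M)))
lambdas = [np.arange(1, M + 1) + rng.random(M) for _ in range(R)]
Cs = [Q @ np.diag(l) @ Q.T for l in lambdas]

# Approximate JD: diagonalize the average covariance.
_, Phi = np.linalg.eigh(sum(Cs) / R)
Phi = Phi.T  # rows = learned transform atoms

# Off-diagonal mass left in each transformed covariance (0 = perfect JD).
off = sum(np.abs(P - np.diag(np.diag(P))).sum()
          for P in (Phi @ C @ Phi.T for C in Cs))
diag = sum(np.abs(np.diag(Phi @ C @ Phi.T)).sum() for C in Cs)
print(off / diag)  # ~0 here; step 2 would then run NMF on the transformed data
```

In the limited-data regime the abstract warns about, the average covariance no longer pins down a good common transform, which is where the coupling of transform learning with the low-rank NMF constraint becomes essential.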
Related papers
- Beyond Low-rankness: Guaranteed Matrix Recovery via Modified Nuclear Norm [52.00038315973684]
The nuclear norm (NN) has been widely explored in matrix recovery problems, such as Robust PCA and matrix completion.
We introduce a new modified nuclear norm (MNN) framework, where the MNN family norms are defined by adopting suitable transformations and performing the NN on the transformed matrix.
arXiv Detail & Related papers (2025-07-24T11:53:55Z)
- Stratified Non-Negative Tensor Factorization [45.439685980328605]
Stratified-NTF can identify interpretable topics with lower memory requirements than Stratified-NMF.
We develop a multiplicative update rule and demonstrate the method on text and image data.
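For context on the multiplicative update rule mentioned above, here is the classical Lee–Seung update for Frobenius-norm NMF, which rules like the stratified variants generalize. This is a generic textbook sketch, not the paper's exact update; the function name and parameters are illustrative.

```python
import numpy as np

def nmf_mu(V, k, iters=500, eps=1e-12, seed=0):
    """Factor nonnegative V (m x n) as W @ H via multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # H <- H .* (W'V) / (W'WH)
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # W <- W .* (VH') / (WHH')
    return W, H

# Exactly rank-2 nonnegative data should be fit almost perfectly.
rng = np.random.default_rng(1)
V = rng.random((10, 2)) @ rng.random((2, 8))
W, H = nmf_mu(V, 2)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))  # small relative residual
```

The elementwise ratio form keeps W and H nonnegative at every iteration, which is why multiplicative rules are a common starting point for NMF and NTF variants.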
arXiv Detail & Related papers (2024-11-27T23:16:00Z)
- GSVD-NMF: Recovering Missing Features in Non-negative Matrix Factorization [0.0]
We introduce GSVD-NMF, which proposes new components based on the generalized singular value decomposition (GSVD) between preliminary NMF results and the SVD of the original matrix.
We show that GSVD-NMF often recovers missing features from under-complete NMF and helps NMF achieve better local optima.
arXiv Detail & Related papers (2024-08-15T17:01:00Z)
- Stratified-NMF for Heterogeneous Data [8.174199227297514]
We propose a modified NMF objective, Stratified-NMF, that simultaneously learns strata-dependent statistics and a shared topics matrix.
We apply our method to three real world datasets and empirically investigate their learned features.
arXiv Detail & Related papers (2023-11-17T00:34:41Z)
- SymNMF-Net for The Symmetric NMF Problem [62.44067422984995]
We propose a neural network called SymNMF-Net for the Symmetric NMF problem.
We show that the inference of each block corresponds to a single iteration of the optimization.
Empirical results on real-world datasets demonstrate the superiority of our SymNMF-Net.
arXiv Detail & Related papers (2022-05-26T08:17:39Z)
- Log-based Sparse Nonnegative Matrix Factorization for Data Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, named the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z)
- Co-Separable Nonnegative Matrix Factorization [20.550794776914508]
Nonnegative matrix factorization (NMF) is a popular model in the field of pattern recognition.
We refer to this NMF as Co-Separable NMF (CoS-NMF).
An optimization model for CoS-NMF is proposed, and an alternating fast gradient method is employed to solve it.
arXiv Detail & Related papers (2021-09-02T07:05:04Z)
- Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z)
- Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that outliers are usually much rarer than normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z)
- Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z)
- Fast and Secure Distributed Nonnegative Matrix Factorization [13.672004396034856]
Nonnegative matrix factorization (NMF) has been successfully applied in several data mining tasks.
We study the acceleration and security problems of distributed NMF.
arXiv Detail & Related papers (2020-09-07T01:12:20Z)
- Positive Semidefinite Matrix Factorization: A Connection with Phase Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
arXiv Detail & Related papers (2020-07-24T06:10:19Z)
- Sparse Separable Nonnegative Matrix Factorization [22.679160149512377]
We propose a new variant of nonnegative matrix factorization (NMF).
Separability requires that the columns of the first NMF factor are equal to columns of the input matrix, while sparsity requires that the columns of the second NMF factor are sparse.
We prove that, in noiseless settings and under mild assumptions, our algorithm recovers the true underlying sources.
arXiv Detail & Related papers (2020-06-13T03:52:29Z)
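The separability condition in the last entry -- columns of the first NMF factor equal to columns of the input matrix -- is classically exploited by the successive projection algorithm (SPA), which greedily picks those columns. The sketch below is a generic SPA illustration under that condition, not the sparse variant proposed in the paper; all names are illustrative.

```python
import numpy as np

def spa(V, k):
    """Greedily pick k column indices of V by residual norm (SPA)."""
    R = V.astype(float).copy()
    idx = []
    for _ in range(k):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        idx.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)  # project out the chosen direction
    return idx

# Separable synthetic data: V = W H, with W's columns hidden inside V
# (the identity block in H makes columns 0..2 of V equal to W's columns,
# and the remaining columns convex combinations of them).
rng = np.random.default_rng(1)
W = rng.random((6, 3))
D = rng.random((3, 7))
H = np.hstack([np.eye(3), D / D.sum(axis=0)])
V = W @ H
print(sorted(spa(V, 3)))  # should recover columns 0, 1, 2
```

In the noiseless separable case with a full-column-rank W, SPA provably recovers the indices of the pure columns exactly, which matches the recovery guarantee flavor stated in the abstract.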
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.