Co-Separable Nonnegative Matrix Factorization
- URL: http://arxiv.org/abs/2109.00749v1
- Date: Thu, 2 Sep 2021 07:05:04 GMT
- Title: Co-Separable Nonnegative Matrix Factorization
- Authors: Junjun Pan and Michael K. Ng
- Abstract summary: Nonnegative matrix factorization (NMF) is a popular model in the field of pattern recognition.
We refer to this NMF as a Co-Separable NMF (CoS-NMF).
An optimization model for CoS-NMF is proposed, and an alternating fast gradient method is employed to solve the model.
- Score: 20.550794776914508
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Nonnegative matrix factorization (NMF) is a popular model in the field of
pattern recognition. It aims to find a low-rank approximation of nonnegative
data M as a product of two nonnegative matrices W and H. In general, NMF is
NP-hard to solve, but it can be solved efficiently under the separability
assumption, which requires that the columns of the factor matrix W are equal to
columns of the input matrix. In this paper, we generalize the separability
assumption to the 3-factor NMF M = P_1 S P_2, and require that S is a
sub-matrix of the input matrix. We refer to this NMF as Co-Separable NMF
(CoS-NMF). We discuss some mathematical properties of CoS-NMF and present its
relationships with other related matrix factorizations such as the CUR
decomposition, generalized separable NMF (GS-NMF), and bi-orthogonal
tri-factorization (BiOR-NM3F). An optimization model for CoS-NMF is proposed,
and an alternating fast gradient method is employed to solve it. Numerical
experiments on synthetic datasets, document datasets, and facial databases are
conducted to verify the effectiveness of our CoS-NMF model. Compared to
state-of-the-art methods, the CoS-NMF model performs very well on the
co-clustering task and also preserves a good approximation to the input data matrix.
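To make the co-separable structure concrete, the following NumPy sketch builds a small synthetic matrix satisfying M = P_1 S P_2, where S is literally a sub-matrix of M (the rows and columns picked out by the selected index sets). The sizes, index sets, and variable names are illustrative assumptions and are not taken from the paper or its code.

```python
# A minimal sketch (not the authors' code) of the co-separable structure
# M = P1 @ S @ P2, where S = M[rows][:, cols] is a sub-matrix of M itself.
# All sizes and index sets below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
m, n, r1, r2 = 8, 10, 3, 4                               # assumed: M is m x n, S is r1 x r2

# Build a synthetic co-separable M: embedding the identity in P1 and P2
# guarantees that the first r1 rows and first r2 columns of M reproduce S.
S = rng.random((r1, r2))
P1 = np.vstack([np.eye(r1), rng.random((m - r1, r1))])   # m x r1, contains I_{r1}
P2 = np.hstack([np.eye(r2), rng.random((r2, n - r2))])   # r2 x n, contains I_{r2}
M = P1 @ S @ P2                                          # m x n, co-separable by construction

rows, cols = np.arange(r1), np.arange(r2)                # selected row/column indices
S_sub = M[np.ix_(rows, cols)]                            # S recovered as a sub-matrix of M

print(np.allclose(S_sub, S))                             # True: S sits inside M
print(np.allclose(P1 @ S_sub @ P2, M))                   # True: M = P1 S P2
```

In practice the row and column index sets are unknown; the paper's optimization model is designed to identify such a sub-matrix S from data, whereas this sketch only verifies the structure on data constructed to satisfy it.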
Related papers
- Coseparable Nonnegative Tensor Factorization With T-CUR Decomposition [2.013220890731494]
Nonnegative Matrix Factorization (NMF) is an important unsupervised learning method to extract meaningful features from data.
In this work, we provide an alternating selection method to select the coseparable core.
The results demonstrate the efficiency of coseparable NTF when compared to coseparable NMF.
arXiv Detail & Related papers (2024-01-30T09:22:37Z) - Stratified-NMF for Heterogeneous Data [8.174199227297514]
We propose a modified NMF objective, Stratified-NMF, that simultaneously learns strata-dependent statistics and a shared topics matrix.
We apply our method to three real world datasets and empirically investigate their learned features.
arXiv Detail & Related papers (2023-11-17T00:34:41Z) - Contaminated Images Recovery by Implementing Non-negative Matrix
Factorisation [0.0]
We theoretically examine the robustness of the traditional NMF, HCNMF, and L2,1-NMF algorithms and conduct sets of experiments to demonstrate their robustness on the ORL and Extended YaleB datasets.
Due to the computational cost of these approaches, our final models, such as the HCNMF and L2,1-NMF models, fail to converge within the parameters of this work.
arXiv Detail & Related papers (2022-11-08T13:50:27Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - Log-based Sparse Nonnegative Matrix Factorization for Data
Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, named the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z) - Fast Rank-1 NMF for Missing Data with KL Divergence [8.020742121274417]
A1GM minimizes the KL divergence from an input matrix to the reconstructed rank-1 matrix.
We show that A1GM is more efficient than a gradient method while achieving competitive reconstruction errors.
arXiv Detail & Related papers (2021-10-25T02:05:35Z) - Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z) - Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that outliers are usually far fewer than normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z) - Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity to initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z) - Positive Semidefinite Matrix Factorization: A Connection with Phase
Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
arXiv Detail & Related papers (2020-07-24T06:10:19Z) - Sparse Separable Nonnegative Matrix Factorization [22.679160149512377]
We propose a new variant of nonnegative matrix factorization (NMF).
Separability requires that the columns of the first NMF factor are equal to columns of the input matrix, while sparsity requires that the columns of the second NMF factor are sparse (a short sketch contrasting this separability assumption with the co-separability used above follows this list).
We prove that, in noiseless settings and under mild assumptions, our algorithm recovers the true underlying sources.
arXiv Detail & Related papers (2020-06-13T03:52:29Z)
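As referenced in the Sparse Separable NMF entry above, here is a minimal NumPy sketch (an assumed illustration, not code from any of the papers listed) of the classical separability assumption: the first factor W consists of columns of the input matrix itself, M = W H with W = M[:, cols]. Co-separability, as sketched earlier, additionally selects rows so that the core S is a sub-matrix of M.

```python
# A minimal sketch (assumed, not from the papers above) of separable NMF:
# M = W @ H, where W equals a subset of the columns of M.
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 6, 9, 3                                    # illustrative sizes

W = rng.random((m, r))
H = np.hstack([np.eye(r), rng.random((r, n - r))])   # H contains I_r, so the
M = W @ H                                            # first r columns of M equal W

cols = np.arange(r)                                  # indices of the "pure" columns
print(np.allclose(M[:, cols], W))                    # True: W is a column sub-matrix of M
```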