GSVD-NMF: Recovering Missing Features in Non-negative Matrix Factorization
- URL: http://arxiv.org/abs/2408.08260v2
- Date: Wed, 08 Jan 2025 21:12:48 GMT
- Title: GSVD-NMF: Recovering Missing Features in Non-negative Matrix Factorization
- Authors: Youdong Guo, Timothy E. Holy
- Abstract summary: Non-negative matrix factorization (NMF) is an important tool in signal processing and widely used to separate mixed sources into their components. Here we introduce GSVD-NMF, a method that proposes new components based on the generalized singular value decomposition (GSVD) to address discrepancies between the initial under-complete NMF results and the SVD of the original matrix. Simulation and experimental results demonstrate that GSVD-NMF often effectively recovers multiple missing components in under-complete NMF, with the recovered NMF solutions frequently reaching better local optima.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-negative matrix factorization (NMF) is an important tool in signal processing and widely used to separate mixed sources into their components. Algorithms for NMF require that the user choose the number of components in advance, and if the results are unsatisfying one typically needs to start again with a different number of components. To make NMF more interactive and incremental, here we introduce GSVD-NMF, a method that proposes new components based on the generalized singular value decomposition (GSVD) to address discrepancies between the initial under-complete NMF results and the SVD of the original matrix. Simulation and experimental results demonstrate that GSVD-NMF often effectively recovers multiple missing components in under-complete NMF, with the recovered NMF solutions frequently reaching better local optima. The results further show that GSVD-NMF is compatible with various NMF algorithms and that directly augmenting components is more efficient than rerunning NMF from scratch with additional components. By deliberately starting from under-complete NMF, GSVD-NMF has the potential to be a recommended approach for a range of general NMF applications.
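The augmentation idea in the abstract can be illustrated with a small sketch. This is a simplified stand-in, not the paper's method: it proposes one new component from the SVD of the residual of an under-complete fit, rather than from the GSVD-based comparison the paper describes, and all variable names (`w_new`, `h_new`, `W_aug`, etc.) are illustrative.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
# Synthetic nonnegative data with 4 underlying sources
W_true = rng.random((100, 4))
H_true = rng.random((4, 50))
X = W_true @ H_true

# Deliberately under-complete NMF: fit only 3 of the 4 components
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)
H = model.components_
err_under = np.linalg.norm(X - W @ H)

# Propose a new component from the leading singular pair of the residual
R = X - W @ H
u, s, vt = np.linalg.svd(R)
if u[:, 0].sum() < 0:                 # resolve the SVD sign ambiguity
    u[:, 0], vt[0] = -u[:, 0], -vt[0]
w_new = np.clip(np.sqrt(s[0]) * u[:, 0], 0, None)
h_new = np.clip(np.sqrt(s[0]) * vt[0], 0, None)

# Augment the factors and refine from this warm start, instead of
# rerunning 4-component NMF from scratch
W_aug = np.column_stack([W, w_new])
H_aug = np.vstack([H, h_new])
refined = NMF(n_components=4, init="custom", max_iter=500, random_state=0)
W4 = refined.fit_transform(X, W=W_aug, H=H_aug)
err_aug = np.linalg.norm(X - W4 @ refined.components_)
```

On exactly low-rank data like this, the warm-started 4-component refinement typically reaches a lower reconstruction error than the under-complete fit, which is the behavior the abstract reports for the GSVD-based proposal.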
Related papers
- Nonnegative Matrix Factorization in Dimensionality Reduction: A Survey [45.06188379747932]
Dimensionality Reduction plays a pivotal role in improving feature learning accuracy and reducing training time.
Nonnegative Matrix Factorization (NMF) has emerged as a popular and powerful method for dimensionality reduction.
This paper presents a comprehensive survey of NMF, focusing on its applications in both feature extraction and feature selection.
arXiv Detail & Related papers (2024-05-06T16:32:01Z) - Analyzing Single Cell RNA Sequencing with Topological Nonnegative Matrix Factorization [0.43512163406551996]
Nonnegative matrix factorization (NMF) offers a unique approach due to its meta-gene interpretation of resulting low-dimensional components.
This work introduces two persistent Laplacian regularized NMF methods, namely topological NMF (TNMF) and robust topological NMF (rTNMF).
By employing a total of 12 datasets, we demonstrate that the proposed TNMF and rTNMF significantly outperform all other NMF-based methods.
arXiv Detail & Related papers (2023-10-24T11:36:41Z) - Least-squares methods for nonnegative matrix factorization over rational functions [17.926628472109556]
We show that, unlike NMF, R-NMF has an essentially unique factorization.
We present different approaches to solve R-NMF: the R-HANLS, R-ANLS and R-NLS methods.
We show that R-NMF outperforms NMF in various tasks including the recovery of semi-synthetic continuous signals.
arXiv Detail & Related papers (2022-09-26T10:43:47Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - SymNMF-Net for The Symmetric NMF Problem [62.44067422984995]
We propose a neural network called SymNMF-Net for the Symmetric NMF problem.
We show that the inference of each block corresponds to a single iteration of the optimization.
Empirical results on real-world datasets demonstrate the superiority of our SymNMF-Net.
arXiv Detail & Related papers (2022-05-26T08:17:39Z) - Hyperspectral Unmixing Based on Nonnegative Matrix Factorization: A Comprehensive Review [25.50091058791411]
Hyperspectral unmixing estimates a set of endmembers and their corresponding abundances from a hyperspectral image.
Nonnegative matrix factorization (NMF) plays an increasingly significant role in solving this problem.
We show how to improve NMF by utilizing the main properties of HSIs.
arXiv Detail & Related papers (2022-05-20T02:48:43Z) - Log-based Sparse Nonnegative Matrix Factorization for Data Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, named the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z) - On the Relationships between Transform-Learning NMF and Joint-Diagonalization [5.155159655787271]
Non-negative matrix factorization with transform learning (TL-NMF) is a recent idea that aims at learning data representations suited to NMF.
We show that, when the number of data realizations is sufficiently large, TL-NMF can be replaced by a two-step approach.
arXiv Detail & Related papers (2021-12-10T16:52:15Z) - Fast Rank-1 NMF for Missing Data with KL Divergence [8.020742121274417]
The proposed A1GM algorithm minimizes the KL divergence from an input matrix to the reconstructed rank-1 matrix.
We show that A1GM is more efficient than a gradient method with competitive reconstruction errors.
arXiv Detail & Related papers (2021-10-25T02:05:35Z) - Co-Separable Nonnegative Matrix Factorization [20.550794776914508]
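The rank-1 KL problem above has a well-known closed form when no entries are missing: the best rank-1 approximation under (generalized) KL divergence is the outer product of the row sums and column sums divided by the grand total, i.e. the independence model. The sketch below illustrates that background fact only; it is not the A1GM algorithm itself, which additionally handles missing data.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((6, 4)) + 0.1       # strictly positive data, no missing entries

row = X.sum(axis=1)                # row marginals
col = X.sum(axis=0)                # column marginals
X1 = np.outer(row, col) / X.sum()  # optimal rank-1 matrix under KL divergence

def kl_div(A, B):
    # Generalized KL divergence D(A || B) for positive matrices
    return float(np.sum(A * np.log(A / B) - A + B))

base = kl_div(X, X1)
```

Any rescaling or perturbation of these rank-1 factors can only increase the divergence, which is a quick way to sanity-check the closed form numerically.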
Nonnegative matrix factorization (NMF) is a popular model in the field of pattern recognition.
We refer to this NMF as Co-Separable NMF (CoS-NMF).
An optimization model for CoS-NMF is proposed, and an alternated fast gradient method is employed to solve the model.
arXiv Detail & Related papers (2021-09-02T07:05:04Z) - Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importances.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z) - Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that the outliers are usually much less than the normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z) - Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z) - Positive Semidefinite Matrix Factorization: A Connection with Phase Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
arXiv Detail & Related papers (2020-07-24T06:10:19Z) - Sparse Separable Nonnegative Matrix Factorization [22.679160149512377]
We propose a new variant of nonnegative matrix factorization (NMF).
Separability requires that the columns of the first NMF factor are equal to columns of the input matrix, while sparsity requires that the columns of the second NMF factor are sparse.
We prove that, in noiseless settings and under mild assumptions, our algorithm recovers the true underlying sources.
arXiv Detail & Related papers (2020-06-13T03:52:29Z)
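Separability as described in that entry, where some columns of the input matrix are themselves the sources, admits simple greedy recovery algorithms. Below is a minimal sketch of the classical successive projection algorithm (SPA) on noiseless synthetic data; it is a standard baseline for separable NMF, not the sparse variant proposed in the paper above.

```python
import numpy as np

def spa(X, r):
    # Successive Projection Algorithm: greedily select r columns of X that
    # act as the "pure" source columns under the separability assumption.
    R = X.astype(float).copy()
    picked = []
    for _ in range(r):
        j = int(np.argmax((R * R).sum(axis=0)))  # column with largest norm
        picked.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)                  # project out its direction
    return picked

# Noiseless separable data: columns 0-2 of X are the true sources, and the
# remaining columns are convex combinations of them.
rng = np.random.default_rng(2)
W = rng.random((8, 3))
H = np.hstack([np.eye(3), rng.dirichlet(np.ones(3), size=7).T])
X = W @ H
```

In this noiseless setting SPA provably recovers the indices of the pure columns, mirroring the noiseless recovery guarantee stated in the entry above.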
This list is automatically generated from the titles and abstracts of the papers on this site.