An optimal pairwise merge algorithm improves the quality and consistency of nonnegative matrix factorization
- URL: http://arxiv.org/abs/2408.09013v2
- Date: Mon, 28 Oct 2024 20:05:08 GMT
- Title: An optimal pairwise merge algorithm improves the quality and consistency of nonnegative matrix factorization
- Authors: Youdong Guo, Timothy E. Holy
- Abstract summary: Non-negative matrix factorization (NMF) is a key technique for feature extraction and is widely used in source separation, but it can converge to poor or inconsistent local minima.
Here we show that some of these weaknesses may be mitigated by performing NMF in a higher-dimensional feature space and then iteratively merging components.
Experimental results demonstrate that our method helps non-ideal NMF solutions escape to better local optima.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-negative matrix factorization (NMF) is a key technique for feature extraction and is widely used in source separation. However, existing algorithms may converge to poor local minima, or to one of several minima with similar objective value but differing feature parametrizations. Here we show that some of these weaknesses may be mitigated by performing NMF in a higher-dimensional feature space and then iteratively combining components with an analytically-solvable pairwise merge strategy. Experimental results demonstrate that our method helps non-ideal NMF solutions escape to better local optima and achieve greater consistency of the solutions. Despite these extra steps, our approach exhibits computational performance similar to established methods because it reduces the occurrence of the "plateau phenomenon" near saddle points. Moreover, the results illustrate that our method is compatible with different NMF algorithms. Thus, it can be recommended as a preferred approach for most applications of NMF.
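As a rough sketch of the recipe described in the abstract (over-complete factorization followed by iterative pairwise merging), the Python snippet below over-factorizes with scikit-learn's NMF and then greedily merges the pair of components whose combined rank-2 contribution is best captured by a single rank-1 term. The function name, the `extra` parameter, and the merge criterion are illustrative assumptions, not the authors' exact analytically-solvable merge rule.

```python
# Minimal sketch (not the paper's exact algorithm): run NMF with more
# components than the target rank, then greedily merge component pairs
# until the target rank is reached. Each candidate pair (i, j) is scored
# by how well its rank-2 contribution W[:, [i, j]] @ H[[i, j], :] is
# approximated by a single rank-1 term (its leading singular pair, which
# is nonnegative for a nonnegative block up to a joint sign flip).
import numpy as np
from sklearn.decomposition import NMF


def overcomplete_nmf_then_merge(X, target_rank, extra=5, seed=0):
    k = target_rank + extra  # factorize in a higher-dimensional feature space
    model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=seed)
    W = model.fit_transform(X)   # X ~= W @ H
    H = model.components_

    while W.shape[1] > target_rank:
        best = None
        for i in range(W.shape[1]):
            for j in range(i + 1, W.shape[1]):
                block = W[:, [i, j]] @ H[[i, j], :]       # rank-2 contribution
                u, s, vt = np.linalg.svd(block, full_matrices=False)
                w_new = np.abs(u[:, 0]) * np.sqrt(s[0])   # nonnegative rank-1 refit
                h_new = np.abs(vt[0, :]) * np.sqrt(s[0])
                loss = np.linalg.norm(block - np.outer(w_new, h_new))
                if best is None or loss < best[0]:
                    best = (loss, i, j, w_new, h_new)
        _, i, j, w_new, h_new = best
        keep = [c for c in range(W.shape[1]) if c not in (i, j)]
        W = np.column_stack([W[:, keep], w_new])
        H = np.vstack([H[keep, :], h_new])
    return W, H
```

In practice one would likely run a few additional NMF iterations after each merge (or after reaching the target rank) so the remaining components can readjust; the exhaustive pair search above is quadratic in the number of components and is kept only for clarity.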
Related papers
- Optimizing Solution-Samplers for Combinatorial Problems: The Landscape of Policy-Gradient Methods [52.0617030129699]
We introduce a novel theoretical framework for analyzing the effectiveness of DeepMatching Networks and Reinforcement Learning methods.
Our main contribution holds for a broad class of problems including Max- and Min-Cut, Max-$k$-Bipartite-Bi, Maximum-Weight-Bipartite-Bi, and the Traveling Salesman Problem.
As a byproduct of our analysis we introduce a novel regularization process over vanilla descent and provide theoretical and experimental evidence that it helps address vanishing-gradient issues and escape bad stationary points.
arXiv Detail & Related papers (2023-10-08T23:39:38Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling the differentiable and non-differentiable parts separately, linearizing only the smooth components.
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - Contaminated Images Recovery by Implementing Non-negative Matrix Factorisation [0.0]
We theoretically examine the robustness of the traditional NMF, HCNMF, and L2,1-NMF algorithms and run sets of experiments to demonstrate their robustness on the ORL and Extended YaleB datasets.
Due to the computational cost of these approaches, our final models, such as the HCNMF and L2,1-NMF models, fail to converge within the scope of this work.
arXiv Detail & Related papers (2022-11-08T13:50:27Z) - Supervised Class-pairwise NMF for Data Representation and Classification [2.7320863258816512]
Non-negative matrix factorization (NMF)-based methods add new terms to the cost function to adapt the model to specific tasks.
The standard NMF method adopts an unsupervised approach to estimate the factorizing matrices.
arXiv Detail & Related papers (2022-09-28T04:33:03Z) - Adaptive Weighted Nonnegative Matrix Factorization for Robust Feature Representation [9.844796520630522]
Nonnegative matrix factorization (NMF) has been widely used for dimensionality reduction in machine learning.
Traditional NMF does not handle outliers properly and is therefore sensitive to noise.
This paper proposes an adaptive weighted NMF, which introduces weights to emphasize the different importance of each data point.
arXiv Detail & Related papers (2022-06-07T05:27:08Z) - Log-based Sparse Nonnegative Matrix Factorization for Data Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z) - Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
In this paper, we propose Feature Weighted Non-negative Matrix Factorization (FNMF).
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z) - Positive Semidefinite Matrix Factorization: A Connection with Phase Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
arXiv Detail & Related papers (2020-07-24T06:10:19Z) - Follow the bisector: a simple method for multi-objective optimization [65.83318707752385]
We consider optimization problems where multiple differentiable losses have to be minimized.
The presented method computes a descent direction at every iteration that guarantees an equal relative decrease of the objective functions.
arXiv Detail & Related papers (2020-07-14T09:50:33Z) - Sparse Separable Nonnegative Matrix Factorization [22.679160149512377]
We propose a new variant of nonnegative matrix factorization (NMF) that combines separability and sparsity constraints.
Separability requires that the columns of the first NMF factor are equal to columns of the input matrix, while sparsity requires that the columns of the second NMF factor are sparse.
We prove that, in noiseless settings and under mild assumptions, our algorithm recovers the true underlying sources.
arXiv Detail & Related papers (2020-06-13T03:52:29Z)