Least-squares methods for nonnegative matrix factorization over rational functions
- URL: http://arxiv.org/abs/2209.12579v1
- Date: Mon, 26 Sep 2022 10:43:47 GMT
- Title: Least-squares methods for nonnegative matrix factorization over rational functions
- Authors: Cécile Hautecoeur, Lieven De Lathauwer, Nicolas Gillis, François Glineur
- Abstract summary: We show that R-NMF has an essentially unique factorization unlike NMF.
We present different approaches to solve R-NMF: the R-HANLS, R-ANLS and R-NLS methods.
We show that R-NMF outperforms NMF in various tasks including the recovery of semi-synthetic continuous signals.
- Score: 17.926628472109556
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nonnegative Matrix Factorization (NMF) models are widely used to recover
linearly mixed nonnegative data. When the data is made of samplings of
continuous signals, the factors in NMF can be constrained to be samples of
nonnegative rational functions, which allow fairly general models; this is
referred to as NMF using rational functions (R-NMF). We first show that, under
mild assumptions, R-NMF has an essentially unique factorization unlike NMF,
which is crucial in applications where ground-truth factors need to be
recovered such as blind source separation problems. Then we present different
approaches to solve R-NMF: the R-HANLS, R-ANLS and R-NLS methods. In our
tests, no method significantly outperforms the others, and a trade-off must
be made between time and accuracy. Indeed, R-HANLS is fast and accurate for
large problems, while R-ANLS is more accurate but also more
resource-demanding, in both time and memory. R-NLS is very accurate but only for small
problems. Moreover, we show that R-NMF outperforms NMF in various tasks
including the recovery of semi-synthetic continuous signals, and a
classification problem of real hyperspectral signals.
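The abstract does not spell out the R-HANLS, R-ANLS and R-NLS updates themselves. As background, plain NMF is commonly fit with the classical Lee-Seung multiplicative updates for the Frobenius objective; the following pure-Python sketch illustrates that baseline only (it is not the paper's rational-function method, and the rank-1 example data is our own):

```python
import random

def nmf_multiplicative(V, r, iters=500, eps=1e-9):
    """Approximate a nonnegative m x n matrix V as W @ H (W: m x r, H: r x n)
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    m, n = len(V), len(V[0])
    rng = random.Random(0)
    # Strictly positive random initialization keeps the updates well-defined.
    W = [[rng.random() + 0.1 for _ in range(r)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(r)]

    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
                 for j in range(len(B[0]))] for i in range(len(A))]

    def transpose(A):
        return [list(row) for row in zip(*A)]

    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H), entrywise
        Wt = transpose(W)
        num, den = matmul(Wt, V), matmul(matmul(Wt, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps)
              for j in range(n)] for i in range(r)]
        # W <- W * (V H^T) / (W H H^T), entrywise
        Ht = transpose(H)
        num, den = matmul(V, Ht), matmul(W, matmul(H, Ht))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps)
              for j in range(r)] for i in range(m)]
    return W, H

# Rank-1 example: V is exactly the outer product [1, 2]^T [1, 2, 3].
V = [[1.0, 2.0, 3.0],
     [2.0, 4.0, 6.0]]
W, H = nmf_multiplicative(V, r=1)
WH = [[sum(W[i][k] * H[k][j] for k in range(1)) for j in range(3)]
      for i in range(2)]
```

Because each update multiplies the current factor entrywise by a nonnegative ratio, nonnegativity is preserved automatically; the small `eps` guards against division by zero.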
Related papers
- RobuNFR: Evaluating the Robustness of Large Language Models on Non-Functional Requirements Aware Code Generation [52.87427601131587]
We propose RobuNFR for evaluating the robustness of LLMs in NFR-aware code generation.
Our experiments show that RobuNFR reveals issues in the tested LLMs when considering NFRs in code generation.
arXiv Detail & Related papers (2025-03-28T20:05:33Z)
- Analyzing Single Cell RNA Sequencing with Topological Nonnegative Matrix Factorization [0.43512163406551996]
Nonnegative matrix factorization (NMF) offers a unique approach due to its meta-gene interpretation of resulting low-dimensional components.
This work introduces two persistent Laplacian regularized NMF methods, namely, topological NMF (TNMF) and robust topological NMF (rTNMF).
By employing a total of 12 datasets, we demonstrate that the proposed TNMF and rTNMF significantly outperform all other NMF-based methods.
arXiv Detail & Related papers (2023-10-24T11:36:41Z)
- Contaminated Images Recovery by Implementing Non-negative Matrix Factorisation [0.0]
We theoretically examine the robustness of the traditional NMF, HCNMF, and L2,1-NMF algorithms and execute sets of experiments to demonstrate the robustness on ORL and Extended YaleB datasets.
Due to the computational cost of these approaches, our final models, the HCNMF and L2,1-NMF models, fail to converge within the scope of this work.
arXiv Detail & Related papers (2022-11-08T13:50:27Z)
- Adaptive Weighted Nonnegative Matrix Factorization for Robust Feature Representation [9.844796520630522]
Nonnegative matrix factorization (NMF) has been widely used for dimensionality reduction in machine learning.
Traditional NMF does not properly handle outliers, so it is sensitive to noise.
This paper proposes an adaptive weighted NMF, which introduces weights to emphasize the different importance of each data point.
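The summary states the weighting mechanism only at a high level. One common way to write such a weighted objective (our notation and formulation, assumed rather than taken from the paper) is

```latex
\min_{W \ge 0,\; H \ge 0} \;\sum_{i=1}^{n} \theta_i \,\lVert x_i - W h_i \rVert_2^2 ,
```

where $x_i$ is the $i$-th data point, $h_i$ its nonnegative encoding, and the weights $\theta_i$ are learned adaptively so that outlying points receive small weight.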
arXiv Detail & Related papers (2022-06-07T05:27:08Z)
- SymNMF-Net for The Symmetric NMF Problem [62.44067422984995]
We propose a neural network called SymNMF-Net for the Symmetric NMF problem.
We show that the inference of each block corresponds to a single iteration of the optimization.
Empirical results on real-world datasets demonstrate the superiority of our SymNMF-Net.
arXiv Detail & Related papers (2022-05-26T08:17:39Z)
- Log-based Sparse Nonnegative Matrix Factorization for Data Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, named the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNF).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z)
- Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that outliers are usually far fewer than normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z)
- Algorithms for Nonnegative Matrix Factorization with the Kullback-Leibler Divergence [20.671178429005973]
Kullback-Leibler (KL) divergence is one of the most widely used objective functions for nonnegative matrix factorization (NMF).
We propose three new algorithms that guarantee the non-increasingness of the objective function.
We conduct extensive numerical experiments to provide a comprehensive picture of the performances of the KL NMF algorithms.
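For context, the objective these algorithms minimize is the standard generalized KL divergence between the data matrix $V$ and its factorization $WH$ (notation ours, not taken from the abstract):

```latex
D_{\mathrm{KL}}(V \,\|\, WH) \;=\; \sum_{i,j} \left( V_{ij} \log \frac{V_{ij}}{(WH)_{ij}} \;-\; V_{ij} \;+\; (WH)_{ij} \right).
```

Unlike the Frobenius objective, this loss is well suited to count-like or Poisson-distributed data.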
arXiv Detail & Related papers (2020-10-05T11:51:39Z)
- Sparse Separable Nonnegative Matrix Factorization [22.679160149512377]
We propose a new variant of nonnegative matrix factorization (NMF).
Separability requires that the columns of the first NMF factor are equal to columns of the input matrix, while sparsity requires that the columns of the second NMF factor are sparse.
We prove that, in noiseless settings and under mild assumptions, our algorithm recovers the true underlying sources.
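Stated in symbols, under conventions common in the separable NMF literature (notation assumed, not from the abstract), separability means the input matrix $X$ can be factored using a subset $\mathcal{K}$ of its own columns:

```latex
X \approx X(:,\mathcal{K})\, H, \qquad H \ge 0,
```

with the columns of $H$ additionally required to be sparse in the proposed variant.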
arXiv Detail & Related papers (2020-06-13T03:52:29Z)
- Modal Regression based Structured Low-rank Matrix Recovery for Multi-view Learning [70.57193072829288]
Low-rank Multi-view Subspace Learning (LMvSL) has shown great potential in cross-view classification in recent years.
Existing LMvSL-based methods cannot handle view discrepancy and discriminancy well simultaneously.
We propose Structured Low-rank Matrix Recovery (SLMR), a unique method of effectively removing view discrepancy and improving discriminancy.
arXiv Detail & Related papers (2020-03-22T03:57:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.