Adaptive Weighted Nonnegative Matrix Factorization for Robust Feature
Representation
- URL: http://arxiv.org/abs/2206.03020v1
- Date: Tue, 7 Jun 2022 05:27:08 GMT
- Title: Adaptive Weighted Nonnegative Matrix Factorization for Robust Feature
Representation
- Authors: Tingting Shen, Junhang Li, Can Tong, Qiang He, Chen Li, Yudong Yao,
Yueyang Teng
- Abstract summary: Nonnegative matrix factorization (NMF) has been widely used for dimensionality reduction in machine learning.
Traditional NMF does not properly handle outliers, which makes it sensitive to noise.
This paper proposes an adaptive weighted NMF that introduces weights to reflect the different importance of each data point.
- Score: 9.844796520630522
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nonnegative matrix factorization (NMF) has been widely used for
dimensionality reduction in machine learning. However, traditional NMF does not
properly handle outliers and is therefore sensitive to noise. To improve the
robustness of NMF, this paper proposes an adaptive weighted NMF, which
introduces weights to reflect the different importance of each data point and
thereby decreases the algorithm's sensitivity to noisy data. This is very
different from existing robust NMFs, which rely on slowly growing similarity
measures. Specifically, two strategies are proposed to achieve this: a fuzzier
weighting technique and an entropy-weighted regularization technique, both of
which lead to an iterative solution with a simple form. Experimental results
show that the new methods yield more robust feature representations than
existing methods on several real datasets with noise.
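To make the weighting idea concrete, below is a minimal Python sketch of an
entropy-regularized weighted NMF. It assumes per-sample weights of the form
w_i ∝ exp(-e_i / gamma), where e_i is the reconstruction error of sample i,
combined with standard weighted multiplicative updates; the function name,
parameters, and update rules are illustrative assumptions and are not taken
from the paper.

```python
import numpy as np

def entropy_weighted_nmf(X, k, gamma=1.0, n_iter=200, eps=1e-10, seed=0):
    """Hypothetical sketch of an entropy-regularized weighted NMF.

    Assumed objective (not the paper's exact formulation):
        sum_i w_i * ||x_i - W h_i||^2 + gamma * sum_i w_i * log(w_i),
    with W, H >= 0 and sum_i w_i = 1, where x_i are the columns of X.
    """
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.random((d, k)) + eps   # nonnegative basis matrix
    H = rng.random((k, n)) + eps   # nonnegative coefficient matrix
    w = np.full(n, 1.0 / n)        # per-sample weights, normalized to sum to 1

    for _ in range(n_iter):
        XD = X * w                 # scale each data column x_i by its weight w_i
        HD = H * w                 # scale each coefficient column h_i by w_i
        # Weighted multiplicative (Lee-Seung style) updates for the factors
        W *= (XD @ H.T) / (W @ HD @ H.T + eps)
        H *= (W.T @ XD) / (W.T @ W @ HD + eps)
        # Entropy-regularized weight update: w_i proportional to exp(-e_i / gamma),
        # so samples with large reconstruction error (likely outliers) get small weight
        e = np.sum((X - W @ H) ** 2, axis=0)
        w = np.exp(-(e - e.min()) / gamma)
        w /= w.sum()

    return W, H, w
```

A plausible fuzzier-weighting alternative would instead minimize
sum_i w_i^p * e_i with p > 1 and sum_i w_i = 1, which yields closed-form
weights w_i ∝ e_i^{1/(1-p)}; either way the factorization keeps the simple
iterative structure described in the abstract.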
Related papers
- An optimal pairwise merge algorithm improves the quality and consistency of nonnegative matrix factorization [0.0]
Non-negative matrix factorization (NMF) is a key technique for feature extraction and is widely used in source separation.
Here we show that some of NMF's weaknesses may be mitigated by performing NMF in a higher-dimensional feature space.
Experimental results demonstrate our method helps non-ideal NMF solutions escape to better local optima.
arXiv Detail & Related papers (2024-08-16T20:43:42Z) - Coseparable Nonnegative Tensor Factorization With T-CUR Decomposition [2.013220890731494]
Nonnegative Matrix Factorization (NMF) is an important unsupervised learning method to extract meaningful features from data.
In this work, we provide an alternating selection method to select the coseparable core.
The results demonstrate the efficiency of coseparable NTF when compared to coseparable NMF.
arXiv Detail & Related papers (2024-01-30T09:22:37Z) - Contaminated Images Recovery by Implementing Non-negative Matrix
Factorisation [0.0]
We theoretically examine the robustness of the traditional NMF, HCNMF, and L2,1-NMF algorithms and run sets of experiments to demonstrate their robustness on the ORL and Extended YaleB datasets.
Due to the computational cost of these approaches, our final models, such as the HCNMF and L2,1-NMF model, fail to converge within the parameters of this work.
arXiv Detail & Related papers (2022-11-08T13:50:27Z) - Supervised Class-pairwise NMF for Data Representation and Classification [2.7320863258816512]
Non-negative matrix factorization (NMF)-based methods add new terms to the cost function to adapt the model to specific tasks.
The basic NMF method adopts an unsupervised approach to estimate the factorizing matrices.
arXiv Detail & Related papers (2022-09-28T04:33:03Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - Log-based Sparse Nonnegative Matrix Factorization for Data
Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with a log-norm imposed on the factor matrices to enhance sparseness.
A novel column-wise sparse norm, named the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z) - Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z) - Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that outliers are usually far fewer than the normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z) - Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been shown to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z) - Positive Semidefinite Matrix Factorization: A Connection with Phase
Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
arXiv Detail & Related papers (2020-07-24T06:10:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.