Finding Rule-Interpretable Non-Negative Data Representation
- URL: http://arxiv.org/abs/2206.01483v1
- Date: Fri, 3 Jun 2022 10:20:46 GMT
- Title: Finding Rule-Interpretable Non-Negative Data Representation
- Authors: Matej Mihelčić and Pauli Miettinen
- Abstract summary: We present a version of the NMF approach that merges rule-based descriptions with the advantages of parts-based representation.
The proposed approach provides numerous advantages in tasks such as focused embedding or performing supervised multi-label NMF.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Non-negative Matrix Factorization (NMF) is an intensively used technique for
obtaining parts-based, lower dimensional and non-negative representation of
non-negative data. It is a popular method in different research fields.
Scientists performing research in the fields of biology, medicine and pharmacy
often prefer NMF over other dimensionality reduction approaches (such as PCA)
because the non-negativity of the approach naturally fits the characteristics
of the domain problem and its result is easier to analyze and understand.
Despite these advantages, it can still be hard to obtain an exact characterization
and interpretation of NMF's resulting latent factors due to their numerical
nature. On the other hand, rule-based approaches are often considered more
interpretable but lack the parts-based interpretation. In this work, we present
a version of NMF that merges rule-based descriptions with the advantages of the
parts-based representation offered by NMF. Given the
numerical input data with non-negative entries and a set of rules with high
entity coverage, the approach creates the lower-dimensional non-negative
representation of the input data in such a way that its factors are described
by the appropriate subset of the input rules. In addition to revealing
important attributes for latent factors, it allows analyzing relations between
these attributes and provides the exact numerical intervals or categorical
values they take. The proposed approach provides numerous advantages in tasks
such as focused embedding or performing supervised multi-label NMF.
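To ground the parts-based idea the abstract builds on, here is a minimal plain-NMF sketch using scikit-learn (the rule-interpretable extension the paper proposes is not reproduced here; variable names and sizes are illustrative):

```python
# Minimal NMF sketch: factor a non-negative data matrix X
# (samples x features) into W (samples x components) and
# H (components x features), both constrained to be non-negative.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((100, 20))          # non-negative input data

model = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)         # lower-dimensional representation
H = model.components_              # latent factors ("parts")

# Because both factors are non-negative, each sample is an additive
# combination of parts -- the interpretability property the abstract
# highlights over signed methods such as PCA.
reconstruction = W @ H
```

The paper's contribution is to additionally describe each latent factor by a subset of input rules; the factorization itself is the standard non-negative decomposition shown above.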
Related papers
- Stratified Non-Negative Tensor Factorization [45.439685980328605]
Stratified-NTF can identify interpretable topics with lower memory requirements than Stratified-NMF.
We develop a multiplicative update rule and demonstrate the method on text and image data.
arXiv Detail & Related papers (2024-11-27T23:16:00Z)
- Towards a Fairer Non-negative Matrix Factorization [6.069820038869034]
We investigate how Non-negative Matrix Factorization (NMF) can introduce bias in the representation of data groups.
We present an approach, called Fairer-NMF, that seeks to minimize the maximum reconstruction loss for different groups.
arXiv Detail & Related papers (2024-11-14T23:34:38Z)
- Organic Priors in Non-Rigid Structure from Motion [102.41675461817177]
This paper advocates the use of organic priors in classical non-rigid structure from motion (NRSfM).
The paper's main contribution is to put forward a simple, methodical, and practical method that can effectively exploit such organic priors to solve NRSfM.
arXiv Detail & Related papers (2022-07-13T15:07:50Z)
- Adaptive Weighted Nonnegative Matrix Factorization for Robust Feature Representation [9.844796520630522]
Nonnegative matrix factorization (NMF) has been widely used for dimensionality reduction in machine learning.
Traditional NMF does not handle outliers properly, making it sensitive to noise.
This paper proposes an adaptive weighted NMF, which introduces weights to emphasize the different importance of each data point.
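The weighting idea in this summary can be sketched with plain multiplicative updates. This is not the paper's adaptive scheme: the weights below are fixed by hand, and the update rules are the standard ones for the weighted Frobenius objective sum_i w_i * ||x_i - (W H)_i||^2:

```python
# Weighted NMF sketch (NumPy only): per-row weights w_i let some data
# points (e.g. suspected outliers) count less in the reconstruction.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((60, 15))                      # data, rows = points
w = np.where(np.arange(60) < 50, 1.0, 0.1)   # down-weight last 10 rows

k, eps = 4, 1e-9
W = rng.random((60, k))
H = rng.random((k, 15))

for _ in range(300):
    WX = w[:, None] * X                      # weighted data
    WR = w[:, None] * (W @ H)                # weighted reconstruction
    W *= (WX @ H.T) / (WR @ H.T + eps)       # multiplicative update for W
    WR = w[:, None] * (W @ H)                # refresh after W changed
    H *= (W.T @ WX) / (W.T @ WR + eps)       # multiplicative update for H

weighted_err = np.sum(w[:, None] * (X - W @ H) ** 2)
```

Multiplicative updates keep both factors non-negative by construction; the adaptive version described in the paper would additionally learn `w` from the data.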
arXiv Detail & Related papers (2022-06-07T05:27:08Z)
- Log-based Sparse Nonnegative Matrix Factorization for Data Representation [55.72494900138061]
Nonnegative matrix factorization (NMF) has been widely studied in recent years due to its effectiveness in representing nonnegative data with parts-based representations.
We propose a new NMF method with log-norm imposed on the factor matrices to enhance the sparseness.
A novel column-wise sparse norm, named the $\ell_{2,\log}$-(pseudo) norm, is proposed to enhance the robustness of the proposed method.
arXiv Detail & Related papers (2022-04-22T11:38:10Z)
- Initialization for Nonnegative Matrix Factorization: a Comprehensive Review [0.0]
Non-negative matrix factorization (NMF) has become a popular method for representing meaningful data by extracting a non-negative basis from a non-negative data matrix.
Some numerical results to illustrate the performance of each method are presented.
arXiv Detail & Related papers (2021-09-08T18:49:41Z)
- Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importances.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z)
- Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that the outliers are usually much less than the normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z)
- Algorithms for Nonnegative Matrix Factorization with the Kullback-Leibler Divergence [20.671178429005973]
Kullback-Leibler (KL) divergence is one of the most widely used objective functions for nonnegative matrix factorization (NMF).
We propose three new algorithms that guarantee the non-increasingness of the objective function.
We conduct extensive numerical experiments to provide a comprehensive picture of the performances of the KL NMF algorithms.
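For context on the objective this entry refers to: the classic multiplicative updates of Lee and Seung already keep the KL objective non-increasing, and they are the usual baseline for such comparisons. The paper's three new algorithms are not shown here; this sketch implements only the classic rule:

```python
# Classic multiplicative updates for KL-divergence NMF (Lee & Seung),
# minimizing D(V || WH) = sum( V*log(V/WH) - V + WH ).
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((40, 25)) + 0.1           # strictly positive data

k, eps = 3, 1e-12
W = rng.random((40, k))
H = rng.random((k, 25))

def kl_div(V, WH):
    return np.sum(V * np.log(V / WH) - V + WH)

losses = []
for _ in range(200):
    # W update: numerator (V/WH) H^T, denominator row sums of H
    W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1) + eps)
    # H update: numerator W^T (V/WH), denominator column sums of W
    H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
    losses.append(kl_div(V, W @ H + eps))
```

Each update multiplies by a non-negative ratio, so non-negativity of the factors is preserved automatically, and the recorded `losses` decrease monotonically.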
arXiv Detail & Related papers (2020-10-05T11:51:39Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.