Unitary Approximate Message Passing for Matrix Factorization
- URL: http://arxiv.org/abs/2208.00422v1
- Date: Sun, 31 Jul 2022 12:09:32 GMT
- Title: Unitary Approximate Message Passing for Matrix Factorization
- Authors: Zhengdao Yuan, Qinghua Guo, Yonina C. Eldar, Yonghui Li
- Abstract summary: We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We consider matrix factorization (MF) with certain constraints, which finds
wide applications in various areas. Leveraging variational inference (VI) and
unitary approximate message passing (UAMP), we develop a Bayesian approach to
MF with an efficient message passing implementation, called UAMPMF. With proper
priors imposed on the factor matrices, UAMPMF can be used to solve many
problems that can be formulated as MF, such as nonnegative matrix
factorization, dictionary learning, compressive sensing with matrix
uncertainty, robust principal component analysis, and sparse matrix
factorization. Extensive numerical examples are provided to show that UAMPMF
significantly outperforms state-of-the-art algorithms in terms of recovery
accuracy, robustness and computational complexity.
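To make the constrained-MF setting concrete: the paper's UAMPMF algorithm itself (variational inference combined with unitary AMP) is not reproduced here, but one of the listed special cases, nonnegative matrix factorization, can be sketched with the classic Lee-Seung multiplicative updates. This is an illustrative baseline only, not the authors' method; the function name and parameters are chosen for this example.

```python
import numpy as np

def nmf_multiplicative(Y, rank, n_iter=200, eps=1e-10, seed=0):
    """Factorize Y ~= W @ H with W, H >= 0 (Frobenius-norm objective),
    using Lee-Seung multiplicative updates. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    # Nonnegative random initialization (eps avoids exact zeros).
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Each update multiplies by a nonnegative ratio, so the
        # nonnegativity constraint is preserved automatically.
        H *= (W.T @ Y) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (Y @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

# Example: recover an exactly rank-3 nonnegative matrix.
rng = np.random.default_rng(1)
Y = rng.random((20, 3)) @ rng.random((3, 15))
W, H = nmf_multiplicative(Y, rank=3)
rel_err = np.linalg.norm(Y - W @ H) / np.linalg.norm(Y)
print(f"relative reconstruction error: {rel_err:.3e}")
```

Message-passing approaches such as UAMPMF target the same factorization objective but infer posterior distributions over the factors under their priors, rather than running these point-estimate alternating updates.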
Related papers
- Exponentially Convergent Algorithms for Supervised Matrix Factorization [2.1485350418225244]
Supervised matrix factorization (SMF) is a machine learning method that simultaneously performs feature extraction and classification tasks.
Our paper provides a novel framework that 'lifts' SMF as a low-rank matrix estimation problem in a combined factor space.
arXiv Detail & Related papers (2023-11-18T23:24:02Z) - Large-scale gradient-based training of Mixtures of Factor Analyzers [67.21722742907981]
This article contributes both a theoretical analysis as well as a new method for efficient high-dimensional training by gradient descent.
We prove that MFA training and inference/sampling can be performed based on precision matrices, which does not require matrix inversions after training is completed.
Beyond the theoretical analysis, we apply MFA to standard image datasets such as SVHN and MNIST, and demonstrate its ability to perform sample generation and outlier detection.
arXiv Detail & Related papers (2023-08-26T06:12:33Z) - MFAI: A Scalable Bayesian Matrix Factorization Approach to Leveraging
Auxiliary Information [8.42894516984735]
We propose to integrate gradient boosted trees in the probabilistic matrix factorization framework to leverage auxiliary information (MFAI)
MFAI naturally inherits several salient features of gradient boosted trees, such as the capability of flexibly modeling nonlinear relationships.
MFAI is computationally efficient and scalable to large datasets by exploiting variational inference.
arXiv Detail & Related papers (2023-03-05T03:26:14Z) - Learning Multiresolution Matrix Factorization and its Wavelet Networks
on Graphs [11.256959274636724]
Multiresolution Matrix Factorization (MMF) is unusual amongst fast matrix factorization algorithms.
We propose a learnable version of MMF that carefully optimizes the factorization with a combination of reinforcement learning and Stiefel manifold optimization.
We show that the resulting wavelet basis far outperforms prior MMF algorithms and provides the first version of this type of factorization that can be robustly deployed on standard learning tasks.
arXiv Detail & Related papers (2021-11-02T23:14:17Z) - Robust Matrix Factorization with Grouping Effect [28.35582493230616]
We propose a novel method called Matrix Factorization with Grouping effect (GRMF)
The proposed GRMF can learn grouping structure and sparsity in MF without prior knowledge.
Experiments have been conducted on real-world data sets corrupted by outliers and noise.
arXiv Detail & Related papers (2021-06-25T15:03:52Z) - Adversarially-Trained Nonnegative Matrix Factorization [77.34726150561087]
We consider an adversarially-trained version of the nonnegative matrix factorization.
In our formulation, an attacker adds an arbitrary matrix of bounded norm to the given data matrix.
We design efficient algorithms inspired by adversarial training to optimize for dictionary and coefficient matrices.
arXiv Detail & Related papers (2021-04-10T13:13:17Z) - Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z) - Entropy Minimizing Matrix Factorization [102.26446204624885]
Nonnegative Matrix Factorization (NMF) is a widely-used data analysis technique, and has yielded impressive results in many real-world tasks.
In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle the above problem.
Considering that outliers are usually far less numerous than normal samples, a new entropy loss function is established for matrix factorization.
arXiv Detail & Related papers (2021-03-24T21:08:43Z) - Positive Semidefinite Matrix Factorization: A Connection with Phase
Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT)
arXiv Detail & Related papers (2020-07-24T06:10:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.