Positive Semidefinite Matrix Factorization: A Connection with Phase
Retrieval and Affine Rank Minimization
- URL: http://arxiv.org/abs/2007.12364v2
- Date: Fri, 2 Apr 2021 21:26:06 GMT
- Title: Positive Semidefinite Matrix Factorization: A Connection with Phase
Retrieval and Affine Rank Minimization
- Authors: Dana Lahat, Yanbin Lang, Vincent Y. F. Tan, Cédric Févotte
- Abstract summary: We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
- Score: 71.57324258813674
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Positive semidefinite matrix factorization (PSDMF) expresses each entry of a
nonnegative matrix as the inner product of two positive semidefinite (psd)
matrices. When all these psd matrices are constrained to be diagonal, this
model is equivalent to nonnegative matrix factorization. Applications include
combinatorial optimization, quantum-based statistical models, and recommender
systems, among others. However, despite the increasing interest in PSDMF, only
a few PSDMF algorithms were proposed in the literature. In this work, we
provide a collection of tools for PSDMF, by showing that PSDMF algorithms can
be designed based on phase retrieval (PR) and affine rank minimization (ARM)
algorithms. This procedure allows a shortcut in designing new PSDMF algorithms,
as it allows to leverage some of the useful numerical properties of existing PR
and ARM methods to the PSDMF framework. Motivated by this idea, we introduce a
new family of PSDMF algorithms based on iterative hard thresholding (IHT). This
family subsumes previously proposed projected-gradient PSDMF methods. We show
that PSDMF optimization problems vary widely, which makes it beneficial to try
several methods based on different principles when tackling difficult
problems. In certain cases, our proposed methods are the only
algorithms able to find a solution. In certain other cases, they converge
faster. Our results support our claim that the PSDMF framework can inherit
desired numerical properties from PR and ARM algorithms, leading to more
efficient PSDMF algorithms, and motivate further study of the links between
these models.
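As an illustration of the model described in the abstract, here is a minimal NumPy sketch (hypothetical toy sizes; `random_psd` is an illustrative helper, not from the paper) of how PSDMF expresses each matrix entry as a trace inner product of psd factors, and how the diagonal case reduces to NMF:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 4, 5, 3  # matrix sizes and inner psd dimension (illustrative)

def random_psd(k, rng):
    """Generate a random k x k positive semidefinite matrix."""
    G = rng.standard_normal((k, k))
    return G @ G.T  # a Gram matrix is always psd

A = [random_psd(k, rng) for _ in range(m)]  # one psd factor per row
B = [random_psd(k, rng) for _ in range(n)]  # one psd factor per column

# Each entry is the trace inner product of two psd matrices,
# <A_i, B_j> = trace(A_i @ B_j), which is always nonnegative.
X = np.array([[np.trace(A[i] @ B[j]) for j in range(n)] for i in range(m)])
assert (X >= 0).all()

# Diagonal case: if all A_i and B_j are diagonal, the inner product becomes
# sum_l W[i, l] * H[l, j], i.e. PSDMF reduces to an NMF X = W @ H.
W = rng.uniform(size=(m, k))
H = rng.uniform(size=(k, n))
Ad = [np.diag(W[i]) for i in range(m)]
Bd = [np.diag(H[:, j]) for j in range(n)]
Xd = np.array([[np.trace(Ad[i] @ Bd[j]) for j in range(n)] for i in range(m)])
assert np.allclose(Xd, W @ H)
```

The two assertions check the defining properties stated in the abstract: nonnegativity of every entry, and equivalence with NMF under the diagonal constraint.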
Related papers
- Q-learning for Quantile MDPs: A Decomposition, Performance, and Convergence Analysis [30.713243690224207]
In Markov decision processes (MDPs), quantile risk measures such as Value-at-Risk are a standard metric for modeling RL agents' preferences for certain outcomes.
This paper proposes a new Q-learning algorithm for quantile optimization in MDPs with strong convergence and performance guarantees.
arXiv Detail & Related papers (2024-10-31T16:53:20Z)
- An optimal pairwise merge algorithm improves the quality and consistency of nonnegative matrix factorization [0.0]
Non-negative matrix factorization (NMF) is a key technique for feature extraction and widely used in source separation.
Here we show that some of these weaknesses may be mitigated by performing NMF in a higher-dimensional feature space.
Experimental results demonstrate our method helps non-ideal NMF solutions escape to better local optima.
arXiv Detail & Related papers (2024-08-16T20:43:42Z)
- Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
arXiv Detail & Related papers (2023-10-10T17:06:41Z)
- Provably Efficient UCB-type Algorithms For Learning Predictive State Representations [55.00359893021461]
The sequential decision-making problem is statistically learnable if it admits a low-rank structure modeled by predictive state representations (PSRs).
This paper proposes the first known UCB-type approach for PSRs, featuring a novel bonus term that upper bounds the total variation distance between the estimated and true models.
In contrast to existing approaches for PSRs, our UCB-type algorithms enjoy computational tractability, last-iterate guaranteed near-optimal policy, and guaranteed model accuracy.
arXiv Detail & Related papers (2023-07-01T18:35:21Z)
- Contaminated Images Recovery by Implementing Non-negative Matrix Factorisation [0.0]
We theoretically examine the robustness of the traditional NMF, HCNMF, and L2,1-NMF algorithms and run experiments demonstrating their robustness on the ORL and Extended YaleB datasets.
Due to the computational cost of these approaches, some of the final models, such as HCNMF and L2,1-NMF, fail to converge within the scope of this work.
arXiv Detail & Related papers (2022-11-08T13:50:27Z)
- Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z)
- Feature Weighted Non-negative Matrix Factorization [92.45013716097753]
We propose the Feature weighted Non-negative Matrix Factorization (FNMF) in this paper.
FNMF learns the weights of features adaptively according to their importance.
It can be solved efficiently with the suggested optimization algorithm.
arXiv Detail & Related papers (2021-03-24T21:17:17Z)
- Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been shown to be a powerful method for data clustering.
Inspired by ensemble clustering, which seeks better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z)
- Nonnegative Matrix Factorization with Toeplitz Penalty [0.0]
Nonnegative Matrix Factorization (NMF) is an unsupervised learning algorithm that produces a linear, parts-based approximation of a data matrix.
We propose a new NMF algorithm that makes use of non-data dependent auxiliary constraints.
arXiv Detail & Related papers (2020-12-07T13:49:23Z)
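Several of the entries above build on classical NMF. As a point of reference, here is a minimal sketch of the standard multiplicative updates for the Frobenius-norm NMF objective (toy random data and illustrative rank and iteration count, not taken from any of the papers listed):

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.uniform(size=(6, 8))   # nonnegative data matrix (toy example)
r = 3                          # target rank (illustrative choice)

# Strictly positive initialization keeps the multiplicative updates well-defined.
W = rng.uniform(size=(6, r)) + 1e-3
H = rng.uniform(size=(r, 8)) + 1e-3
eps = 1e-12  # guard against division by zero

for _ in range(200):
    # Multiplicative updates for the objective ||V - W @ H||_F^2;
    # both factors stay elementwise nonnegative by construction.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The relative error `err` decreases monotonically under these updates, which is the baseline behavior that the NMF variants surveyed above (weighted, penalized, self-supervised) aim to improve on.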
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences of their use.