Exponentially Convergent Algorithms for Supervised Matrix Factorization
- URL: http://arxiv.org/abs/2311.11182v1
- Date: Sat, 18 Nov 2023 23:24:02 GMT
- Title: Exponentially Convergent Algorithms for Supervised Matrix Factorization
- Authors: Joowon Lee, Hanbaek Lyu, Weixin Yao
- Abstract summary: Supervised matrix factorization (SMF) is a machine learning method that combines feature extraction and classification tasks.
Our paper provides a novel framework that 'lifts' SMF as a low-rank matrix estimation problem in a combined factor space.
- Score: 2.1485350418225244
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Supervised matrix factorization (SMF) is a classical machine learning method
that simultaneously seeks feature extraction and classification tasks, which
are not necessarily a priori aligned objectives. Our goal is to use SMF to
learn low-rank latent factors that offer interpretable, data-reconstructive,
and class-discriminative features, addressing challenges posed by
high-dimensional data. Training an SMF model involves solving a nonconvex and
possibly constrained optimization with at least three blocks of parameters.
Known algorithms are either heuristic or provide weak convergence guarantees
for special cases. In this paper, we provide a novel framework that 'lifts' SMF
as a low-rank matrix estimation problem in a combined factor space and propose
an efficient algorithm that provably converges exponentially fast to a global
minimizer of the objective with arbitrary initialization under mild
assumptions. Our framework applies to a wide range of SMF-type problems for
multi-class classification with auxiliary features. To showcase an application,
we demonstrate that our algorithm successfully identified well-known
cancer-associated gene groups for various cancers.
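The abstract describes minimizing a three-block objective that couples a reconstruction loss with a classification loss over shared low-rank factors. A minimal sketch of such an SMF objective, minimized here by plain alternating gradient steps, is below; this is an illustration of the problem setup, not the paper's lifted algorithm, and the dimensions, step size, and logistic coupling term are assumptions.

```python
# Illustrative SMF objective: reconstruction + logistic classification loss,
# minimized by alternating gradient steps over three blocks (W, H, beta).
import numpy as np

rng = np.random.default_rng(0)
d, n, r = 20, 100, 3                 # features, samples, latent rank
X = rng.standard_normal((d, n))      # data matrix (columns = samples)
y = rng.integers(0, 2, n)            # binary labels in {0, 1}

W = rng.standard_normal((d, r)) * 0.1   # dictionary (feature factors)
H = rng.standard_normal((r, n)) * 0.1   # codes (latent representations)
beta = np.zeros(r)                       # linear classifier on the codes
lam, lr = 1.0, 1e-3                      # coupling weight, step size

def loss(W, H, beta):
    recon = 0.5 * np.sum((X - W @ H) ** 2)              # reconstruction term
    z = beta @ H                                         # classifier scores
    clf = np.sum(np.logaddexp(0.0, -(2 * y - 1) * z))   # logistic loss
    return recon + lam * clf

l0 = loss(W, H, beta)
for _ in range(500):
    R = W @ H - X                                        # residual
    z = beta @ H
    s = -(2 * y - 1) / (1.0 + np.exp((2 * y - 1) * z))  # d(logistic)/dz
    W -= lr * (R @ H.T)                                  # block update: W
    H -= lr * (W.T @ R + lam * np.outer(beta, s))        # block update: H
    beta -= lr * lam * (H @ s)                           # block update: beta
l1 = loss(W, H, beta)
```

The three blocks make the problem nonconvex even though each block subproblem is convex, which is exactly the difficulty the paper's "lifting" to a combined factor space is designed to remove.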
Related papers
- Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement
Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
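The spectral approach above can be illustrated with the simplest instance of the idea: truncating the SVD of a noisy observation at the true rank and checking the entry-wise error. The synthetic setup below (rank, noise level) is an assumption for illustration; the paper's RL-specific matrices and guarantees are not reproduced.

```python
# Minimal spectral low-rank estimation: rank-r SVD truncation of a noisy matrix.
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 50, 40, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # ground truth
noisy = M + 0.01 * rng.standard_normal((m, n))                  # observations

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
M_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]   # rank-r spectral estimate

entrywise_err = np.max(np.abs(M_hat - M))       # worst-case entry error
```

Keeping only the top-r singular directions discards most of the noise energy, which is why the truncated estimate is closer to the ground truth than the raw observations.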
arXiv Detail & Related papers (2023-10-10T17:06:41Z) - Multivariate Systemic Risk Measures and Computation by Deep Learning
Algorithms [63.03966552670014]
We discuss the key related theoretical aspects, with a particular focus on the fairness properties of primal optima and associated risk allocations.
The algorithms we provide allow for learning primals, optima for the dual representation and corresponding fair risk allocations.
arXiv Detail & Related papers (2023-02-02T22:16:49Z) - Quadratic Matrix Factorization with Applications to Manifold Learning [1.6795461001108094]
We propose a quadratic matrix factorization (QMF) framework to learn the curved manifold on which the dataset lies.
Algorithmically, we propose an alternating minimization algorithm to optimize QMF and establish its theoretical convergence properties.
Experiments on a synthetic manifold learning dataset and two real datasets, including the MNIST handwritten dataset and a cryogenic electron microscopy dataset, demonstrate the superiority of the proposed method over its competitors.
arXiv Detail & Related papers (2023-01-30T15:09:00Z) - Supervised Class-pairwise NMF for Data Representation and Classification [2.7320863258816512]
Non-negative matrix factorization (NMF)-based methods add new terms to the cost function to adapt the model to specific tasks.
The standard NMF method adopts an unsupervised approach to estimate the factorizing matrices.
arXiv Detail & Related papers (2022-09-28T04:33:03Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
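A concrete instance of a Robbins-Monro scheme on a manifold is an Oja-style stochastic iteration on the unit sphere for the leading eigenvector of a covariance matrix. The sketch below, with a tangent-space projection, a normalization retraction, and 1/t step sizes, is an assumed toy example illustrating the setting only; the paper's general template covers a much broader class of schemes.

```python
# Oja-style Robbins-Monro iteration on the unit sphere: stochastic ascent
# on x^T A x with tangent projection and normalization retraction.
import numpy as np

rng = np.random.default_rng(3)
d = 5
A = np.diag([5.0, 1.0, 1.0, 0.5, 0.2])   # covariance with a clear top eigenvector
x = rng.standard_normal(d)
x /= np.linalg.norm(x)                    # start on the sphere

for t in range(1, 5001):
    z = rng.multivariate_normal(np.zeros(d), A)   # stochastic sample
    g = np.outer(z, z) @ x                        # noisy gradient of x^T A x
    rg = g - (x @ g) * x                          # project onto tangent space
    x = x + (1.0 / t) * rg                        # Robbins-Monro ascent step
    x /= np.linalg.norm(x)                        # retraction back to the sphere
```

The projection-plus-retraction pair is the manifold analogue of a Euclidean stochastic gradient step, which is where the lack of a global linear structure mentioned above enters the analysis.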
arXiv Detail & Related papers (2022-06-14T12:30:11Z) - Algorithmic Foundations of Empirical X-risk Minimization [51.58884973792057]
This manuscript introduces a new optimization framework for machine learning and AI, named empirical X-risk minimization (EXM).
X-risk is a term introduced to represent a family of compositional measures or objectives.
arXiv Detail & Related papers (2022-06-01T12:22:56Z) - Learning Multiresolution Matrix Factorization and its Wavelet Networks
on Graphs [11.256959274636724]
Multiresolution Matrix Factorization (MMF) is unusual amongst fast matrix factorization algorithms.
We propose a learnable version of MMF that carefully optimizes the factorization with a combination of reinforcement learning and Stiefel manifold optimization.
We show that the resulting wavelet basis far outperforms prior MMF algorithms and provides the first version of this type of factorization that can be robustly deployed on standard learning tasks.
arXiv Detail & Related papers (2021-11-02T23:14:17Z) - Algorithms for Nonnegative Matrix Factorization with the
Kullback-Leibler Divergence [20.671178429005973]
The Kullback-Leibler (KL) divergence is one of the most widely used objective functions for nonnegative matrix factorization (NMF).
We propose three new algorithms that guarantee the non-increasingness of the objective function.
We conduct extensive numerical experiments to provide a comprehensive picture of the performances of the KL NMF algorithms.
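For context, the classical baseline for KL-NMF is the multiplicative update rule of Lee and Seung, which is also non-increasing in the KL objective. The sketch below shows those standard updates on synthetic data; the three new algorithms proposed in the paper are not reproduced here.

```python
# Classical Lee-Seung multiplicative updates for NMF under the KL divergence.
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 30, 25, 4
V = rng.random((m, n)) + 1e-3        # nonnegative data matrix
W = rng.random((m, r))               # nonnegative factors
H = rng.random((r, n))
eps = 1e-12                          # guard against division by zero

def kl_div(V, WH):
    return np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)

d0 = kl_div(V, W @ H)
for _ in range(200):
    WH = W @ H
    H *= (W.T @ (V / (WH + eps))) / (W.sum(axis=0)[:, None] + eps)
    WH = W @ H
    W *= ((V / (WH + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
d1 = kl_div(V, W @ H)
```

Because the updates are ratios of nonnegative quantities, W and H stay nonnegative throughout, and the objective is non-increasing at every step, the same property the paper guarantees for its new algorithms.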
arXiv Detail & Related papers (2020-10-05T11:51:39Z) - Positive Semidefinite Matrix Factorization: A Connection with Phase
Retrieval and Affine Rank Minimization [71.57324258813674]
We show that PSDMF algorithms can be designed based on phase retrieval (PR) and affine rank minimization (ARM) algorithms.
Motivated by this idea, we introduce a new family of PSDMF algorithms based on iterative hard thresholding (IHT).
arXiv Detail & Related papers (2020-07-24T06:10:19Z) - Simplex-Structured Matrix Factorization: Sparsity-based Identifiability
and Provably Correct Algorithms [21.737226432466496]
We provide novel algorithms with identifiability guarantees for simplex-structured matrix factorization.
We illustrate the effectiveness of our approach on synthetic data sets and hyperspectral images.
arXiv Detail & Related papers (2020-07-22T14:01:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.