$\ell_0$ factor analysis
- URL: http://arxiv.org/abs/2411.08468v1
- Date: Wed, 13 Nov 2024 09:40:23 GMT
- Title: $\ell_0$ factor analysis
- Authors: Linyang Wang, Wanquan Liu, Bin Zhu
- Abstract summary: We formulate an optimization problem using the nuclear norm for the low-rank component.
An alternating minimization algorithm is designed for the solution of the optimization problem.
The effectiveness of the algorithm is verified via simulations on synthetic and real datasets.
- Score: 25.14792952830999
- License:
- Abstract: Factor Analysis is about finding a low-rank plus sparse additive decomposition from a noisy estimate of the signal covariance matrix. In order to get such a decomposition, we formulate an optimization problem using the nuclear norm for the low-rank component, the $\ell_0$ norm for the sparse component, and the Kullback-Leibler divergence to control the residual in the sample covariance matrix. An alternating minimization algorithm is designed for the solution of the optimization problem. The effectiveness of the algorithm is verified via simulations on synthetic and real datasets.
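Read literally from the abstract, the listed ingredients combine into an objective of roughly the following form; the split into penalties versus constraints and the exact weighting are assumptions here, not the paper's stated formulation:

$$
\min_{L \succeq 0,\; S}\;\; \|L\|_{*} \;+\; \gamma\,\|S\|_{0} \;+\; \tau\, D_{\mathrm{KL}}\!\left(\widehat{\Sigma}\,\Big\|\,L+S\right),
$$

where $\widehat{\Sigma}$ is the noisy estimate of the signal covariance matrix, $\|L\|_{*}$ is the nuclear norm promoting a low-rank component, $\|S\|_{0}$ counts the nonzero entries of the sparse component, $D_{\mathrm{KL}}$ is the Kullback-Leibler divergence controlling the residual, and $\gamma,\tau>0$ are trade-off parameters. The alternating minimization algorithm then presumably updates the low-rank and sparse components in turn, each with the other held fixed.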
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we study a robust phase retrieval problem in which the task is to recover an unknown signal from magnitude measurements, some of which may be corrupted by outliers.
Our proposed oracle avoids the need for a computationally expensive spectral initialization, relying instead on simple gradient steps while handling outliers.
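To make "simple gradient steps without a spectral initialization" concrete, here is a generic sketch of amplitude-based gradient descent for phase retrieval with a crude residual-truncation guard against outliers; the initialization, step size, and truncation rule are illustrative assumptions, and this is not the paper's oracle.

```python
import numpy as np

def robust_phase_retrieval_gd(A, y, iters=500, step=0.8, seed=0):
    """Generic sketch: gradient descent on the amplitude loss
    f(x) = (1/(2m)) * sum_i (|a_i^T x| - y_i)^2, discarding the largest
    residuals at each step as a simple guard against outlying measurements."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    # crude norm estimate used in place of a spectral initialization (assumption)
    x *= (np.linalg.norm(y) / np.sqrt(m)) / np.linalg.norm(x)
    for _ in range(iters):
        z = A @ x
        r = np.abs(z) - y                                # amplitude residuals
        keep = np.abs(r) <= 3.0 * np.median(np.abs(r))   # drop likely outliers
        grad = A[keep].T @ (r[keep] * np.sign(z[keep])) / max(keep.sum(), 1)
        x = x - step * grad
    return x

# usage on synthetic data: y_i = |a_i^T x_true| with a few corrupted entries
rng = np.random.default_rng(1)
A = rng.standard_normal((400, 50))
x_true = rng.standard_normal(50)
y = np.abs(A @ x_true)
y[:10] += 5.0 * np.abs(rng.standard_normal(10))          # inject outliers
x_hat = robust_phase_retrieval_gd(A, y)                  # estimate of x_true (up to global sign)
```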
arXiv Detail & Related papers (2024-09-07T06:37:23Z)
- Stochastic Optimization for Non-convex Problem with Inexact Hessian Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) and adaptive regularization using cubics (ARC) methods have proven to have some very appealing theoretical properties.
We show that TR and ARC methods can simultaneously accommodate inexact computations of the Hessian, gradient, and function values.
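For context, the subproblems these two methods solve at an iterate $x_k$, written in their standard textbook form with possibly inexact gradient $g_k \approx \nabla f(x_k)$ and Hessian $B_k \approx \nabla^2 f(x_k)$ (not the specific conditions of this paper), are

$$
\text{TR:}\quad \min_{\|p\|\le \Delta_k}\ g_k^{\top}p+\tfrac{1}{2}\,p^{\top}B_k p,
\qquad
\text{ARC:}\quad \min_{p}\ g_k^{\top}p+\tfrac{1}{2}\,p^{\top}B_k p+\tfrac{\sigma_k}{3}\,\|p\|^{3},
$$

where $\Delta_k$ is the trust-region radius and $\sigma_k$ the cubic regularization weight; the paper's contribution concerns how inexact these quantities, and the function values used to accept or reject steps, are allowed to be.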
arXiv Detail & Related papers (2023-10-18T10:29:58Z)
- An inexact LPA for DC composite optimization and application to matrix completions with outliers [5.746154410100363]
This paper concerns a class of DC (difference-of-convex) composite optimization problems.
By leveraging the composite structure, we provide a condition under which the potential function has the Kurdyka-Łojasiewicz (KL) property with exponent $1/2$ at the iterate sequence.
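As a reminder of the terminology (standard definition, not this paper's specific condition), a proper lower semicontinuous function $\phi$ has the KL property with exponent $1/2$ at $\bar{x}$ if there exist $c,\eta,\varepsilon>0$ such that

$$
\operatorname{dist}\!\big(0,\partial\phi(x)\big)\ \ge\ c\,\big(\phi(x)-\phi(\bar{x})\big)^{1/2}
\quad\text{whenever } \|x-\bar{x}\|\le\varepsilon \text{ and } \phi(\bar{x})<\phi(x)<\phi(\bar{x})+\eta,
$$

a property that typically yields linear convergence of proximal-type iterates.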
arXiv Detail & Related papers (2023-03-29T16:15:34Z)
- Alternating Mahalanobis Distance Minimization for Stable and Accurate CP Decomposition [4.847980206213335]
We introduce a new formulation for deriving singular values and vectors of a tensor by considering the critical points of a function different from the one used in previous work.
We show that a subsweep of this algorithm can achieve a superlinear convergence rate for exact CPD with known rank.
We then view the algorithm as optimizing a Mahalanobis distance with respect to each factor, with a ground metric that depends on the other factors.
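For the terms used above (standard definitions only; how the ground metric is built from the other factors is the paper's contribution), a rank-$R$ CP decomposition of a third-order tensor and a Mahalanobis distance with ground metric $M$ are

$$
\mathcal{T}\ \approx\ \sum_{r=1}^{R} a_r \circ b_r \circ c_r,
\qquad
d_M(x,y)^2\ =\ (x-y)^{\top} M\,(x-y),\quad M \succ 0,
$$

so "optimizing a Mahalanobis distance with respect to each factor" means that each alternating update minimizes such a weighted distance, with $M$ determined by the factors currently held fixed.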
arXiv Detail & Related papers (2022-04-14T19:56:36Z)
- Matrix Reordering for Noisy Disordered Matrices: Optimality and Computationally Efficient Algorithms [9.245687221460654]
Motivated by applications in single-cell biology and metagenomics, we investigate the problem of matrix reordering based on a noisy monotone Toeplitz matrix model.
We establish the fundamental statistical limit for this problem in a decision-theoretic framework and demonstrate that a constrained least squares estimator achieves the optimal rate.
To address its computational cost, we propose a polynomial-time adaptive sorting algorithm with guaranteed performance improvement.
arXiv Detail & Related papers (2022-01-17T14:53:52Z)
- Stochastic regularized majorization-minimization with weakly convex and multi-convex surrogates [0.0]
We show that the first-order optimality gap of the proposed algorithm decays at an explicit rate with respect to the expected loss for various methods under a possibly dependent (non-i.i.d.) data setting.
We obtain first-order convergence guarantees for various methods under this dependent data setting.
arXiv Detail & Related papers (2022-01-05T15:17:35Z)
- Solving weakly supervised regression problem using low-rank manifold regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z)
- Analysis of Truncated Orthogonal Iteration for Sparse Eigenvector Problems [78.95866278697777]
We propose two variants of the Truncated Orthogonal Iteration to compute multiple leading eigenvectors with sparsity constraints simultaneously.
We then apply our algorithms to solve the sparse principal component analysis problem for a wide range of test datasets.
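To make "orthogonal iteration with sparsity constraints" concrete, here is a minimal sketch for a symmetric input matrix; the per-column truncation rule and sparsity level are illustrative assumptions, not the paper's two variants.

```python
import numpy as np

def truncated_orthogonal_iteration(A, k, s, iters=200, seed=0):
    """Sketch: orthogonal (block power) iteration for the top-k eigenvectors
    of a symmetric matrix A, with a hard-thresholding step that keeps only
    the s largest-magnitude entries in each column to encourage sparsity."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))   # random orthonormal start
    for _ in range(iters):
        Q, _ = np.linalg.qr(A @ Q)                     # power step + re-orthonormalize
        for j in range(k):                             # truncate each column
            drop = np.argsort(np.abs(Q[:, j]))[:-s]    # all but the s largest entries
            Q[drop, j] = 0.0
            Q[:, j] /= np.linalg.norm(Q[:, j])         # keep unit length
    return Q  # columns are sparse and unit-norm (orthogonality only approximate)

# usage: two sparse leading eigenvectors of a sample covariance matrix
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30))
V = truncated_orthogonal_iteration(X.T @ X / 200, k=2, s=5)
```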
arXiv Detail & Related papers (2021-03-24T23:11:32Z)
- Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method by combining reconstruction error with $l_{2,p}$-norm regularization.
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
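For reference, with $W\in\mathbb{R}^{d\times k}$ and rows $w^{i}$, the $l_{2,p}$ norm used above is defined as follows; the surrounding objective is shown only schematically, since the exact formulation belongs to the paper:

$$
\|W\|_{2,p}=\Big(\sum_{i=1}^{d}\|w^{i}\|_{2}^{\,p}\Big)^{1/p},
\qquad
\min_{W}\ \underbrace{\big\|X-\widehat{X}(W)\big\|_{F}^{2}}_{\text{reconstruction error}}\ +\ \gamma\,\|W\|_{2,p}^{\,p},
$$

where $0<p\le 1$ encourages whole rows of $W$ to vanish, so the corresponding features are dropped.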
arXiv Detail & Related papers (2020-12-29T04:08:38Z)
- Robust Low-rank Matrix Completion via an Alternating Manifold Proximal Gradient Continuation Method [47.80060761046752]
Robust low-rank matrix completion (RMC) has been studied extensively for computer vision, signal processing and machine learning applications.
This problem aims to decompose a partially observed matrix into the superposition of a low-rank matrix and a sparse matrix, where the sparse matrix captures the grossly corrupted entries of the matrix.
A widely used approach to tackle RMC is to consider a convex formulation, which minimizes the nuclear norm of the low-rank matrix (to promote low-rankness) and the $\ell_1$ norm of the sparse matrix (to promote sparsity).
In this paper, motivated by some recent works on low-
arXiv Detail & Related papers (2020-08-18T04:46:22Z)
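The convex formulation described in the last entry, written in its common textbook form (the paper itself moves beyond it, as the truncated sentence suggests), is

$$
\min_{L,\,S}\ \|L\|_{*}+\lambda\,\|S\|_{1}
\quad\text{s.t.}\quad
P_{\Omega}(L+S)=P_{\Omega}(M),
$$

where $M$ is the partially observed matrix, $\Omega$ the set of observed entries, $P_{\Omega}$ the projection onto those entries, and $\lambda>0$ balances low-rankness against sparsity.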