Truncated Matrix Completion - An Empirical Study
- URL: http://arxiv.org/abs/2504.09873v1
- Date: Mon, 14 Apr 2025 04:42:00 GMT
- Title: Truncated Matrix Completion - An Empirical Study
- Authors: Rishhabh Naik, Nisarg Trivedi, Davoud Ataee Tarzanagh, Laura Balzano
- Abstract summary: Low-rank Matrix Completion describes the problem where we wish to recover missing entries of a partially observed low-rank matrix. We consider various settings where the sampling mask is dependent on the underlying data values, motivated by applications in sensing, sequential decision-making, and recommender systems.
- Score: 7.912206996605676
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Low-rank Matrix Completion (LRMC) describes the problem where we wish to recover missing entries of a partially observed low-rank matrix. Most existing matrix completion work deals with sampling procedures that are independent of the underlying data values. While this assumption allows the derivation of nice theoretical guarantees, it seldom holds in real-world applications. In this paper, we consider various settings where the sampling mask is dependent on the underlying data values, motivated by applications in sensing, sequential decision-making, and recommender systems. Through a series of experiments, we study and compare the performance of various LRMC algorithms that were originally successful for data-independent sampling patterns.
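As a concrete illustration of the data-dependent sampling setting, the following NumPy sketch observes only entries above a threshold (a truncated mask) and runs a generic SoftImpute-style iterative SVD-thresholding baseline. The solver choice, rank, threshold, and regularization values are illustrative assumptions, not the paper's specific experimental setup.

```python
# Minimal sketch: truncated (data-dependent) sampling + a generic LRMC baseline.
import numpy as np

rng = np.random.default_rng(0)
n, r = 100, 5
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # ground-truth low-rank matrix

# Data-dependent mask: an entry is observed only if its value exceeds the median.
mask = M > np.quantile(M, 0.5)

def soft_impute(X_obs, mask, lam=1.0, iters=200):
    """SoftImpute-style completion: iteratively impute and soft-threshold the SVD."""
    Z = np.zeros_like(X_obs)
    for _ in range(iters):
        filled = np.where(mask, X_obs, Z)        # fill missing entries with current estimate
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt  # soft-threshold the singular values
    return Z

M_hat = soft_impute(M, mask)
rel_err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
print(f"relative recovery error under truncated sampling: {rel_err:.3f}")
```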
Related papers
- Optimal Estimation of Shared Singular Subspaces across Multiple Noisy Matrices [3.3373545585860596]
This study focuses on estimating shared (left) singular subspaces across multiple matrices within a low-rank matrix denoising framework.
We establish that Stack-SVD achieves minimax rate-optimality when the true singular subspaces of the signal matrices are identical.
For various cases of partial sharing, we rigorously characterize the conditions under which Stack-SVD remains effective, achieves minimax optimality, or fails to deliver consistent estimates.
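A minimal sketch of the Stack-SVD idea summarized above: concatenate the noisy matrices that share a common left singular subspace, then take the top left singular vectors of the stacked matrix. Dimensions, rank, and noise level are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, r, k = 50, 40, 3, 4                              # rows, cols per matrix, shared rank, number of matrices
U_true, _ = np.linalg.qr(rng.standard_normal((p, r)))  # shared left singular subspace

mats = [U_true @ rng.standard_normal((r, n)) + 0.1 * rng.standard_normal((p, n))
        for _ in range(k)]

stacked = np.hstack(mats)                              # stack along columns: p x (k*n)
U_hat = np.linalg.svd(stacked, full_matrices=False)[0][:, :r]

# Subspace estimation error via the difference of orthogonal projectors.
dist = np.linalg.norm(U_hat @ U_hat.T - U_true @ U_true.T, 2)
print(f"spectral subspace distance: {dist:.3f}")
```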
arXiv Detail & Related papers (2024-11-26T02:49:30Z)
- Multiple Testing of Linear Forms for Noisy Matrix Completion [13.269597888405759]
We develop a general approach to overcome these difficulties by introducing new statistics for individual tests.
We show that valid FDR control can be achieved with guaranteed power under nearly optimal sample size requirements.
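For background on the FDR-control goal mentioned above, the sketch below implements the standard Benjamini-Hochberg step-up procedure over a set of p-values. This is generic background only, not the paper's new test statistics, and the p-value mixture is an illustrative assumption.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean rejection mask controlling FDR at level alpha."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * (np.arange(1, m + 1) / m)
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        cutoff = np.max(np.nonzero(below)[0])  # largest k with p_(k) <= alpha*k/m
        reject[order[: cutoff + 1]] = True
    return reject

rng = np.random.default_rng(3)
pvals = np.concatenate([rng.uniform(0, 0.001, 10),  # signals
                        rng.uniform(0, 1, 90)])     # nulls
print(benjamini_hochberg(pvals).sum(), "rejections")
```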
arXiv Detail & Related papers (2023-12-01T02:53:20Z)
- Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
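A minimal sketch of a simple spectral estimator in the spirit of the summary above: rescale the observed entries by the inverse sampling rate, then truncate the SVD at the target rank. Matrix sizes, rank, and the observation rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, p_obs = 80, 3, 0.3
# Low-rank matrix, e.g. expected arm rewards in a low-rank bandit.
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n)) / np.sqrt(r)

mask = rng.random((n, n)) < p_obs
Y = np.where(mask, M, 0.0) / p_obs       # entry-wise unbiased estimate of M

U, s, Vt = np.linalg.svd(Y, full_matrices=False)
M_hat = (U[:, :r] * s[:r]) @ Vt[:r]      # rank-r spectral estimate

print("max entry-wise error:", np.abs(M_hat - M).max())
```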
arXiv Detail & Related papers (2023-10-10T17:06:41Z)
- The Ordered Matrix Dirichlet for Modeling Ordinal Dynamics [54.96229007229786]
We propose the Ordered Matrix Dirichlet (OMD) to map latent states to observed action types.
Models built on the OMD recover interpretable latent states and show superior forecasting performance in few-shot settings.
arXiv Detail & Related papers (2022-12-08T08:04:26Z)
- Causal Matrix Completion [15.599296461516984]
Matrix completion is the study of recovering an underlying matrix from a sparse subset of noisy observations.
Traditionally, it is assumed that the entries of the matrix are "missing completely at random".
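Hedged background sketch: when entries are missing not at random, one common correction in the matrix-completion literature is to weight each observed entry by its inverse observation propensity. This is a generic illustration, not the estimator proposed in the paper, and the propensities are assumed known here.

```python
import numpy as np

def ipw_loss(M_hat, X_obs, mask, propensity):
    """Inverse-propensity-weighted squared loss over observed entries.

    mask: boolean observation indicator; propensity: P(entry observed) > 0.
    """
    w = mask / propensity                      # zero where unobserved
    return np.sum(w * (M_hat - X_obs) ** 2) / mask.size
```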
arXiv Detail & Related papers (2021-09-30T14:17:56Z)
- BELT: Blockwise Missing Embedding Learning Transformer [9.341699514447113]
We propose the Blockwise Missing Embedding Learning Transformer (BELT) model to handle row-wise/column-wise missingness.
Specifically, our proposed method aims at efficient matrix recovery when every pair of matrices from multiple sources has an overlap.
arXiv Detail & Related papers (2021-05-21T13:55:30Z)
- Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method, by combining reconstruction error with $l_{2,p}$-norm regularization.
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
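The sketch below evaluates one common form of this kind of objective: a reconstruction error term $\|X - XWW^T\|_F^2$ plus an $l_{2,p}$-norm penalty on the projection matrix $W$, whose row norms indicate feature importance. The exact model form, $p$, and $\lambda$ are assumptions for illustration; the optimizer is omitted.

```python
import numpy as np

def l2p_norm(W, p=0.5):
    """(l_{2,p} norm)^p: sum over rows of the l2 row norm raised to the power p."""
    return np.sum(np.linalg.norm(W, axis=1) ** p)

def objective(X, W, lam=0.1, p=0.5):
    """Reconstruction error ||X - X W W^T||_F^2 + lam * ||W||_{2,p}^p.

    X: data matrix (samples x features); W: projection matrix (features x k),
    ideally with orthonormal columns. Rows of W near zero mark dropped features.
    """
    return np.linalg.norm(X - X @ W @ W.T) ** 2 + lam * l2p_norm(W, p)
```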
arXiv Detail & Related papers (2020-12-29T04:08:38Z)
- Robust Low-rank Matrix Completion via an Alternating Manifold Proximal Gradient Continuation Method [47.80060761046752]
Robust low-rank matrix completion (RMC) has been studied extensively for computer vision, signal processing and machine learning applications.
This problem aims to decompose a partially observed matrix into the superposition of a low-rank matrix and a sparse matrix, where the sparse matrix captures the grossly corrupted entries of the matrix.
A widely used approach to tackle RMC is to consider a convex formulation, which minimizes the nuclear norm of the low-rank matrix (to promote low-rankness) and the $l_1$ norm of the sparse matrix (to promote sparsity).
In this paper, motivated by some recent works on low-
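A hedged sketch of the convex formulation described above (nuclear norm for the low-rank part L, $l_1$ norm for the sparse part S), solved with a naive alternating proximal scheme rather than the paper's manifold proximal gradient continuation method. Step sizes, penalties, and iteration count are illustrative assumptions.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Entry-wise soft thresholding: prox of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def robust_complete(X_obs, mask, lam_L=1.0, lam_S=0.1, iters=100):
    """Decompose observed entries into low-rank L plus sparse S."""
    L = np.zeros_like(X_obs)
    S = np.zeros_like(X_obs)
    for _ in range(iters):
        R = mask * (X_obs - L - S)                    # residual on observed entries
        L = svt(L + R, lam_L)                         # low-rank update
        S = soft(S + mask * (X_obs - L - S), lam_S)   # sparse update
    return L, S
```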
arXiv Detail & Related papers (2020-08-18T04:46:22Z)
- A Scalable, Adaptive and Sound Nonconvex Regularizer for Low-rank Matrix Completion [60.52730146391456]
We propose a new nonconvex low-rank regularizer, the "nuclear Frobenius norm" regularizer, which is adaptive and sound.
It bypasses the computation of singular values and enables fast optimization.
It achieves state-of-the-art recovery performance while being the fastest among existing matrix learning methods.
arXiv Detail & Related papers (2020-08-14T18:47:58Z)
- Robust Matrix Completion with Mixed Data Types [0.0]
We consider the problem of recovering a structured low-rank matrix from partially observed entries of mixed data types.
Most approaches assume that there is only one underlying distribution and that the low-rank constraint is regularized by the matrix Schatten norm.
We propose a computationally feasible statistical approach with strong recovery guarantees, along with an algorithmic framework suited for parallelization, that recovers a low-rank matrix from partially observed entries of mixed data types in one step.
arXiv Detail & Related papers (2020-05-25T21:35:10Z)
- Modal Regression based Structured Low-rank Matrix Recovery for Multi-view Learning [70.57193072829288]
Low-rank Multi-view Subspace Learning (LMvSL) has shown great potential in cross-view classification in recent years.
Existing LMvSL-based methods are incapable of handling view discrepancy and discriminancy well simultaneously.
We propose Structured Low-rank Matrix Recovery (SLMR), a method that effectively removes view discrepancy while improving discriminancy.
arXiv Detail & Related papers (2020-03-22T03:57:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.