Robust Matrix Completion with Mixed Data Types
- URL: http://arxiv.org/abs/2005.12415v1
- Date: Mon, 25 May 2020 21:35:10 GMT
- Title: Robust Matrix Completion with Mixed Data Types
- Authors: Daqian Sun, Martin T. Wells
- Abstract summary: We consider the problem of recovering a structured low-rank matrix from partially observed entries of mixed data types.
Most approaches assume that there is only one underlying distribution and that the low-rank constraint is regularized by the matrix Schatten norm.
We propose a computationally feasible statistical approach with strong recovery guarantees, along with an algorithmic framework suited to parallelization, that recovers a low-rank matrix with partially observed entries of mixed data types in one step.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the matrix completion problem of recovering a structured
low-rank matrix from partially observed entries of mixed data types. The vast
majority of existing solutions propose computationally feasible estimators with
strong statistical guarantees for the case where the underlying distribution of
the data in the matrix is continuous. A few recent approaches have used similar
ideas to extend these estimators to the case where the underlying distribution
belongs to the exponential family. Most of these approaches assume that there
is only one underlying distribution and that the low-rank constraint is
regularized by the matrix Schatten norm. We propose a computationally feasible
statistical approach with strong recovery guarantees, along with an algorithmic
framework suited to parallelization, that recovers a low-rank matrix with
partially observed entries of mixed data types in one step. We also provide
extensive simulation evidence that corroborates our theoretical results.
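As a point of reference for the Schatten-norm regularization discussed above, a single-distribution baseline can be sketched with the classic Soft-Impute iteration (the nuclear norm is the Schatten-1 norm). This is a generic illustration on assumed synthetic data, not the authors' one-step mixed-type estimator:

```python
import numpy as np

def soft_impute(X, mask, lam=0.1, n_iter=200):
    """Nuclear-norm-regularized matrix completion: iteratively fill
    missing entries with the current estimate, then soft-threshold
    the singular values (the prox operator of the nuclear norm)."""
    Z = np.zeros_like(X)
    for _ in range(n_iter):
        filled = np.where(mask, X, Z)   # keep observed entries, impute the rest
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt
    return Z

# Rank-1 ground truth with roughly half the entries observed.
rng = np.random.default_rng(0)
M = rng.normal(size=(8, 1)) @ rng.normal(size=(1, 6))
mask = rng.random(M.shape) < 0.5
M_hat = soft_impute(M, mask)
```

The soft-threshold level `lam` trades shrinkage bias against rank: larger values force a lower-rank solution.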
Related papers
- Distributional Matrix Completion via Nearest Neighbors in the Wasserstein Space [8.971989179518216]
Given a sparsely observed matrix of empirical distributions, we seek to impute the true distributions associated with both observed and unobserved matrix entries.
We utilize tools from optimal transport to generalize the nearest neighbors method to the distributional setting.
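In one dimension the optimal-transport machinery becomes concrete: with equal sample counts, the Wasserstein-1 distance is the mean absolute difference of sorted samples, and the barycenter is the average of quantile functions. The sketch below imputes a missing entry from the nearest observed row; the entry layout and helper names are illustrative assumptions, not the paper's method:

```python
import numpy as np

def w1(a, b):
    """Wasserstein-1 distance between two 1-D empirical distributions
    with equal sample counts: mean |difference of sorted samples|."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

def impute_entry(D, obs, i, j, k=1):
    """Impute the distribution at (i, j) from the k rows nearest to row i
    (average W1 over commonly observed columns) that observe column j;
    the 1-D Wasserstein barycenter is the average of sorted samples."""
    dists = []
    for r in range(len(D)):
        if r == i or not obs[r][j]:
            continue
        common = [c for c in range(len(D[0]))
                  if c != j and obs[i][c] and obs[r][c]]
        if common:
            d = np.mean([w1(D[i][c], D[r][c]) for c in common])
            dists.append((d, r))
    dists.sort()
    return np.mean([np.sort(D[r][j]) for _, r in dists[:k]], axis=0)

# 3x2 matrix of empirical distributions; rows 0 and 1 are alike, row 2 is not.
rng = np.random.default_rng(1)
D = [[rng.normal(m, 0.1, size=50) for _ in range(2)] for m in (0.0, 0.0, 5.0)]
obs = [[True, False], [True, True], [True, True]]
est = impute_entry(D, obs, i=0, j=1)
```

Here the missing entry (0, 1) is imputed from row 1, the row closest to row 0 in Wasserstein distance, rather than from the dissimilar row 2.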
arXiv Detail & Related papers (2024-10-17T00:50:17Z)
- Robust spectral clustering with rank statistics [0.3823356975862007]
We consider eigenvector-based clustering applied to a matrix of nonparametric rank statistics that is derived entrywise from the raw, original data matrix.
Our main theoretical contributions are threefold and hold under flexible data generating conditions.
For a dataset of human connectomes, our approach yields parsimonious dimensionality reduction and improved recovery of ground-truth neuroanatomical cluster structure.
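The rank-then-spectral idea can be illustrated on a toy two-cluster problem: entrywise ranks are invariant to monotone distortions and tame heavy-tailed noise before the eigenvector step. The noise model and sign-based labeling below are illustrative simplifications, not the paper's estimator:

```python
import numpy as np

def rank_transform(X):
    """Replace each column of X by the ranks of its entries; ranks
    are robust to outliers that would wreck a raw-data spectrum."""
    return np.argsort(np.argsort(X, axis=0), axis=0).astype(float)

# Two clusters of 20 rows each, corrupted by Cauchy noise (assumed toy data).
rng = np.random.default_rng(2)
v = 2.0 * rng.normal(size=(1, 60))
signal = np.vstack([np.ones((20, 1)), -np.ones((20, 1))]) @ v
X = signal + rng.standard_cauchy(size=signal.shape)

R = rank_transform(X)
R -= R.mean(axis=0)                  # center each column of ranks
U, s, Vt = np.linalg.svd(R, full_matrices=False)
labels = (U[:, 0] > 0).astype(int)   # split rows by sign of leading vector
```

On the raw matrix `X` the Cauchy outliers dominate the leading singular vector; on the rank matrix `R` the two row clusters separate cleanly.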
arXiv Detail & Related papers (2024-08-19T16:33:44Z)
- Entrywise error bounds for low-rank approximations of kernel matrices [55.524284152242096]
We derive entrywise error bounds for low-rank approximations of kernel matrices obtained using the truncated eigen-decomposition.
A key technical innovation is a delocalisation result for the eigenvectors of the kernel matrix corresponding to small eigenvalues.
We validate our theory with an empirical study of a collection of synthetic and real-world datasets.
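A quick numerical illustration on an assumed RBF kernel (not one of the paper's datasets): a smooth kernel matrix has rapidly decaying eigenvalues, so truncating its eigendecomposition leaves a tiny worst-case entrywise error:

```python
import numpy as np

# RBF kernel matrix on a 1-D grid; its eigenvalues decay very fast.
x = np.linspace(0.0, 1.0, 40)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.1)

w, V = np.linalg.eigh(K)                  # eigenvalues in ascending order
r = 15
K_r = (V[:, -r:] * w[-r:]) @ V[:, -r:].T  # rank-r truncated eigendecomposition
entrywise_err = np.abs(K - K_r).max()     # worst-case single-entry error
spectral_err = w[-r - 1]                  # largest discarded eigenvalue
```

Since each entry satisfies (K - K_r)_{ij} = e_i^T (K - K_r) e_j, it is bounded by the spectral norm of the remainder, i.e. by the largest discarded eigenvalue; the delocalisation result mentioned above is what sharpens this kind of crude bound.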
arXiv Detail & Related papers (2024-05-23T12:26:25Z)
- Statistical Inference For Noisy Matrix Completion Incorporating Auxiliary Information [3.9748528039819977]
This paper investigates statistical inference for noisy matrix completion in a semi-supervised model.
We apply an iterative least squares (LS) estimation approach in this setting.
We show that our method only needs a few iterations, and the resulting entry-wise estimators of the low-rank matrix and the coefficient matrix are guaranteed to have normal distributions.
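For intuition only, the iterative LS idea can be sketched as plain alternating least squares on the observed entries; the paper's estimator, its use of auxiliary information, and its inferential guarantees go well beyond this sketch:

```python
import numpy as np

def als_complete(X, mask, r=2, n_iter=100, reg=1e-3):
    """Complete X ~ A @ B.T from observed entries by alternating
    ridge-regularized least squares: fix B and solve for each row
    of A, then fix A and solve for each row of B."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    A, B = rng.normal(size=(m, r)), rng.normal(size=(n, r))
    I = reg * np.eye(r)
    for _ in range(n_iter):
        for i in range(m):
            o = mask[i]                       # observed columns in row i
            A[i] = np.linalg.solve(B[o].T @ B[o] + I, B[o].T @ X[i, o])
        for j in range(n):
            o = mask[:, j]                    # observed rows in column j
            B[j] = np.linalg.solve(A[o].T @ A[o] + I, A[o].T @ X[o, j])
    return A @ B.T

# Rank-2 truth, light noise, ~70% of entries observed (assumed toy data).
rng = np.random.default_rng(3)
M = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 15))
mask = rng.random(M.shape) < 0.7
X = M + 0.01 * rng.normal(size=M.shape)
M_hat = als_complete(X, mask)
```

Each inner solve is a small r-by-r ridge system, which is why only a few sweeps are needed when the sampling is reasonably dense.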
arXiv Detail & Related papers (2024-03-22T01:06:36Z)
- Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
arXiv Detail & Related papers (2023-10-10T17:06:41Z)
- A Generalized Latent Factor Model Approach to Mixed-data Matrix Completion with Entrywise Consistency [3.299672391663527]
Matrix completion is a class of machine learning methods that concerns the prediction of missing entries in a partially observed matrix.
We formulate it as a low-rank matrix estimation problem under a general family of non-linear factor models.
We propose entrywise consistent estimators for estimating the low-rank matrix.
arXiv Detail & Related papers (2022-11-17T00:24:47Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Learning Mixtures of Low-Rank Models [89.39877968115833]
We study the problem of learning mixtures of low-rank models.
We develop an algorithm that is guaranteed to recover the unknown matrices with near-optimal sample complexity.
In addition, the proposed algorithm is provably stable against random noise.
arXiv Detail & Related papers (2020-09-23T17:53:48Z)
- Robust Low-rank Matrix Completion via an Alternating Manifold Proximal Gradient Continuation Method [47.80060761046752]
Robust low-rank matrix completion (RMC) has been studied extensively for computer vision, signal processing and machine learning applications.
This problem aims to decompose a partially observed matrix into the superposition of a low-rank matrix and a sparse matrix, where the sparse matrix captures the grossly corrupted entries of the matrix.
A widely used approach to tackle RMC is to consider a convex formulation, which minimizes the nuclear norm of the low-rank matrix (to promote low-rankness) and the l1 norm of the sparse matrix (to promote sparsity).
In this paper, motivated by some recent works on low-
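The convex RMC formulation just described can be sketched with a proximal scheme that alternates singular-value thresholding (the prox of the nuclear norm) with entrywise soft-thresholding (the prox of the l1 norm). The penalties, step size, and synthetic data below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def svt(Z, tau):
    """Singular-value thresholding: prox operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(Z, tau):
    """Entrywise soft-thresholding: prox operator of the l1 norm."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def rmc(X, mask, lam_L=0.5, lam_S=0.1, step=0.5, n_iter=2000):
    """Proximal-gradient sketch for
    min_{L,S} 0.5*||mask*(L+S-X)||_F^2 + lam_L*||L||_* + lam_S*||S||_1."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(n_iter):
        G = mask * (L + S - X)   # gradient of the data-fit term
        L, S = svt(L - step * G, step * lam_L), soft(S - step * G, step * lam_S)
    return L, S

# Rank-1 background plus three gross outliers; ~90% of entries observed.
rng = np.random.default_rng(4)
M = rng.normal(size=(10, 1)) @ rng.normal(size=(1, 10))
X = M.copy()
for i, j in [(0, 0), (2, 5), (7, 3)]:
    X[i, j] += 10.0              # grossly corrupted entries
mask = rng.random(X.shape) < 0.9
mask[0, 0] = mask[2, 5] = mask[7, 3] = True
L, S = rmc(X, mask)
```

At the minimizer the gross corruptions land in the sparse component `S` (their l1 cost is far below their nuclear-norm cost), while `L` stays close to the low-rank background.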
arXiv Detail & Related papers (2020-08-18T04:46:22Z)
- Covariance Estimation for Matrix-valued Data [9.739753590548796]
We propose a class of distribution-free regularized covariance estimation methods for high-dimensional matrix data.
We formulate a unified framework for estimating bandable covariance, and introduce an efficient algorithm based on rank one unconstrained Kronecker product approximation.
We demonstrate the superior finite-sample performance of our methods using simulations and real applications from a gridded temperature anomalies dataset and an S&P 500 stock data analysis.
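The rank-one Kronecker product approximation mentioned above can be sketched with the standard rearrangement trick (Van Loan): permute the entries of C so that a Kronecker product A ⊗ B becomes the rank-one matrix vec(A) vec(B)^T, then take the leading SVD term. This is a generic sketch of the technique, not the paper's constrained estimation algorithm:

```python
import numpy as np

def nearest_kron(C, m, n, p, q):
    """Best Frobenius-norm approximation of C (mp x nq) by A (m x n)
    Kronecker B (p x q): block (i, j) of C equals A[i, j] * B, so
    stacking vectorized blocks gives vec(A) vec(B).T; take its
    rank-one SVD and split the leading singular value."""
    R = np.empty((m * n, p * q))
    k = 0
    for j in range(n):                 # column-major order matches vec(A)
        for i in range(m):
            R[k] = C[i*p:(i+1)*p, j*q:(j+1)*q].reshape(-1, order="F")
            k += 1
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(s[0]) * U[:, 0].reshape(m, n, order="F")
    B = np.sqrt(s[0]) * Vt[0].reshape(p, q, order="F")
    return A, B

# Exactly Kronecker-structured input: recovery should be (numerically) exact.
A0 = np.arange(1.0, 7.0).reshape(2, 3)
B0 = np.array([[0.0, 1.0], [2.0, 3.0]])
C = np.kron(A0, B0)
A, B = nearest_kron(C, 2, 3, 2, 2)
```

A and B are only identified up to a reciprocal scaling, but their Kronecker product is unique, which is what the assertion below checks.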
arXiv Detail & Related papers (2020-04-11T02:15:26Z)
- Relative Error Bound Analysis for Nuclear Norm Regularized Matrix Completion [101.83262280224729]
We develop a relative error bound for nuclear norm regularized matrix completion.
We derive a relative upper bound for recovering the best low-rank approximation of the unknown matrix.
arXiv Detail & Related papers (2015-04-26T13:12:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.