Computational Barriers to Estimation from Low-Degree Polynomials
- URL: http://arxiv.org/abs/2008.02269v2
- Date: Sat, 18 Jun 2022 23:30:54 GMT
- Title: Computational Barriers to Estimation from Low-Degree Polynomials
- Authors: Tselil Schramm and Alexander S. Wein
- Abstract summary: We study the power of low-degree polynomials for the task of detecting and recovering hidden structures.
For a large class of "signal plus noise" problems, we give a user-friendly lower bound for the best possible mean squared error achievable by any degree-D polynomial.
As applications, we give a tight characterization of the low-degree minimum mean squared error for the planted submatrix and planted dense subgraph problems.
- Score: 81.67886161671379
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One fundamental goal of high-dimensional statistics is to detect or recover
planted structure (such as a low-rank matrix) hidden in noisy data. A growing
body of work studies low-degree polynomials as a restricted model of
computation for such problems: it has been demonstrated in various settings
that low-degree polynomials of the data can match the statistical performance
of the best known polynomial-time algorithms. Prior work has studied the power
of low-degree polynomials for the task of detecting the presence of hidden
structures. In this work, we extend these methods to address problems of
estimation and recovery (instead of detection). For a large class of "signal
plus noise" problems, we give a user-friendly lower bound for the best possible
mean squared error achievable by any degree-D polynomial. To our knowledge,
these are the first results to establish low-degree hardness of recovery
problems for which the associated detection problem is easy. As applications,
we give a tight characterization of the low-degree minimum mean squared error
for the planted submatrix and planted dense subgraph problems, resolving (in
the low-degree framework) open problems about the computational complexity of
recovery in both cases.
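To make the target quantity concrete, the low-degree minimum mean squared error referenced above is typically set up as follows (a notational sketch consistent with the abstract, not a verbatim reproduction of the paper's definitions). For observed data $Y$ and a scalar quantity $x$ of the hidden signal to be estimated,
\[
\mathrm{MMSE}_{\le D} \;:=\; \inf_{f:\ \deg f \le D} \; \mathbb{E}\big[(f(Y) - x)^2\big],
\]
where the infimum ranges over polynomials $f$ of degree at most $D$ in the entries of $Y$; the lower bounds described in the abstract are bounds on this quantity.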
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we study a robust phase retrieval problem in which the task is to recover an unknown signal.
Our proposed approach avoids the need for computationally expensive spectral methods, using a simple gradient step while accounting for outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z) - Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement
Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
arXiv Detail & Related papers (2023-10-10T17:06:41Z) - Efficient Model-Free Exploration in Low-Rank MDPs [76.87340323826945]
Low-Rank Markov Decision Processes offer a simple, yet expressive framework for RL with function approximation.
Existing algorithms are either (1) computationally intractable, or (2) reliant upon restrictive statistical assumptions.
We propose the first provably sample-efficient algorithm for exploration in Low-Rank MDPs.
arXiv Detail & Related papers (2023-07-08T15:41:48Z) - Sparse resultant based minimal solvers in computer vision and their
connection with the action matrix [17.31412310131552]
We show that for some camera geometry problems our sparse resultant-based method leads to smaller and more stable solvers than the state-of-the-art Gröbner basis-based solvers.
It provides a competitive alternative to popular Gröbner basis-based methods for minimal problems in computer vision.
arXiv Detail & Related papers (2023-01-16T14:25:19Z) - Causal discovery under a confounder blanket [9.196779204457059]
Inferring causal relationships from observational data is rarely straightforward, but the problem is especially difficult in high dimensions.
We relax these assumptions and focus on an important but more specialized problem, namely recovering a directed acyclic subgraph.
We derive a complete algorithm for identifying causal relationships under these conditions and implement testing procedures.
arXiv Detail & Related papers (2022-05-11T18:10:45Z) - Minimax rate of consistency for linear models with missing values [0.0]
Missing values arise in most real-world data sets due to the aggregation of multiple sources and intrinsically missing information (sensor failure, unanswered questions in surveys...).
In this paper, we focus on the extensively studied linear models, but in the presence of missing values, which turns out to be quite a challenging task.
This eventually requires solving a number of learning tasks, exponential in the number of input features, which makes prediction impossible for current real-world datasets.
arXiv Detail & Related papers (2022-02-03T08:45:34Z) - Learning Mixtures of Low-Rank Models [89.39877968115833]
We study the problem of learning mixtures of low-rank models.
We develop an algorithm that is guaranteed to recover the unknown matrices with near-optimal sample complexity.
In addition, the proposed algorithm is provably stable against random noise.
arXiv Detail & Related papers (2020-09-23T17:53:48Z) - Statistical Query Algorithms and Low-Degree Tests Are Almost Equivalent [29.684442397855197]
We study two of the most popular restricted computational models, the statistical query framework and low-degree tests, in the context of high-dimensional hypothesis testing.
Under mild conditions on the testing problem, the two classes of algorithms are essentially equivalent in power.
As corollaries, we obtain new statistical query lower bounds for sparse PCA, tensor PCA and several variants of the planted clique problem.
arXiv Detail & Related papers (2020-09-13T22:55:18Z) - Reducibility and Statistical-Computational Gaps from Secret Leakage [19.25775832101647]
We show that a slight generalization of the planted clique conjecture -- secret leakage planted clique -- gives rise to a variety of new average-case reduction techniques.
Using variants of the planted clique conjecture for specific forms of secret leakage planted clique, we deduce tight statistical-computational tradeoffs.
arXiv Detail & Related papers (2020-05-16T20:56:09Z) - High-Dimensional Robust Mean Estimation via Gradient Descent [73.61354272612752]
We show that the problem of robust mean estimation in the presence of a constant fraction of adversarial outliers can be solved by gradient descent (an illustrative sketch of such a gradient-based procedure appears after this list).
Our work establishes an intriguing connection between non-convex optimization and algorithmic robust statistics.
arXiv Detail & Related papers (2020-05-04T10:48:04Z)