Randomised Composition and Small-Bias Minimax
- URL: http://arxiv.org/abs/2208.12896v1
- Date: Fri, 26 Aug 2022 23:32:19 GMT
- Title: Randomised Composition and Small-Bias Minimax
- Authors: Shalev Ben-David, Eric Blais, Mika Göös, Gilbert Maystre
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We prove two results about randomised query complexity $\mathrm{R}(f)$.
First, we introduce a "linearised" complexity measure $\mathrm{LR}$ and show
that it satisfies an inner-optimal composition theorem: $\mathrm{R}(f\circ g)
\geq \Omega(\mathrm{R}(f) \mathrm{LR}(g))$ for all partial $f$ and $g$, and
moreover, $\mathrm{LR}$ is the largest possible measure with this property. In
particular, $\mathrm{LR}$ can be polynomially larger than previous measures
that satisfy an inner composition theorem, such as the max-conflict complexity
of Gavinsky, Lee, Santha, and Sanyal (ICALP 2019).
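To make the shape of a composition theorem concrete, here is a minimal toy sketch (the functions, names, and query-counting harness are our own, not from the paper): in $f \circ g$, each input bit of $f$ is replaced by a copy of $g$ on a fresh block of variables. The natural algorithm that solves every inner copy exactly illustrates the easy upper-bound direction $\mathrm{D}(f\circ g) \leq \mathrm{D}(f)\cdot\mathrm{D}(g)$; the theorem above concerns the much harder matching lower-bound direction for randomised algorithms.

```python
from itertools import product

def maj3(bits):
    # Inner function g: majority of 3 bits.
    return int(sum(bits) >= 2)

def parity2(bits):
    # Outer function f: parity of 2 bits.
    return bits[0] ^ bits[1]

def composed(x):
    # (f o g) on 6 bits: f applied to g evaluated on each 3-bit block.
    return parity2([maj3(x[0:3]), maj3(x[3:6])])

class CountingOracle:
    """Wraps an input and counts how many bits the algorithm queries."""
    def __init__(self, x):
        self.x, self.queries = x, 0
    def __getitem__(self, i):
        self.queries += 1
        return self.x[i]

def eval_maj3_adaptive(oracle, offset):
    # Decision tree for MAJ3: 2 queries if the first two bits agree,
    # otherwise the third bit decides the majority.
    a, b = oracle[offset], oracle[offset + 1]
    return a if a == b else oracle[offset + 2]

def eval_composed(oracle):
    # Naive composed algorithm: solve each inner copy, then apply f.
    # Worst-case query cost is D(f) * D(g) = 2 * 3 = 6.
    return eval_maj3_adaptive(oracle, 0) ^ eval_maj3_adaptive(oracle, 3)

worst = 0
for x in product([0, 1], repeat=6):
    o = CountingOracle(list(x))
    assert eval_composed(o) == composed(list(x))
    worst = max(worst, o.queries)
print(worst)  # prints 6, i.e. D(f) * D(g) for this toy pair
```

The lower-bound question is the interesting one: whether every randomised algorithm for $f \circ g$ must pay roughly $\mathrm{R}(f)$ times some complexity measure of $g$, and the paper identifies $\mathrm{LR}(g)$ as the largest measure that always works.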
Our second result addresses a question of Yao (FOCS 1977). He asked if
$\epsilon$-error expected query complexity $\bar{\mathrm{R}}_{\epsilon}(f)$
admits a distributional characterisation relative to some hard input
distribution. Vereshchagin (TCS 1998) answered this question affirmatively in
the bounded-error case. We show that an analogous theorem fails in the
small-bias case $\epsilon=1/2-o(1)$.
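For readers unfamiliar with the distributional characterisation in question, the bounded-error statement can be written as follows (notation paraphrased by us); Vereshchagin's affirmative answer says the inequality is tight up to constants, and the second result above shows that no analogous tightness survives at bias $o(1)$:

```latex
% Yao's minimax inequality for expected query complexity (our paraphrase):
% every input distribution \mu yields a lower bound on randomised cost.
\[
  \bar{\mathrm{R}}_{\epsilon}(f) \;\geq\; \max_{\mu}\, \bar{\mathrm{D}}^{\mu}_{\epsilon}(f)
\]
% Here \bar{\mathrm{D}}^{\mu}_{\epsilon}(f) denotes the least expected cost
% (over inputs drawn from \mu) of a deterministic algorithm that errs with
% probability at most \epsilon. Vereshchagin (TCS 1998) showed this holds
% with equality up to constant factors for constant \epsilon < 1/2; the
% present paper rules out such a reverse bound for \epsilon = 1/2 - o(1).
```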
Related papers
- The Communication Complexity of Approximating Matrix Rank [50.6867896228563]
We show that this problem has randomized communication complexity $\Omega(\frac{1}{k}\cdot n^2\log|\mathbb{F}|)$.
As an application, we obtain an $\Omega(\frac{1}{k}\cdot n^2\log|\mathbb{F}|)$ space lower bound for any streaming algorithm with $k$ passes.
arXiv Detail & Related papers (2024-10-26T06:21:42Z)
- Statistical Query Lower Bounds for Learning Truncated Gaussians [43.452452030671694]
We show that the complexity of any SQ algorithm for this problem is $d^{\mathrm{poly}(1/\epsilon)}$, even when the class $\mathcal{C}$ is simple, so that $\mathrm{poly}(d/\epsilon)$ samples information-theoretically suffice.
arXiv Detail & Related papers (2024-03-04T18:30:33Z) - Testing Closeness of Multivariate Distributions via Ramsey Theory [40.926523210945064]
We investigate the statistical task of closeness (or equivalence) testing for multidimensional distributions.
Specifically, given sample access to two unknown distributions $\mathbf{p}, \mathbf{q}$ on $\mathbb{R}^d$, we want to distinguish between the case that $\mathbf{p}=\mathbf{q}$ versus $\|\mathbf{p}-\mathbf{q}\|_{\mathcal{A}_k} > \epsilon$.
Our main result is the first closeness tester for this problem with *sub-learning* sample complexity in any fixed dimension.
arXiv Detail & Related papers (2023-11-22T04:34:09Z) - Near-Optimal Bounds for Learning Gaussian Halfspaces with Random
Classification Noise [50.64137465792738]
We show that any efficient SQ algorithm for the problem requires sample complexity at least $Omega(d1/2/(maxp, epsilon)2)$.
Our lower bound suggests that this quadratic dependence on $1/epsilon$ is inherent for efficient algorithms.
arXiv Detail & Related papers (2023-07-13T18:59:28Z) - Learning a Single Neuron with Adversarial Label Noise via Gradient
Descent [50.659479930171585]
We study a function of the form $mathbfxmapstosigma(mathbfwcdotmathbfx)$ for monotone activations.
The goal of the learner is to output a hypothesis vector $mathbfw$ that $F(mathbbw)=C, epsilon$ with high probability.
arXiv Detail & Related papers (2022-06-17T17:55:43Z) - Low-degree learning and the metric entropy of polynomials [44.99833362998488]
We prove that any (deterministic or randomized) algorithm which learns $mathscrF_nd$ with $L$-accuracy $varepsilon$ requires at least $Omega(sqrtvarepsilon)2dlog n leq log mathsfM(mathscrF_n,d,|cdot|_L,varepsilon) satisfies the two-sided estimate $$c (1-varepsilon)2dlog
arXiv Detail & Related papers (2022-03-17T23:52:08Z) - The Complexity of Dynamic Least-Squares Regression [11.815510373329337]
complexity of dynamic least-squares regression.
Goal is to maintain an $epsilon-approximate solution to $min_mathbfx(t)| mathbfA(t) mathbfb(t) |$ for all $tin.
arXiv Detail & Related papers (2022-01-01T18:36:17Z) - Threshold Phenomena in Learning Halfspaces with Massart Noise [56.01192577666607]
We study the problem of PAC learning halfspaces on $mathbbRd$ with Massart noise under Gaussian marginals.
Our results qualitatively characterize the complexity of learning halfspaces in the Massart model.
arXiv Detail & Related papers (2021-08-19T16:16:48Z) - An Optimal Separation of Randomized and Quantum Query Complexity [67.19751155411075]
We prove that for every decision tree, the absolute values of the Fourier coefficients of a given order $ellsqrtbinomdell (1+log n)ell-1,$ sum to at most $cellsqrtbinomdell (1+log n)ell-1,$ where $n$ is the number of variables, $d$ is the tree depth, and $c>0$ is an absolute constant.
arXiv Detail & Related papers (2020-08-24T06:50:57Z) - A Tight Composition Theorem for the Randomized Query Complexity of
Partial Functions [1.2284934135116514]
We prove two new results about the randomized query complexity of composed functions.
We show that for all $f$ and $g$, $\mathrm{R}(f \circ g) = \Omega(\mathrm{noisyR}(f) \cdot \mathrm{R}(g))$, where $\mathrm{noisyR}(f)$ is a measure describing the cost of computing $f$ on noisy inputs.
arXiv Detail & Related papers (2020-02-25T11:58:14Z) - On the Complexity of Minimizing Convex Finite Sums Without Using the
Indices of the Individual Functions [62.01594253618911]
We exploit the finite noise structure of finite sums to derive a matching $O(n^2)$-upper bound under the global oracle model.
Following a similar approach, we propose a novel adaptation of SVRG which is both compatible with oracles and achieves complexity bounds of $\tilde{O}((n^2+n\sqrt{L/\mu})\log(1/\epsilon))$ and $O(n\sqrt{L/\epsilon})$, for $\mu>0$ and $\mu=0$, respectively.
arXiv Detail & Related papers (2020-02-09T03:39:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.