Sandwiching Polynomials for Geometric Concepts with Low Intrinsic Dimension
- URL: http://arxiv.org/abs/2602.24178v1
- Date: Fri, 27 Feb 2026 16:59:18 GMT
- Title: Sandwiching Polynomials for Geometric Concepts with Low Intrinsic Dimension
- Authors: Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
- Abstract summary: We give a new method for constructing low-degree sandwiching polynomials that yield greatly improved degree bounds for several fundamental function classes and marginal distributions. Our proof is relatively simple and directly uses the smoothness of the target function's boundary to construct sandwiching Lipschitz functions.
- Score: 23.43080600040766
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent work has shown the surprising power of low-degree sandwiching polynomial approximators in the context of challenging learning settings such as learning with distribution shift, testable learning, and learning with contamination. A pair of sandwiching polynomials approximate a target function in expectation while also providing pointwise upper and lower bounds on the function's values. In this paper, we give a new method for constructing low-degree sandwiching polynomials that yield greatly improved degree bounds for several fundamental function classes and marginal distributions. In particular, we obtain degree $\mathrm{poly}(k)$ sandwiching polynomials for functions of $k$ halfspaces under the Gaussian distribution, improving exponentially over the prior $2^{O(k)}$ bound. More broadly, our approach applies to function classes that are low-dimensional and have smooth boundary. In contrast to prior work, our proof is relatively simple and directly uses the smoothness of the target function's boundary to construct sandwiching Lipschitz functions, which are amenable to results from high-dimensional approximation theory. For low-dimensional polynomial threshold functions (PTFs) with respect to Gaussians, we obtain doubly exponential improvements without applying the FT-mollification method of Kane used in the best previous result.
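For context, the notion referenced in the abstract can be stated as follows (a sketch of the usual definition; the paper's normalization may differ): a pair of degree-$d$ polynomials $(p_{\mathrm{down}}, p_{\mathrm{up}})$ $\epsilon$-sandwiches a function $f$ under a distribution $\mathcal{D}$ if
$$p_{\mathrm{down}}(x) \le f(x) \le p_{\mathrm{up}}(x) \ \text{ for all } x, \qquad \mathbb{E}_{x \sim \mathcal{D}}\bigl[p_{\mathrm{up}}(x) - p_{\mathrm{down}}(x)\bigr] \le \epsilon.$$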
Related papers
- Constructive Approximation under Carleman's Condition, with Applications to Smoothed Analysis [13.02728413691724]
We develop a fairly tight analogue of the underlying Denjoy--Carleman theorem via complex analysis. We establish $L^2$ approximation-theoretic results for functions over general classes of distributions. As another application, we show that the Paley--Wiener class of band-limited functions admits superexponential rates of approximation over all strictly sub-exponential distributions.
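For reference, Carleman's condition in its classical form (a standard fact stated here for context; the paper's precise variant may differ): a distribution on $\mathbb{R}$ with finite moments $m_k = \mathbb{E}[X^k]$ is determined by those moments whenever
$$\sum_{n=1}^{\infty} m_{2n}^{-1/(2n)} = \infty.$$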
arXiv Detail & Related papers (2025-12-04T01:40:05Z) - On Uniform Weighted Deep Polynomial Approximation [0.0]
We introduce and analyze a class of weighted deep polynomial approximants tailored for functions with asymmetric behavior: growing on one side and decaying on the other. We show numerically that this framework outperforms Taylor, Chebyshev, and standard deep polynomial approximants, even when all use the same number of parameters.
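As a toy illustration of the weighted-approximant idea (a minimal sketch, not the paper's construction; the target function and the weight $e^{-x}$ below are arbitrary illustrative choices):

```python
import numpy as np

# Toy sketch: approximate a function that grows on one side and decays on
# the other by a weighted polynomial w(x) * p(x) with w(x) = exp(-x).
# Target and weight are illustrative choices, not taken from the paper.
def fit_weighted_poly(f, xs, degree, w):
    # Least squares: min over p of sum_i (w(x_i) * p(x_i) - f(x_i))^2.
    V = np.vander(xs, degree + 1) * w(xs)[:, None]  # row-weighted Vandermonde
    coef, *_ = np.linalg.lstsq(V, f(xs), rcond=None)
    return lambda x: w(x) * np.polyval(coef, x)

xs = np.linspace(-1.0, 3.0, 400)
target = lambda x: np.exp(-x) * np.sin(3 * x)  # grows left, decays right
approx = fit_weighted_poly(target, xs, degree=10, w=lambda x: np.exp(-x))
print("max abs error:", np.abs(approx(xs) - target(xs)).max())
```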
arXiv Detail & Related papers (2025-06-26T14:25:32Z) - Covering Number of Real Algebraic Varieties and Beyond: Improved Bounds and Applications [8.438718130535296]
We prove upper bounds on the covering number of numerous sets in Euclidean space. We illustrate the power of the result on three computational applications.
arXiv Detail & Related papers (2023-11-09T03:06:59Z) - Stochastic Submodular Maximization via Polynomial Estimators [13.498923494159312]
We focus on maximizing submodular functions that are defined as expectations over a class of submodular functions with an unknown distribution.
We show that for monotone functions of this form, the continuous greedy algorithm attains an approximation ratio (in expectation) arbitrarily close to $(1-1/e) \approx 63\%$ using a polynomial estimator.
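As a minimal sketch of the underlying idea (plain set greedy with Monte Carlo estimates of the expected value; the paper itself analyzes the continuous greedy algorithm with polynomial estimators, and all names below are illustrative):

```python
import random

# Minimal sketch: greedy maximization of a monotone submodular objective
# E_theta[f_theta(S)] known only through sampled realizations. This is a
# simplification: the paper analyzes continuous greedy with polynomial
# estimators, not plain set greedy with Monte Carlo estimates.
def estimate(sample_f, S, n_samples=200):
    # Average f_theta(S) over sampled realizations theta.
    return sum(sample_f(S) for _ in range(n_samples)) / n_samples

def greedy(ground_set, sample_f, k):
    S = set()
    for _ in range(k):
        base = estimate(sample_f, S)
        gains = {e: estimate(sample_f, S | {e}) - base
                 for e in ground_set - S}
        S.add(max(gains, key=gains.get))
    return S

# Example: stochastic coverage, where each covered point counts w.p. 0.7.
universe = set(range(30))
items = {e: set(random.sample(sorted(universe), 8)) for e in range(10)}
def sample_f(S):
    covered = set().union(*(items[e] for e in S)) if S else set()
    return sum(1 for u in covered if random.random() < 0.7)

print(greedy(set(items), sample_f, k=3))
```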
arXiv Detail & Related papers (2023-03-17T13:32:33Z) - The Optimality of Polynomial Regression for Agnostic Learning under
Gaussian Marginals [47.81107898315438]
We develop a method for finding hard families of examples for a wide class of problems by using LP duality.
We show that $L^1$-polynomial regression is essentially best possible, and therefore that the computational difficulty of learning a concept class is closely related to the degree required to approximate any function from the class in the $L^1$-norm.
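A minimal sketch of the algorithmic object in question, degree-$d$ $L^1$ polynomial regression written as a linear program (an illustration under the standard formulation, not code from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Sketch: degree-d L1 polynomial regression, min_c sum_i |p_c(x_i) - y_i|,
# as an LP with slack variables t_i >= |p_c(x_i) - y_i|.
def l1_poly_regression(x, y, d):
    n = len(x)
    V = np.vander(x, d + 1)                      # p_c(x_i) = (V @ c)_i
    # Variables: [c (d+1 coords, free), t (n coords, >= 0)].
    cost = np.concatenate([np.zeros(d + 1), np.ones(n)])
    # V c - t <= y  and  -V c - t <= -y  together encode |V c - y| <= t.
    A_ub = np.block([[V, -np.eye(n)], [-V, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * (d + 1) + [(0, None)] * n
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[: d + 1]                        # polynomial coefficients

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
y = (x > 0.3).astype(float)                      # a halfspace-style label
c = l1_poly_regression(x, y, d=5)
print("L1 error:", np.mean(np.abs(np.polyval(c, x) - y)))
```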
arXiv Detail & Related papers (2021-02-08T18:06:32Z) - Finding Global Minima via Kernel Approximations [90.42048080064849]
We consider the global minimization of smooth functions based solely on function evaluations.
In this paper, we consider an approach that jointly models the function to be approximated and finds a global minimum.
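One way to make "jointly model and minimize" concrete is the kernel sum-of-squares formulation associated with this line of work (a sketch; notation is generic and may differ from the paper): given samples $x_1, \dots, x_n$ and a feature map $\phi$ into an RKHS, maximize $c$ subject to
$$f(x_i) - c = \langle \phi(x_i), A\,\phi(x_i)\rangle \quad (i = 1, \dots, n), \qquad A \succeq 0,$$
so that the positive semidefinite operator $A$ certifies $f \ge c$ at the samples and, under smoothness assumptions, $c$ approaches the global minimum.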
arXiv Detail & Related papers (2020-12-22T12:59:30Z) - On Function Approximation in Reinforcement Learning: Optimism in the
Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the learning problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z) - Reinforcement Learning with General Value Function Approximation:
Provably Efficient Approach via Bounded Eluder Dimension [124.7752517531109]
We establish a provably efficient reinforcement learning algorithm with general value function approximation.
We show that our algorithm achieves a regret bound of $\widetilde{O}(\mathrm{poly}(dH)\sqrt{T})$ where $d$ is a complexity measure.
Our theory generalizes recent progress on RL with linear value function approximation and does not make explicit assumptions on the model of the environment.
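For background, the complexity measure typically at play here is the eluder dimension of Russo and Van Roy (standard definition, not necessarily the paper's exact measure): $x$ is $\epsilon$-dependent on $x_1, \dots, x_n$ with respect to $\mathcal{F}$ if every pair $f, \tilde{f} \in \mathcal{F}$ satisfying
$$\sqrt{\sum_{i=1}^{n} \bigl(f(x_i) - \tilde{f}(x_i)\bigr)^2} \le \epsilon \quad \text{also satisfies} \quad |f(x) - \tilde{f}(x)| \le \epsilon,$$
and the eluder dimension $\dim_E(\mathcal{F}, \epsilon)$ is the length of the longest sequence in which every element is $\epsilon'$-independent of its predecessors for some $\epsilon' \ge \epsilon$.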
arXiv Detail & Related papers (2020-05-21T17:36:09Z) - On Sharpness of Error Bounds for Multivariate Neural Network
Approximation [0.0]
The paper deals with best non-linear approximation by sums of ridge functions, the form computed by single-hidden-layer neural networks.
Error bounds are presented in terms of moduli of smoothness.
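For reference, the $r$-th modulus of smoothness appearing in such bounds (standard definition):
$$\omega_r(f, t)_p = \sup_{0 < |h| \le t} \bigl\|\Delta_h^r f\bigr\|_p, \qquad \Delta_h^r f(x) = \sum_{j=0}^{r} (-1)^{r-j} \binom{r}{j} f(x + jh).$$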
arXiv Detail & Related papers (2020-04-05T14:00:52Z) - SURF: A Simple, Universal, Robust, Fast Distribution Learning Algorithm [64.13217062232874]
SURF is an algorithm for approximating distributions by piecewise polynomials.
It outperforms state-of-the-art algorithms in experiments.
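A toy sketch of the general piecewise-polynomial idea (not SURF's actual interval-selection or fitting procedure; all parameters below are illustrative):

```python
import numpy as np

# Toy sketch of piecewise-polynomial distribution approximation: split the
# support into equal intervals and least-squares fit a low-degree polynomial
# to the empirical histogram density on each piece. SURF itself chooses
# intervals adaptively; this is only the general idea.
def piecewise_poly_density(samples, n_pieces=8, degree=2, n_bins=100):
    lo, hi = samples.min(), samples.max()
    hist, edges = np.histogram(samples, bins=n_bins, range=(lo, hi),
                               density=True)
    centers = (edges[:-1] + edges[1:]) / 2
    cuts = np.linspace(lo, hi, n_pieces + 1)
    pieces = []
    for a, b in zip(cuts[:-1], cuts[1:]):
        mask = (centers >= a) & (centers < b)
        coef = np.polyfit(centers[mask], hist[mask], degree)
        pieces.append((a, b, coef))
    return pieces

rng = np.random.default_rng(1)
samples = rng.normal(size=5000)
for a, b, coef in piecewise_poly_density(samples)[:3]:
    print(f"[{a:.2f}, {b:.2f}): {np.round(coef, 3)}")
```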
arXiv Detail & Related papers (2020-02-22T01:03:33Z) - Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions [84.49087114959872]
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions.
In particular, we study Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions for which the chain rule of calculus holds.
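The approximate stationarity notion used in this setting is typically Goldstein's (stated as background; the paper's exact definition may vary): $x$ is a $(\delta, \epsilon)$-stationary point of $f$ if
$$\operatorname{dist}\Bigl(0, \operatorname{conv}\bigl\{\partial f(y) : \|y - x\| \le \delta\bigr\}\Bigr) \le \epsilon,$$
i.e., some convex combination of generalized gradients taken within a $\delta$-ball of $x$ has norm at most $\epsilon$.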
arXiv Detail & Related papers (2020-02-10T23:23:04Z)