Convex Regression in Multidimensions: Suboptimality of Least Squares
Estimators
- URL: http://arxiv.org/abs/2006.02044v1
- Date: Wed, 3 Jun 2020 04:57:05 GMT
- Title: Convex Regression in Multidimensions: Suboptimality of Least Squares
Estimators
- Authors: Gil Kur, Fuchang Gao, Adityanand Guntuboyina and Bodhisattva Sen
- Abstract summary: The least squares estimator (LSE) is shown to be suboptimal in squared error loss in the usual nonparametric regression model.
For each of these families, the risk of the LSE is proved to be of the order $n^{-2/d}$ (up to logarithmic factors).
- Score: 2.0159253466233222
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The least squares estimator (LSE) is shown to be suboptimal in squared error
loss in the usual nonparametric regression model with Gaussian errors for $d
\geq 5$ for each of the following families of functions: (i) convex functions
supported on a polytope (in fixed design), (ii) bounded convex functions
supported on a polytope (in random design), and (iii) convex Lipschitz
functions supported on any convex domain (in random design). For each of these
families, the risk of the LSE is proved to be of the order $n^{-2/d}$ (up to
logarithmic factors) while the minimax risk is $n^{-4/(d+4)}$, for $d \ge 5$.
In addition, the first rate of convergence results (worst case and adaptive)
for the full convex LSE are established for polytopal domains for all $d \geq
1$. Some new metric entropy results for convex functions are also proved which
are of independent interest.
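As a one-line sanity check (elementary algebra, not stated in the abstract), the LSE rate $n^{-2/d}$ decays more slowly than the minimax rate $n^{-4/(d+4)}$ exactly when

```latex
\frac{2}{d} < \frac{4}{d+4}
\;\Longleftrightarrow\; 2(d+4) < 4d
\;\Longleftrightarrow\; d > 4,
```

so the crossover happens exactly at $d = 5$, matching the theorem's threshold.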
Related papers
- The Sample Complexity Of ERMs In Stochastic Convex Optimization [13.896417716930687]
We show that in fact $\tilde{O}(\frac{d}{\epsilon} + \frac{1}{\epsilon^2})$ data points are also sufficient.
We further generalize the result and show that a similar upper bound holds for all convex bodies.
arXiv Detail & Related papers (2023-11-09T14:29:25Z)
- Revisiting Subgradient Method: Complexity and Convergence Beyond Lipschitz Continuity [29.56022052936922]
The subgradient method is one of the most fundamental algorithmic schemes for nonsmooth optimization.
In this work, we first extend the typical complexity results for the subgradient method to convex and weakly convex objective functions.
We provide convergence results for non-Lipschitz convex and weakly convex objective functions using proper diminishing rules on the step sizes.
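The diminishing-step-size scheme this entry refers to can be sketched on a toy example (hypothetical code, not from the paper; note the toy objective here is Lipschitz, whereas the paper's contribution is extending such guarantees beyond Lipschitz continuity):

```python
# Minimal subgradient method sketch with the classic diminishing step size
# t_k = c / sqrt(k), minimizing the nonsmooth convex function f(x) = |x - 3|.
import math

def subgradient_method(x0, steps=5000, c=1.0):
    x, best = x0, x0
    for k in range(1, steps + 1):
        # A subgradient of |x - 3| is sign(x - 3); take 0 at the kink.
        g = (x > 3) - (x < 3)
        x -= (c / math.sqrt(k)) * g
        # Track the best iterate so far -- standard for subgradient methods,
        # since the last iterate need not be the best one.
        if abs(x - 3) < abs(best - 3):
            best = x
    return best

x_best = subgradient_method(10.0)  # approaches the minimizer x* = 3
```

With step sizes $c/\sqrt{k}$ the cumulative step length grows like $2c\sqrt{K}$, so the iterates reach the minimizer and then oscillate with amplitude on the order of the final step size.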
arXiv Detail & Related papers (2023-05-23T15:26:36Z)
- Efficient displacement convex optimization with particle gradient descent [57.88860627977882]
Particle gradient descent is widely used to optimize functions of probability measures.
This paper considers particle gradient descent with a finite number of particles and establishes its theoretical guarantees to optimize functions that are displacement convex in measures.
arXiv Detail & Related papers (2023-02-09T16:35:59Z)
- Efficient Minimax Optimal Estimators For Multivariate Convex Regression [1.583842747998493]
We present the first computationally efficient minimax optimal (up to logarithmic factors) estimators for the tasks of (i) $L$-Lipschitz convex regression and (ii) $\Gamma$-bounded convex regression with polytopal support.
This work is the first to show the existence of efficient minimax optimal estimators for non-Donsker classes whose corresponding Least Squares Estimators are provably minimax sub-optimal.
arXiv Detail & Related papers (2022-05-06T17:04:05Z)
- Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling [84.47780064014262]
We study a bilinearly coupled convex-concave saddle-point problem $\min_x \max_y \, f(x) + y^\top \mathbf{A} x - g(y)$.
arXiv Detail & Related papers (2021-12-30T20:31:46Z)
- Finding Global Minima via Kernel Approximations [90.42048080064849]
We consider the global minimization of smooth functions based solely on function evaluations.
In this paper, we consider an approach that jointly models the function to approximate and finds a global minimum.
arXiv Detail & Related papers (2020-12-22T12:59:30Z)
- Optimal Robust Linear Regression in Nearly Linear Time [97.11565882347772]
We study the problem of high-dimensional robust linear regression where a learner is given access to $n$ samples from the generative model $Y = \langle X, w^* \rangle + \epsilon$.
We propose estimators for this problem under two settings: (i) $X$ is $L_4$-$L_2$ hypercontractive, $\mathbb{E}[XX^\top]$ has bounded condition number, and $\epsilon$ has bounded variance; and (ii) $X$ is sub-Gaussian with identity second moment and $\epsilon$ is…
arXiv Detail & Related papers (2020-07-16T06:44:44Z)
- Piecewise Linear Regression via a Difference of Convex Functions [50.89452535187813]
We present a new piecewise linear regression methodology that fits a difference of convex functions (DC functions) to the data.
We empirically validate the method, showing it to be practically implementable, and to have comparable performance to existing regression/classification methods on real-world datasets.
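The DC representation behind this entry can be illustrated numerically: every continuous piecewise-linear function is a difference of two convex max-affine functions. A tiny hypothetical example (not the paper's fitting procedure) writes the concave "hat" $f(x) = 1 - |x|$ as such a difference:

```python
import numpy as np

def max_affine(x, slopes, intercepts):
    """Convex piecewise-linear function: pointwise max over affine pieces."""
    return np.max(np.outer(x, slopes) + np.asarray(intercepts), axis=1)

x = np.linspace(-2.0, 2.0, 9)
g = max_affine(x, [0.0], [1.0])             # g(x) = 1 (a single affine piece)
h = max_affine(x, [1.0, -1.0], [0.0, 0.0])  # h(x) = max(x, -x) = |x|
f = g - h                                   # f(x) = 1 - |x|, concave "hat"
```

Fitting in the DC framework amounts to choosing the slopes and intercepts of the two max-affine components from data.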
arXiv Detail & Related papers (2020-07-05T18:58:47Z)
- On Suboptimality of Least Squares with Application to Estimation of Convex Bodies [74.39616164169131]
We settle an open problem regarding optimality of Least Squares in estimating a convex set from noisy support function measurements in dimension $d \geq 6$.
We establish that Least Squares is sub-optimal, and achieves a rate of $\tilde{\Theta}_d(n^{-2/(d-1)})$ whereas the minimax rate is $\Theta_d(n^{-4/(d+3)})$.
arXiv Detail & Related papers (2020-06-07T05:19:00Z)
- Efficient algorithms for multivariate shape-constrained convex regression problems [9.281671380673306]
We prove that the least squares estimator is computable via solving a constrained convex quadratic programming (QP) problem with $(n+1)d$ variables and at least $n(n-1)$ linear inequality constraints.
For solving the generally very large-scale convex QP, we design two efficient algorithms: the symmetric Gauss-Seidel based alternating direction method of multipliers (sGS-ADMM), and the proximal augmented Lagrangian method (pALM) with the subproblems solved by the semismooth Newton method.
arXiv Detail & Related papers (2020-02-26T11:18:43Z)
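In one dimension with sorted design points, the convexity constraints of the QP above reduce to requiring nondecreasing slopes of consecutive chords. A small sketch of this special case using SciPy's generic SLSQP solver (an illustration only; the sGS-ADMM and pALM solvers from the entry above target the large-scale multivariate case):

```python
# 1-D convex least squares regression as a small QP.
import numpy as np
from scipy.optimize import minimize

def convex_lse_1d(x, y):
    """Convex LSE at sorted design points x: minimize sum_i (f_i - y_i)^2
    subject to nondecreasing chord slopes (the 1-D convexity constraint)."""
    n = len(x)
    cons = [{"type": "ineq",
             "fun": lambda f, i=i:
                 (f[i + 2] - f[i + 1]) / (x[i + 2] - x[i + 1])
                 - (f[i + 1] - f[i]) / (x[i + 1] - x[i])}
            for i in range(n - 2)]
    res = minimize(lambda f: np.sum((f - y) ** 2), y,
                   method="SLSQP", constraints=cons)
    return res.x

x = np.linspace(0.0, 1.0, 8)
y = (x - 0.5) ** 2          # already convex, so the LSE reproduces it
fit = convex_lse_1d(x, y)
```

With noisy responses the same call returns the projection of the data onto the convex cone of convex sequences.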
This list is automatically generated from the titles and abstracts of the papers in this site.