Localization, Convexity, and Star Aggregation
- URL: http://arxiv.org/abs/2105.08866v1
- Date: Wed, 19 May 2021 00:47:59 GMT
- Title: Localization, Convexity, and Star Aggregation
- Authors: Suhas Vijaykumar
- Abstract summary: Offset Rademacher complexities have been shown to imply sharp, data-dependent upper bounds for the square loss.
We show that in the statistical setting, the offset bound can be generalized to any loss satisfying a certain uniform convexity condition.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Offset Rademacher complexities have been shown to imply sharp, data-dependent
upper bounds for the square loss in a broad class of problems including
improper statistical learning and online learning. We show that in the
statistical setting, the offset complexity upper bound can be generalized to
any loss satisfying a certain uniform convexity condition. Amazingly, this
condition is shown to also capture exponential concavity and self-concordance,
uniting several apparently disparate results. By a unified geometric argument,
these bounds translate directly to improper learning in a non-convex class
using Audibert's "star algorithm." As applications, we recover the optimal
rates for proper and improper learning with the $p$-loss, $1 < p < \infty$,
closing the gap for $p > 2$, and show that improper variants of empirical risk
minimization can attain fast rates for logistic regression and other
generalized linear models.
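For readers unfamiliar with the objects named in the abstract, here is a minimal sketch (not taken from the paper itself) of the offset Rademacher complexity in the standard form of Liang, Rakhlin, and Sridharan, and of the star hull used by Audibert's star algorithm; the offset constant $c$ and the two-step description are assumptions following the usual presentation of these tools.

```latex
% Offset Rademacher complexity (standard form, after Liang-Rakhlin-Sridharan);
% the offset constant c > 0 is an assumption of this sketch.
\[
  \mathfrak{R}^{\mathrm{off}}_n(\mathcal{F})
  \;=\;
  \mathbb{E}_{\epsilon}\,\sup_{f \in \mathcal{F}}\,
  \frac{1}{n}\sum_{i=1}^{n}
  \Bigl(\epsilon_i f(x_i) - c\, f(x_i)^2\Bigr),
  \qquad \epsilon_1,\dots,\epsilon_n \ \text{i.i.d. Rademacher signs.}
\]

% Audibert's star algorithm, in its usual two-step form:
% (1) compute an empirical risk minimizer \hat{g} over \mathcal{F};
% (2) re-minimize empirical risk over the star hull of \mathcal{F} around \hat{g}:
\[
  \mathrm{star}(\mathcal{F}, \hat{g})
  \;=\;
  \bigl\{\, \lambda f + (1-\lambda)\hat{g}
    \;:\; f \in \mathcal{F},\ \lambda \in [0,1] \,\bigr\}.
\]
```

The second minimizer lies in the star hull rather than in $\mathcal{F}$ itself, which is what makes the procedure improper and lets it avoid the slow rates that ERM can suffer over non-convex classes.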