On characterizations of learnability with computable learners
- URL: http://arxiv.org/abs/2202.05041v1
- Date: Thu, 10 Feb 2022 13:57:20 GMT
- Title: On characterizations of learnability with computable learners
- Authors: Tom F. Sterkenburg
- Abstract summary: We study computable PAC (CPAC) learning as introduced by Agarwal et al.
We give a characterization of a closely related notion of strong CPAC learning.
We consider undecidability of (computable) PAC learnability.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study computable PAC (CPAC) learning as introduced by Agarwal et al.
(2020). First, we consider the main open question of finding characterizations
of proper and improper CPAC learning. We give a characterization of a closely
related notion of strong CPAC learning, and we provide a negative answer to the
open problem posed by Agarwal et al. (2021) of whether all decidable PAC learnable
classes are improperly CPAC learnable. Second, we consider undecidability of
(computable) PAC learnability. We give a simple and general argument to exhibit
such undecidability, and we initiate a study of the arithmetical complexity of
learnability. We briefly discuss the relation to the undecidability result of
Ben-David et al. (2019), which motivated the work of Agarwal et al.
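For context, a sketch of the standard notions in textbook form (not the paper's exact formalization): a class $\mathcal{H} \subseteq \{0,1\}^X$ is PAC learnable if there exist a learner $A$ and a sample function $m\colon (0,1)^2 \to \mathbb{N}$ such that for every distribution $D$ and all $\epsilon, \delta \in (0,1)$, a sample $S \sim D^{m(\epsilon,\delta)}$ satisfies, with probability at least $1-\delta$,
$L_D(A(S)) \le \inf_{h \in \mathcal{H}} L_D(h) + \epsilon$.
CPAC learning additionally requires the learner $A$ to be computable; proper (C)PAC learning further requires $A(S) \in \mathcal{H}$, while improper learning allows outputs outside $\mathcal{H}$.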
Related papers
- On the Computability of Robust PAC Learning [9.507869508188266]
We introduce the problem of robust computable PAC (robust CPAC) learning.
We show that learnability in this setup is not implied by the combination of its components.
We prove that its finiteness is necessary, but not sufficient, for robust CPAC learnability.
arXiv Detail & Related papers (2024-06-14T16:20:04Z)
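For context, adversarially robust PAC learning is commonly defined through the robust risk with respect to a perturbation type $\mathcal{U}$ (a standard formulation, not necessarily the one used in this paper):
$L_D^{\mathcal{U}}(h) = \Pr_{(x,y) \sim D}\big[\exists x' \in \mathcal{U}(x)\colon h(x') \ne y\big]$,
and robust CPAC learning asks, roughly, for a computable learner whose output has robust risk close to the best achievable in the class.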
- Optimal Learners for Realizable Regression: PAC Learning and Online Learning [52.37726841759983]
We aim to characterize the statistical complexity of realizable regression both in the PAC learning setting and the online learning setting.
We first introduce a minimax instance optimal learner for realizable regression and propose a novel dimension that both qualitatively and quantitatively characterizes which classes of real-valued predictors are learnable.
In the context of online learning, we provide a dimension that characterizes the minimax instance-optimal cumulative loss up to a constant factor, and we design an optimal online learner for realizable regression.
arXiv Detail & Related papers (2023-07-07T21:39:25Z)
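As a reminder of the setting (one common formalization, assumed here for illustration): hypotheses are real-valued, say $\mathcal{H} \subseteq [0,1]^X$, the loss is e.g. the absolute loss $\ell(h,(x,y)) = |h(x)-y|$, and realizability means the class fits the distribution perfectly, i.e. $\inf_{h \in \mathcal{H}} \mathbb{E}_{(x,y) \sim D}[|h(x)-y|] = 0$.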
- Multiclass Boosting: Simple and Intuitive Weak Learning Criteria [72.71096438538254]
We give a simple and efficient boosting algorithm that does not require realizability assumptions.
We present a new result on boosting for list learners and provide a novel proof for the characterization of multiclass PAC learning.
arXiv Detail & Related papers (2023-07-02T19:26:58Z)
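For context, the classical binary criterion that such work generalizes: a $\gamma$-weak learner returns, for every distribution $D$ it is run on, a hypothesis $h$ with
$\Pr_{(x,y) \sim D}[h(x) \ne y] \le \tfrac{1}{2} - \gamma$
for some fixed edge $\gamma > 0$, and boosting amplifies such weak learners into a strong (PAC) learner; the multiclass criteria of this paper play the analogous role in the multiclass setting.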
- Find a witness or shatter: the landscape of computable PAC learning [63.26015736148707]
This paper contributes to the study of CPAC learnability by solving three open questions from recent papers.
Firstly, we prove that every CPAC learnable class is contained in a class which is properly CPAC learnable with polynomial sample complexity.
Secondly, we show that there exists a decidable hypothesis class which is properly learnable, but only with uncomputably fast-growing sample complexity.
arXiv Detail & Related papers (2023-02-06T02:52:36Z)
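One natural reading of "uncomputably fast-growing" (our gloss, not necessarily the paper's exact statement): the optimal sample function $m(\epsilon,\delta)$ eventually dominates every total computable function, i.e. for every computable $g\colon \mathbb{N} \to \mathbb{N}$ we have $m(1/n,1/n) > g(n)$ for all sufficiently large $n$.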
- On computable learning of continuous features [2.278415626131568]
We introduce definitions of computable PAC learning for binary classification over computable metric spaces.
We also give a presentation of a hypothesis class that does not admit any proper computable PAC learner with computable sample function.
arXiv Detail & Related papers (2021-11-24T02:28:21Z)
- Semi-verified PAC Learning from the Crowd [7.594050968868919]
We study the problem of crowdsourced PAC learning of threshold functions.
We show that under the semi-verified model of Charikar et al., it is possible to PAC learn the underlying hypothesis class with a manageable amount of label queries.
arXiv Detail & Related papers (2021-06-13T20:05:16Z)
- Sample-efficient proper PAC learning with approximate differential privacy [51.09425023771246]
We prove that the sample complexity of properly learning a class of Littlestone dimension $d$ with approximate differential privacy is $\tilde{O}(d^6)$, ignoring privacy and accuracy parameters.
This result answers a question of Bun et al. (FOCS 2020) by improving upon their upper bound of $2^{O(d)}$ on the sample complexity.
arXiv Detail & Related papers (2020-12-07T18:17:46Z)
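As a reminder, the Littlestone dimension of $\mathcal{H}$ is the largest depth $d$ of a complete binary mistake tree shattered by $\mathcal{H}$, or equivalently the optimal mistake bound for $\mathcal{H}$ in online learning; a polynomial bound such as $\tilde{O}(d^6)$ is thus an exponential improvement over $2^{O(d)}$.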
- Probably Approximately Correct Constrained Learning [135.48447120228658]
We develop a generalization theory based on the probably approximately correct (PAC) learning framework.
We show that imposing constraints does not make a learning problem harder, in the sense that any PAC learnable class is also learnable with constraints.
We analyze the properties of this solution and use it to illustrate how constrained learning can address problems in fair and robust classification.
arXiv Detail & Related papers (2020-06-09T19:59:29Z)
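Schematically, the underlying problem is constrained statistical risk minimization (a generic form of the kind of program this framework studies, with illustrative constraint losses $\ell_i$):
$\min_{h \in \mathcal{H}} \mathbb{E}_{(x,y) \sim D}[\ell_0(h(x),y)] \quad \text{s.t.} \quad \mathbb{E}_{(x,y) \sim D}[\ell_i(h(x),y)] \le c_i, \quad i = 1,\dots,m,$
where the $\ell_i$ can encode, for example, fairness or robustness requirements.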
- PACOH: Bayes-Optimal Meta-Learning with PAC-Guarantees [77.67258935234403]
We provide a theoretical analysis using the PAC-Bayesian framework and derive novel generalization bounds for meta-learning.
We develop a class of PAC-optimal meta-learning algorithms with performance guarantees and a principled meta-level regularization.
arXiv Detail & Related papers (2020-02-13T15:01:38Z)
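For reference, a classical PAC-Bayesian bound of the general shape such analyses build on (McAllester/Maurer form for losses in $[0,1]$; PACOH derives meta-level analogues rather than this exact inequality): with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for all posteriors $Q$,
$\mathbb{E}_{h \sim Q}[L(h)] \le \mathbb{E}_{h \sim Q}[\hat{L}(h)] + \sqrt{\dfrac{\mathrm{KL}(Q\,\|\,P) + \ln(2\sqrt{n}/\delta)}{2n}}$,
where $P$ is a prior fixed before seeing the data.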
This list is automatically generated from the titles and abstracts of the papers on this site.