A Computational Separation between Private Learning and Online Learning
- URL: http://arxiv.org/abs/2007.05665v1
- Date: Sat, 11 Jul 2020 02:41:54 GMT
- Title: A Computational Separation between Private Learning and Online Learning
- Authors: Mark Bun
- Abstract summary: A concept class is privately learnable if and only if it is online learnable.
This equivalence incurs significant losses in both sample and computational efficiency.
We show that, assuming the existence of one-way functions, such an efficient conversion is impossible even for general pure-private learners with polynomial sample complexity.
- Score: 19.001036556917818
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A recent line of work has shown a qualitative equivalence between
differentially private PAC learning and online learning: A concept class is
privately learnable if and only if it is online learnable with a finite mistake
bound. However, both directions of this equivalence incur significant losses in
both sample and computational efficiency. Studying a special case of this
connection, Gonen, Hazan, and Moran (NeurIPS 2019) showed that uniform or
highly sample-efficient pure-private learners can be time-efficiently compiled
into online learners. We show that, assuming the existence of one-way
functions, such an efficient conversion is impossible even for general
pure-private learners with polynomial sample complexity. This resolves a
question of Neel, Roth, and Wu (FOCS 2019).
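To make the notion of a pure-private (that is, (epsilon, 0)-differentially private) PAC learner concrete, the following is a minimal illustrative sketch, not a construction from the paper: the classical exponential-mechanism learner over a finite hypothesis class, which satisfies pure differential privacy because the empirical error count has sensitivity 1. The function name, the toy threshold class, and all parameter choices are assumptions made for illustration.

```python
import numpy as np

def exponential_mechanism_learner(sample, hypotheses, epsilon, rng=None):
    """Pure (epsilon, 0)-DP selection of a hypothesis from a finite class.

    The score of each hypothesis is its empirical error count, which has
    sensitivity 1 when a single example is changed, so sampling with
    probability proportional to exp(-epsilon * error / 2) satisfies
    pure differential privacy.
    """
    rng = rng or np.random.default_rng()
    errors = np.array(
        [sum(h(x) != y for x, y in sample) for h in hypotheses], dtype=float
    )
    # Shift by the minimum error for numerical stability; the shift
    # cancels after normalization.
    weights = np.exp(-0.5 * epsilon * (errors - errors.min()))
    probs = weights / weights.sum()
    return hypotheses[rng.choice(len(hypotheses), p=probs)]

# Toy usage: privately learn a threshold concept over {0, ..., 9}.
thresholds = [(lambda x, t=t: int(x >= t)) for t in range(10)]
data = [(x, int(x >= 6)) for x in (1, 3, 5, 6, 7, 8, 9)]
h_priv = exponential_mechanism_learner(data, thresholds, epsilon=1.0)
print([h_priv(x) for x in range(10)])
```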
Related papers
- Probabilistic Contrastive Learning for Long-Tailed Visual Recognition [78.70453964041718]
Long-tailed distributions frequently emerge in real-world data, where a large number of minority categories contain a limited number of samples.
Recent investigations have revealed that supervised contrastive learning exhibits promising potential in alleviating the data imbalance.
We propose a novel probabilistic contrastive (ProCo) learning algorithm that estimates the data distribution of the samples from each class in the feature space.
arXiv Detail & Related papers (2024-03-11T13:44:49Z) - Private PAC Learning May be Harder than Online Learning [14.180842831990999]
We show that any concept class of Littlestone dimension $d$ can be privately PAC learned using $\mathrm{poly}(d)$ samples.
This raises the natural question of whether there might be a generic conversion from online learners to private PAC learners.
arXiv Detail & Related papers (2024-02-16T22:44:52Z) - Langevin Unlearning: A New Perspective of Noisy Gradient Descent for Machine Unlearning [20.546589699647416]
Privacy is defined as statistical indistinguishability to retraining from scratch.
We propose Langevin unlearning, an unlearning framework based on noisy gradient descent.
arXiv Detail & Related papers (2024-01-18T20:35:47Z) - On Differentially Private Online Predictions [74.01773626153098]
We introduce an interactive variant of joint differential privacy towards handling online processes.
We demonstrate that it satisfies (suitable variants of) group privacy, composition, and post-processing.
We then study the cost of interactive joint privacy in the basic setting of online classification.
arXiv Detail & Related papers (2023-02-27T19:18:01Z) - Learning versus Refutation in Noninteractive Local Differential Privacy [133.80204506727526]
We study two basic statistical tasks in non-interactive local differential privacy (LDP): learning and refutation.
Our main result is a complete characterization of the sample complexity of PAC learning for non-interactive LDP protocols.
arXiv Detail & Related papers (2022-10-26T03:19:24Z) - Simple Stochastic and Online Gradient Descent Algorithms for Pairwise
Learning [65.54757265434465]
Pairwise learning refers to learning tasks where the loss function depends on a pair of instances.
Online gradient descent (OGD) is a popular approach to handle streaming data in pairwise learning.
In this paper, we propose simple stochastic and online gradient descent methods for pairwise learning.
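As a concrete illustration of online gradient descent in the pairwise setting, and not the specific algorithm proposed in that paper, the sketch below runs OGD on a pairwise hinge loss, pairing each incoming example with all previously seen examples of the opposite label; the function name, buffering scheme, and step size are illustrative assumptions.

```python
import numpy as np

def pairwise_ogd(stream, dim, eta=0.1):
    """Online (sub)gradient descent on a pairwise hinge loss.

    Each arriving example (x_t, y_t), with y in {0, 1}, is paired with all
    previously seen examples of the opposite label, and the linear model w
    is updated on the loss max(0, 1 - (y_t - y_s) * w.(x_t - x_s)).
    """
    w = np.zeros(dim)
    seen = []
    for x_t, y_t in stream:
        grad = np.zeros(dim)
        for x_s, y_s in seen:
            if y_s == y_t:
                continue  # only opposite-label pairs contribute to the loss
            diff = x_t - x_s
            if (y_t - y_s) * w.dot(diff) < 1.0:
                grad -= (y_t - y_s) * diff  # hinge-loss subgradient
        w -= eta * grad
        seen.append((x_t, y_t))
    return w

# Toy usage: a small, roughly separable stream in two dimensions.
rng = np.random.default_rng(0)
stream = [(rng.normal(size=2) + 2.0 * y, int(y)) for y in rng.integers(0, 2, 50)]
print(pairwise_ogd(stream, dim=2))
```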
arXiv Detail & Related papers (2021-11-23T18:10:48Z) - Multiclass versus Binary Differentially Private PAC Learning [32.22526322514124]
We show a generic reduction from multiclass differentially private PAC learning to binary private PAC learning.
Our proof extends the notion of $\Psi$-dimension defined in work of Ben-David et al. [JCSS '95] to the online setting and explores its general properties.
arXiv Detail & Related papers (2021-07-22T18:06:39Z) - Sample-efficient proper PAC learning with approximate differential
privacy [51.09425023771246]
We prove that the sample complexity of properly learning a class of Littlestone dimension $d$ with approximate differential privacy is $\tilde{O}(d^6)$, ignoring privacy and accuracy parameters.
This result answers a question of Bun et al. (FOCS 2020) by improving upon their upper bound of $2^{O(d)}$ on the sample complexity.
arXiv Detail & Related papers (2020-12-07T18:17:46Z) - On the Equivalence between Online and Private Learnability beyond Binary
Classification [26.400891660337777]
We show that private learnability implies online learnability in both settings.
We show that while online learnability continues to imply private learnability in multi-class classification, current proof techniques encounter significant hurdles in the regression setting.
arXiv Detail & Related papers (2020-06-02T23:30:41Z) - An Equivalence Between Private Classification and Online Prediction [43.37646702577754]
We prove that every concept class with finite Littlestone dimension can be learned by an (approximate) differentially-private algorithm.
We introduce a new notion of stability called "global stability" which is essential to our proof and may be of independent interest.
arXiv Detail & Related papers (2020-03-01T19:20:37Z)