A Strongly Polynomial Algorithm for Approximate Forster Transforms and its Application to Halfspace Learning
- URL: http://arxiv.org/abs/2212.03008v1
- Date: Tue, 6 Dec 2022 14:32:02 GMT
- Title: A Strongly Polynomial Algorithm for Approximate Forster Transforms and its Application to Halfspace Learning
- Authors: Ilias Diakonikolas and Christos Tzamos and Daniel M. Kane
- Abstract summary: The Forster transform is a method of regularizing a dataset by placing it in radial isotropic position while maintaining some of its essential properties.
- Score: 56.86097988183311
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Forster transform is a method of regularizing a dataset by placing it in
{\em radial isotropic position} while maintaining some of its essential
properties. Forster transforms have played a key role in a diverse range of
settings spanning computer science and functional analysis. Prior work had
given {\em weakly} polynomial time algorithms for computing Forster transforms,
when they exist. Our main result is the first {\em strongly polynomial time}
algorithm to compute an approximate Forster transform of a given dataset or
certify that no such transformation exists. By leveraging our strongly
polynomial Forster algorithm, we obtain the first strongly polynomial time
algorithm for {\em distribution-free} PAC learning of halfspaces. This learning
result is surprising because {\em proper} PAC learning of halfspaces is {\em
equivalent} to linear programming. Our learning approach extends to give a
strongly polynomial halfspace learner in the presence of random classification
noise and, more generally, Massart noise.
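A dataset $x_1, \dots, x_n \in \mathbb{R}^d$ is in radial isotropic position when the unit vectors $y_i = x_i / \|x_i\|$ satisfy $\sum_i y_i y_i^\top = (n/d) I$. The snippet below is a minimal sketch of that definition: a plain fixed-point iteration that rescales the data toward radial isotropic position. It is not the paper's strongly polynomial algorithm; the function name, iteration count, and tolerance are illustrative assumptions.
```python
import numpy as np

def approx_forster_transform(X, n_iters=500, tol=1e-6):
    """Naive fixed-point iteration toward radial isotropic position (illustrative only)."""
    n, d = X.shape
    A = np.eye(d)
    for _ in range(n_iters):
        Y = X @ A.T                                       # rows are A @ x_i
        Y = Y / np.linalg.norm(Y, axis=1, keepdims=True)  # normalize onto the unit sphere
        M = (d / n) * (Y.T @ Y)                           # scaled second-moment matrix
        if np.linalg.norm(M - np.eye(d)) < tol:
            break                                         # approximately radially isotropic
        # Multiply A on the left by M^{-1/2}.  If M becomes (near-)singular,
        # the dataset admits no Forster transform and the iteration degenerates.
        w, V = np.linalg.eigh(M)
        A = (V * (1.0 / np.sqrt(w))) @ V.T @ A
    return A

# Random points almost surely admit an approximate Forster transform.
X = np.random.default_rng(0).standard_normal((100, 5))
A = approx_forster_transform(X)
Y = X @ A.T
Y /= np.linalg.norm(Y, axis=1, keepdims=True)
print(np.round((5 / 100) * (Y.T @ Y), 3))  # close to the 5x5 identity
```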
Related papers
- Learning Polynomial Transformations [41.30404402331856]
We consider the problem of learning high dimensional quadratic transformations of Gaussians.
Our results extend to any rotation-invariant input distribution, not just Gaussian.
We also give the first polynomial-time algorithms with provable guarantees for tensor ring decomposition.
arXiv Detail & Related papers (2022-04-08T17:59:31Z)
- Forster Decomposition and Learning Halfspaces with Noise [60.691817861402676]
A Forster transform is an operation that turns a distribution into one with good anti-concentration properties.
We show that any distribution can be decomposed efficiently as a disjoint mixture of few distributions for which a Forster transform exists and can be computed efficiently.
arXiv Detail & Related papers (2021-07-12T17:00:59Z)
- A Reinforcement Learning Environment for Polyhedral Optimizations [68.8204255655161]
We propose a shape-agnostic formulation for the space of legal transformations in the polyhedral model as a Markov Decision Process (MDP).
Instead of using transformations, the formulation is based on an abstract space of possible schedules.
Our generic MDP formulation enables using reinforcement learning to learn optimization policies over a wide range of loops.
arXiv Detail & Related papers (2021-04-28T12:41:52Z)
- Metric Transforms and Low Rank Matrices via Representation Theory of the Real Hyperrectangle [17.808087068037985]
We show how to compute the eigenvectors and eigenvalues of certain matrices arising from hyperrectangles.
We then use our new technique along with these connections to prove several new structural results.
arXiv Detail & Related papers (2020-11-23T16:03:12Z)
- A Polynomial Time Algorithm for Learning Halfspaces with Tsybakov Noise [55.45544638410858]
We study the problem of PAC learning homogeneous halfspaces in the presence of Tsybakov noise.
Our algorithm learns the true halfspace within any accuracy $\epsilon$.
arXiv Detail & Related papers (2020-10-04T22:19:06Z)
- Learning Halfspaces with Tsybakov Noise [50.659479930171585]
We study the learnability of halfspaces in the presence of Tsybakov noise.
We give an algorithm that achieves misclassification error $\epsilon$ with respect to the true halfspace.
arXiv Detail & Related papers (2020-06-11T14:25:02Z)
- Quantum Gradient Algorithm for General Polynomials [5.008814514502094]
Gradient-based algorithms, popular strategies to optimization problems, are essential for many modern machine-learning techniques.
We propose a quantum gradient algorithm for optimizing general polynomials with the dressed amplitude encoding.
This quantum algorithm is expected to facilitate gradient-based optimization in high-dimensional problems.
arXiv Detail & Related papers (2020-04-23T11:28:45Z)
- Sketching Transformed Matrices with Applications to Natural Language Processing [76.6222695417524]
We propose a space-efficient sketching algorithm for computing the product of a given small matrix with the transformed matrix.
We show that our approach obtains small error and is efficient in both space and time.
arXiv Detail & Related papers (2020-02-23T03:07:31Z)
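As a rough illustration of the sketching idea in "Sketching Transformed Matrices with Applications to Natural Language Processing" above (not that paper's actual algorithm), the snippet below approximates the product of a small matrix with an entrywise-transformed large matrix through a Gaussian random sketch. All names and sizes are illustrative assumptions; the error shrinks roughly like $1/\sqrt{k}$ as the sketch width $k$ grows.
```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m, k = 10_000, 50, 8, 2_000      # A: n x d, B: m x n, sketch width k

A = rng.standard_normal((n, d))
f = np.tanh                            # an entrywise transformation of A
B = f(A)[:, :m].T                      # a small matrix correlated with f(A)

S = rng.standard_normal((k, n)) / np.sqrt(k)   # Gaussian sketch with E[S.T @ S] = I_n
approx = (B @ S.T) @ (S @ f(A))        # multiply through the width-k sketch
exact = B @ f(A)                       # full product, for comparison only

rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative Frobenius error: {rel_err:.3f}")   # decreases as k grows
```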