Efficient Algorithms for Learning and Compressing Monophonic Halfspaces in Graphs
- URL: http://arxiv.org/abs/2506.23186v1
- Date: Sun, 29 Jun 2025 11:14:16 GMT
- Title: Efficient Algorithms for Learning and Compressing Monophonic Halfspaces in Graphs
- Authors: Marco Bressan, Victor Chepoi, Emmanuel Esposito, Maximilian Thiessen
- Abstract summary: We study monophonic halfspaces, a notion of graph halfspaces defined through closure under induced paths. Our results answer open questions from the literature, and show a stark contrast with geodesic halfspaces, for which most of the said learning problems are NP-hard.
- Score: 6.974741712647656
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Abstract notions of convexity over the vertices of a graph, and corresponding notions of halfspaces, have recently gained attention from the machine learning community. In this work we study monophonic halfspaces, a notion of graph halfspaces defined through closure under induced paths. Our main result is a $2$-satisfiability based decomposition theorem, which allows one to represent monophonic halfspaces as a disjoint union of certain vertex subsets. Using this decomposition, we achieve efficient and (nearly) optimal algorithms for various learning problems, such as teaching, active, and online learning. Most notably, we obtain a polynomial-time algorithm for empirical risk minimization. Independently of the decomposition theorem, we obtain an efficient, stable, and proper sample compression scheme. This makes monophonic halfspaces efficiently learnable with proper learners and linear error rate $1/\varepsilon$ in the realizable PAC setting. Our results answer open questions from the literature, and show a stark contrast with geodesic halfspaces, for which most of the said learning problems are NP-hard.
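To make the central definition concrete, here is a brute-force Python sketch (illustrative only: exponential time, networkx-based, with function names invented here) that checks whether a vertex set is a monophonic halfspace, i.e., whether the set and its complement are both closed under induced (chordless) paths. The paper's $2$-satisfiability based decomposition is precisely what makes such objects tractable to represent and learn, in contrast to this naive enumeration.
```python
from itertools import combinations
import networkx as nx

def is_monophonically_convex(G, S):
    """True iff S contains every vertex lying on an induced
    (chordless) path between two vertices of S. Brute force:
    enumerates all simple paths, so small graphs only."""
    S = set(S)
    for u, v in combinations(S, 2):
        for path in nx.all_simple_paths(G, u, v):
            # A path is induced iff it has no chord, i.e. no edge
            # between two non-consecutive vertices of the path.
            induced = all(
                not G.has_edge(path[i], path[j])
                for i in range(len(path))
                for j in range(i + 2, len(path))
            )
            if induced and not S.issuperset(path):
                return False
    return True

def is_monophonic_halfspace(G, S):
    """S is a monophonic halfspace iff S and its complement are
    both monophonically convex (conventions on empty sides vary)."""
    comp = set(G.nodes) - set(S)
    return (is_monophonically_convex(G, S)
            and is_monophonically_convex(G, comp))

# On the 4-cycle 0-1-2-3-0, the side {0, 1} is a halfspace:
G = nx.cycle_graph(4)
print(is_monophonic_halfspace(G, {0, 1}))  # True
```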
Related papers
- Efficient Algorithms for Learning Monophonic Halfspaces in Graphs [7.619404259039284]
We prove several novel results for learning monophonic halfspaces in the supervised, online, and active settings.
Our main result is that a monophonic halfspace can be learned with near-optimal sample complexity in time polynomial in $n = |V(G)|$.
We also show that the concept class can be enumerated with delay $\operatorname{poly}(n)$, and that empirical risk minimization can be performed in time $2^{\omega(G)}\operatorname{poly}(n)$, where $\omega(G)$ is the clique number of $G$.
arXiv Detail & Related papers (2024-05-01T20:34:12Z)
- Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization, as sketched below.
arXiv Detail & Related papers (2021-11-01T21:06:35Z)
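The paper's actual methods are more involved; purely as a hedged illustration of the general recipe of alternating a gradient step for the convex (continuous) player with a greedy step for the submodular (discrete) player, here is a toy numpy sketch. Everything in it (objective, step size, iteration count) is invented for the example.
```python
import numpy as np

# Toy instance of a convex-submodular minimax problem
#   min_x max_{|S| <= k} f(x, S)
# where f is convex (quadratic) in x and modular (hence
# submodular) in S.

def f(x, S, c):
    idx = list(S)
    return float(np.sum((x[idx] - c[idx]) ** 2))

def greedy_max(x, c, k, n):
    # Greedy best response for the discrete max player; exact for
    # modular objectives, (1 - 1/e)-approximate for general monotone
    # submodular ones under a cardinality constraint.
    S = set()
    for _ in range(k):
        gains = {i: f(x, S | {i}, c) - f(x, S, c)
                 for i in range(n) if i not in S}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        S.add(best)
    return S

def grad_step(x, S, c, lr=0.1):
    # Gradient step for the continuous min player on the current S.
    g = np.zeros_like(x)
    idx = list(S)
    g[idx] = 2.0 * (x[idx] - c[idx])
    return x - lr * g

rng = np.random.default_rng(0)
n, k = 10, 3
c = rng.normal(size=n)
x = np.zeros(n)
for _ in range(200):              # alternate the two players
    S = greedy_max(x, c, k, n)    # discrete (submodular) step
    x = grad_step(x, S, c)        # continuous (convex) step
print(sorted(S), round(f(x, S, c), 4))
```
Here the greedy step drives the selected coordinates of $x$ toward $c$, after which the max player shifts to other coordinates; the alternation is the point of the sketch, not the particular objective.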
- Learning Weakly Convex Sets in Metric Spaces [2.0618817976970103]
A central problem in the theory of machine learning is whether it is possible to efficiently find a consistent hypothesis, i.e., one with zero error on the training data.
We show that the general idea of our algorithm can even be extended to the case of weakly convex hypotheses.
arXiv Detail & Related papers (2021-05-10T23:00:02Z)
- Agnostic Proper Learning of Halfspaces under Gaussian Marginals [56.01192577666607]
We study the problem of agnostically learning halfspaces under the Gaussian distribution.
Our main result is the first proper learning algorithm for this problem (the agnostic guarantee is recalled below).
arXiv Detail & Related papers (2021-02-10T18:40:44Z)
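For context on "agnostic" and "proper" in this and the following entries, the standard agnostic PAC goal (a textbook statement, not quoted from these abstracts) is to output, from i.i.d. samples over an arbitrary distribution $D$, a hypothesis $h$ with
$$ \mathrm{err}_D(h) \;=\; \Pr_{(x,y)\sim D}[h(x)\neq y] \;\le\; \min_{h^\ast\in\mathcal{H}} \mathrm{err}_D(h^\ast) + \epsilon \;=\; \mathsf{OPT} + \epsilon, $$
where $\mathcal{H}$ is the halfspace class; a proper learner must in addition output some $h \in \mathcal{H}$.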
- Hardness of Learning Halfspaces with Massart Noise [56.98280399449707]
We study the complexity of PAC learning halfspaces in the presence of Massart (bounded) noise.
We show that there is an exponential gap between the information-theoretically optimal error and the best error achievable by a statistical query (SQ) algorithm; the Massart and Tsybakov noise models are recalled below.
arXiv Detail & Related papers (2020-12-17T16:43:11Z)
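As referenced above, the two noise models recurring in the entries below, in their standard forms (the usual definitions from this literature, with $\eta(x)$ the probability that the label of $x$ is flipped, not quotes from any single abstract):
- Massart (bounded) noise: $\eta(x) \le \eta < 1/2$ for every $x$.
- Tsybakov noise: $\Pr_{x \sim D_x}[\eta(x) \ge 1/2 - t] \le A\, t^{\alpha/(1-\alpha)}$ for all $t > 0$, with constants $A \ge 1$ and $\alpha \in (0, 1)$.
Massart noise is the benign extreme; Tsybakov noise lets $\eta(x)$ approach $1/2$ on a controlled fraction of the space.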
- A Polynomial Time Algorithm for Learning Halfspaces with Tsybakov Noise [55.45544638410858]
We study the problem of PAC learning homogeneous halfspaces in the presence of Tsybakov noise.
Our algorithm learns the true halfspace within any accuracy $\epsilon$.
arXiv Detail & Related papers (2020-10-04T22:19:06Z)
- Learning Halfspaces with Tsybakov Noise [50.659479930171585]
We study the learnability of halfspaces in the presence of Tsybakov noise.
We give an algorithm that achieves misclassification error $\epsilon$ with respect to the true halfspace.
arXiv Detail & Related papers (2020-06-11T14:25:02Z)
- Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability [39.01599245403753]
In particular, we study the problem of learning halfspaces under Massart noise with rate $\eta$.
We show that any SQ algorithm requires super-polynomially many queries to achieve $\mathsf{OPT} + \epsilon$.
We also study our algorithm for learning halfspaces under Massart noise empirically and find that it exhibits some appealing fairness properties.
arXiv Detail & Related papers (2020-06-08T17:59:11Z)
- Learning Halfspaces with Massart Noise Under Structured Distributions [46.37328271312332]
We solve the problem of learning halfspaces with Massart noise in the distribution-specific PAC model.
We give the first efficient algorithm for this problem with respect to a broad family of distributions, including log-concave distributions (recalled below).
arXiv Detail & Related papers (2020-02-13T17:02:37Z)
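For reference, the standard notion invoked above (a textbook definition, not from the abstract): a distribution on $\mathbb{R}^d$ is log-concave if it has a density of the form
$$ f(x) = e^{-g(x)}, \qquad g:\mathbb{R}^d \to \mathbb{R}\cup\{+\infty\} \text{ convex}, $$
with Gaussians and uniform distributions over convex bodies as canonical examples.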