QuantumBoost: A lazy, yet fast, quantum algorithm for learning with weak hypotheses
- URL: http://arxiv.org/abs/2510.05089v1
- Date: Mon, 06 Oct 2025 17:56:05 GMT
- Title: QuantumBoost: A lazy, yet fast, quantum algorithm for learning with weak hypotheses
- Authors: Amira Abbas, Yanlin Chen, Tuyen Nguyen, Ronald de Wolf
- Abstract summary: We introduce QuantumBoost, an algorithm for improving decision quality in machine learning. First, it uses a quantum algorithm to compute approximate Bregman projections faster. Second, it combines this with a lazy projection strategy. To our knowledge, QuantumBoost is the first algorithm, classical or quantum, to successfully adopt a lazy projection strategy in the context of boosting.
- Score: 2.072478807901501
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The technique of combining multiple votes to enhance the quality of a decision is the core of boosting algorithms in machine learning. In particular, boosting provably increases decision quality by combining multiple weak learners (hypotheses that are only slightly better than random guessing) into a single strong learner that classifies data well. There exist various versions of boosting algorithms, which we improve upon through the introduction of QuantumBoost. Inspired by classical work by Barak, Hardt and Kale, our QuantumBoost algorithm achieves the best known runtime among boosting methods through two innovations. First, it uses a quantum algorithm to compute approximate Bregman projections faster. Second, it combines this with a lazy projection strategy, a technique from convex optimization where projections are performed infrequently rather than in every iteration. To our knowledge, QuantumBoost is the first algorithm, classical or quantum, to successfully adopt a lazy projection strategy in the context of boosting.
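The weak-to-strong conversion at the heart of boosting can be illustrated with a minimal classical AdaBoost-style sketch over decision stumps. This is an illustrative sketch only, not the QuantumBoost algorithm (which replaces the projection step with quantum approximate Bregman projections); all function names here are our own:

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    """A decision stump: a weak hypothesis thresholding one feature."""
    return sign * np.where(X[:, feat] <= thresh, 1, -1)

def best_stump(X, y, w):
    """Exhaustively pick the stump with the lowest weighted error."""
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1, -1):
                err = np.sum(w[stump_predict(X, feat, thresh, sign) != y])
                if best is None or err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def adaboost(X, y, rounds=10):
    """Combine weak stumps into a strong weighted-majority classifier."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # distribution over examples
    ensemble = []
    for _ in range(rounds):
        err, feat, thresh, sign = best_stump(X, y, w)
        err = max(err, 1e-12)        # avoid division by zero
        if err >= 0.5:               # no weak learner with an edge left
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, feat, thresh, sign)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified points
        w /= w.sum()                     # renormalize the distribution
        ensemble.append((alpha, feat, thresh, sign))
    return ensemble

def predict(ensemble, X):
    """Strong learner: sign of the weighted vote of all weak stumps."""
    score = sum(a * stump_predict(X, f, t, s) for a, f, t, s in ensemble)
    return np.sign(score)
```

The reweighting step is the classical analogue of the distribution update that QuantumBoost performs lazily: instead of projecting the distribution back onto the feasible set every round, the lazy strategy defers that work.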
Related papers
- How to Boost Any Loss Function [63.573324901948716]
We show that any loss function can be optimized with boosting.
We also show that boosting can achieve a feat not yet known to be possible in the classical $0^{th}$-order setting.
arXiv Detail & Related papers (2024-07-02T14:08:23Z) - Quantum Subroutine for Variance Estimation: Algorithmic Design and Applications [80.04533958880862]
Quantum computing sets the foundation for new ways of designing algorithms.
New challenges arise concerning which field quantum speedup can be achieved.
Looking for the design of quantum subroutines that are more efficient than their classical counterpart poses solid pillars to new powerful quantum algorithms.
arXiv Detail & Related papers (2024-02-26T09:32:07Z) - Supervised Learning Guarantee for Quantum AdaBoost [7.473180902609473]
In the noisy intermediate-scale quantum (NISQ) era, variational quantum algorithms are constrained due to a limited number of qubits and the shallow depth of quantum circuits.
In this paper, we theoretically establish and numerically verify a learning guarantee for quantum adaptive boosting (AdaBoost).
Our work indicates that in the current NISQ era, introducing appropriate ensemble methods is particularly valuable in improving the performance of quantum machine learning algorithms.
arXiv Detail & Related papers (2024-02-04T07:18:44Z) - Quantum Clustering with k-Means: a Hybrid Approach [117.4705494502186]
We design, implement, and evaluate three hybrid quantum k-Means algorithms.
We exploit quantum phenomena to speed up the computation of distances.
We show that our hybrid quantum k-Means algorithms can be more efficient than the classical version.
arXiv Detail & Related papers (2022-12-13T16:04:16Z) - Efficient Quantum Agnostic Improper Learning of Decision Trees [0.18416014644193066]
We give a $\mathrm{poly}(n, t, \frac{1}{\varepsilon})$ quantum algorithm for agnostically learning size-$t$ decision trees under the uniform marginal distribution over instances.
Our algorithm is the first algorithm (classical or quantum) for learning decision trees in polynomial time without membership queries.
arXiv Detail & Related papers (2022-10-01T07:11:19Z) - Entanglement and coherence in Bernstein-Vazirani algorithm [58.720142291102135]
The Bernstein-Vazirani algorithm allows one to determine a bit string encoded into an oracle.
We analyze in detail the quantum resources in the Bernstein-Vazirani algorithm.
We show that in the absence of entanglement, the performance of the algorithm is directly related to the amount of quantum coherence in the initial state.
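The bit-string recovery described above can be checked with a small classical state-vector simulation (an illustrative sketch of the standard Hadamard-oracle-Hadamard circuit; on a real quantum device the hidden string is recovered with a single oracle query):

```python
import numpy as np

def bv_simulate(s, n):
    """Classically simulate Bernstein-Vazirani for a hidden n-bit string s."""
    dim = 1 << n
    # After the first Hadamard layer: uniform superposition over all x.
    amp = np.full(dim, 1.0 / np.sqrt(dim))
    # Phase oracle: amp[x] *= (-1)^(s.x mod 2), where s.x is the inner
    # product of bit strings, i.e. the parity of s AND x.
    for x in range(dim):
        if bin(s & x).count("1") % 2:
            amp[x] = -amp[x]
    # Second Hadamard layer = normalized Walsh-Hadamard transform.
    h = 1
    while h < dim:
        for i in range(0, dim, 2 * h):
            for j in range(i, i + h):
                a, b = amp[j], amp[j + h]
                amp[j], amp[j + h] = (a + b) / np.sqrt(2), (a - b) / np.sqrt(2)
        h *= 2
    # All amplitude now sits on the basis state |s>, so measuring yields s.
    return int(np.argmax(np.abs(amp)))
```

After the second Hadamard layer the amplitude of basis state y is proportional to the sum of (-1)^((s XOR y).x) over all x, which vanishes unless y = s; this is why a single query suffices.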
arXiv Detail & Related papers (2022-05-26T20:32:36Z) - Quantum Boosting using Domain-Partitioning Hypotheses [0.9264464791978363]
Boosting is an ensemble learning method that converts a weak learner into a strong learner in the PAC learning framework.
We show that Q-RealBoost provides a speedup over Q-AdaBoost in terms of both the bias of the weak learner and the time taken by the weak learner to learn the target concept class.
arXiv Detail & Related papers (2021-10-25T10:46:13Z) - Provably Faster Algorithms for Bilevel Optimization [54.83583213812667]
Bilevel optimization has been widely applied in many important machine learning applications.
We propose two new algorithms for bilevel optimization.
We show that both algorithms achieve a complexity of $\mathcal{O}(\epsilon^{-1.5})$, which outperforms all existing algorithms by an order of magnitude.
arXiv Detail & Related papers (2021-06-08T21:05:30Z) - Quantum Inspired Adaptive Boosting [0.0]
We show that the quantum ensemble method does not have an advantage over classical algorithms.
We propose methods inspired by combining the quantum ensemble method with adaptive boosting.
The algorithms were tested and found to be comparable to the AdaBoost algorithm on publicly available data sets.
arXiv Detail & Related papers (2021-02-01T16:33:14Z) - Improved Quantum Boosting [1.0660480034605238]
Arunachalam and Maity gave the first quantum improvement for boosting, by combining Freund and Schapire's AdaBoost algorithm with a quantum algorithm for approximate counting.
In this paper we give a substantially faster and simpler quantum boosting algorithm, based on Servedio's SmoothBoost algorithm.
arXiv Detail & Related papers (2020-09-17T15:16:42Z) - On the Dual Formulation of Boosting Algorithms [92.74617630106559]
We show that the Lagrange dual problems of AdaBoost, LogitBoost and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems.
By looking at the dual problems of these boosting algorithms, we show that the success of boosting can be understood in terms of maintaining a better margin distribution.
arXiv Detail & Related papers (2009-01-23T02:14:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.