Comparing quantum and classical Monte Carlo algorithms for estimating Betti numbers of clique complexes
- URL: http://arxiv.org/abs/2408.16934v1
- Date: Thu, 29 Aug 2024 22:35:50 GMT
- Title: Comparing quantum and classical Monte Carlo algorithms for estimating Betti numbers of clique complexes
- Authors: Ismail Yunus Akhalwaya, Ahmed Bhayat, Adam Connolly, Steven Herbert, Lior Horesh, Julien Sorci, Shashanka Ubaru,
- Abstract summary: Several quantum and classical Monte Carlo algorithms for Betti Number Estimation (BNE) on clique complexes have recently been proposed.
We review these algorithms, emphasising their common Monte Carlo structure within a new modular framework.
By recombining the different modules, we create a new quantum algorithm with an exponentially improved dependence in the sample complexity.
- Score: 6.713479655907349
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Several quantum and classical Monte Carlo algorithms for Betti Number Estimation (BNE) on clique complexes have recently been proposed, though it is unclear how their performances compare. We review these algorithms, emphasising their common Monte Carlo structure within a new modular framework. This framework allows us to directly compare these algorithms by calculating upper bounds on the minimum number of samples needed for convergence. By recombining the different modules, we create a new quantum algorithm with an exponentially improved dependence in the sample complexity. We run classical simulations to verify convergence within the theoretical bounds and observe the predicted exponential separation, even though empirical convergence occurs substantially earlier than the conservative theoretical bounds suggest.
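The shared Monte Carlo structure the abstract refers to reduces, at its core, to estimating the mean of a bounded random variable from independent samples, with the sample count set by a concentration bound. The following sketch is illustrative only (it is not any of the paper's BNE algorithms): it uses the standard Hoeffding bound to pick a sample count for an additive-error mean estimate, with the Bernoulli example entirely hypothetical.

```python
import math
import random

def hoeffding_samples(epsilon, delta):
    """Samples sufficient to estimate the mean of a [0, 1]-bounded
    random variable to additive error epsilon with failure
    probability at most delta, via the Hoeffding bound."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

def monte_carlo_mean(sample, n):
    """Average n independent draws from `sample`."""
    return sum(sample() for _ in range(n)) / n

# Toy example: estimate the mean of a Bernoulli(0.3) variable.
random.seed(0)
n = hoeffding_samples(epsilon=0.05, delta=0.01)
estimate = monte_carlo_mean(lambda: 1.0 if random.random() < 0.3 else 0.0, n)
```

Upper bounds of this form are worst-case, which is consistent with the paper's observation that empirical convergence can occur well before the theoretical sample count is reached.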
Related papers
- Quartic quantum speedups for community detection [84.14713515477784]
We develop a quantum algorithm for hypergraph community detection that achieves a quartic quantum speedup. Our algorithm is based on the Kikuchi method, which we extend beyond previously considered problems such as PCA and $p$XORSAT.
arXiv Detail & Related papers (2025-10-09T17:35:17Z) - HARLI CQUINN: Higher Adjusted Randomness with Linear In Complexity QUantum INspired Networks for K-Means [0.0]
We show quantum performance on par with, and above, the classical k-means algorithm. We present benchmarks of its accuracy for test cases of both well-known and experimental datasets.
arXiv Detail & Related papers (2025-09-23T02:32:13Z) - Analyzing and improving a classical Betti number estimation algorithm [0.0]
A classical algorithm for estimating the normalized Betti number of an arbitrary simplicial complex was recently proposed. Motivated by a quantum algorithm with a similar Monte Carlo structure and improved sample complexity, we give a more in-depth analysis of the sample complexity of this classical algorithm. We show that for certain models our improvement almost always leads to a reduced sample complexity, and also identify separate regimes where the sample complexity of both algorithms is exponential.
arXiv Detail & Related papers (2025-09-19T17:29:57Z) - New aspects of quantum topological data analysis: Betti number estimation, and testing and tracking of homology and cohomology classes [3.2268950104324965]
We present new quantum algorithms for estimating homological invariants, specifically Betti and persistent Betti numbers. Our approach efficiently constructs block-encodings of (persistent) Laplacians, enabling estimation via exponential-rank methods. These results open a new direction in quantum topological data analysis and demonstrate provable quantum advantages in computing topological invariants.
arXiv Detail & Related papers (2025-06-02T08:43:58Z) - Non-linear Quantum Monte Carlo [1.237454174824584]
Quantum computing provides a quadratic speedup over classical Monte Carlo methods for mean estimation.
We propose a quantum-inside-quantum Monte Carlo algorithm that achieves such a speedup for a broad class of non-linear estimation problems.
arXiv Detail & Related papers (2025-02-07T17:13:27Z) - An Efficient Algorithm for Clustered Multi-Task Compressive Sensing [60.70532293880842]
Clustered multi-task compressive sensing is a hierarchical model that solves multiple compressive sensing tasks.
The existing inference algorithm for this model is computationally expensive and does not scale well in high dimensions.
We propose a new algorithm that substantially accelerates model inference by avoiding the need to explicitly compute these covariance matrices.
arXiv Detail & Related papers (2023-09-30T15:57:14Z) - Approximation of group explainers with coalition structure using Monte Carlo sampling on the product space of coalitions and features [0.11184789007828977]
We focus on a wide class of linear game values, as well as coalitional values, for the marginal game based on a given ML model and predictor vector.
We design a novel Monte Carlo sampling algorithm that estimates them at a reduced complexity that depends linearly on the size of the background dataset.
arXiv Detail & Related papers (2023-03-17T19:17:06Z) - Quantum Clustering with k-Means: a Hybrid Approach [117.4705494502186]
We design, implement, and evaluate three hybrid quantum k-Means algorithms.
We exploit quantum phenomena to speed up the computation of distances.
We show that our hybrid quantum k-Means algorithms can be more efficient than the classical version.
arXiv Detail & Related papers (2022-12-13T16:04:16Z) - Quantum Adversarial Learning in Emulation of Monte-Carlo Methods for
Max-cut Approximation: QAOA is not optimal [0.0]
We apply notions of emulation to Variational Quantum Annealing and the Quantum Approximate Optimization Algorithm (QAOA).
Our Variational Quantum Annealing schedule is based on a novel parameterization that can be optimized in a similar gradient-free way as QAOA, using the same physical ingredients.
To compare the performance of ansatz types, we have developed statistical notions of Monte-Carlo methods.
arXiv Detail & Related papers (2022-11-24T19:02:50Z) - Optimal Algorithms for Stochastic Complementary Composite Minimization [55.26935605535377]
Inspired by regularization techniques in statistics and machine learning, we study complementary composite minimization.
We provide novel excess risk bounds, both in expectation and with high probability.
Our algorithms are nearly optimal, which we prove via novel lower complexity bounds for this class of problems.
arXiv Detail & Related papers (2022-11-03T12:40:24Z) - Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z) - Quantum algorithm for stochastic optimal stopping problems with
applications in finance [60.54699116238087]
The famous least squares Monte Carlo (LSM) algorithm combines linear least square regression with Monte Carlo simulation to approximately solve problems in optimal stopping theory.
We propose a quantum LSM based on quantum access to a process, on quantum circuits for computing the optimal stopping times, and on quantum techniques for Monte Carlo.
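The classical LSM algorithm mentioned here is the standard Longstaff-Schwartz method: simulate price paths, then step backwards in time, regressing the discounted continuation value on the current spot price to decide when to exercise. The sketch below is a toy illustration of that classical baseline only (parameters, polynomial basis, and the American-put example are all hypothetical choices, not from the paper).

```python
import numpy as np

def lsm_american_put(s0=100.0, strike=100.0, r=0.05, sigma=0.2,
                     T=1.0, steps=50, paths=10000, seed=0):
    """Toy Longstaff-Schwartz least squares Monte Carlo pricer
    for an American put under geometric Brownian motion."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    # Simulate GBM paths at times dt, 2*dt, ..., T.
    z = rng.standard_normal((paths, steps))
    log_s = np.log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                   + sigma * np.sqrt(dt) * z, axis=1)
    s = np.exp(log_s)
    cash = np.maximum(strike - s[:, -1], 0.0)  # payoff at maturity
    # Backward induction: regress continuation value on the spot price.
    for t in range(steps - 2, -1, -1):
        cash *= np.exp(-r * dt)                # discount one step
        itm = strike - s[:, t] > 0             # in-the-money paths only
        if itm.sum() > 3:
            coeffs = np.polyfit(s[itm, t], cash[itm], 2)
            continuation = np.polyval(coeffs, s[itm, t])
            exercise = strike - s[itm, t]
            cash[itm] = np.where(exercise > continuation, exercise, cash[itm])
    return np.exp(-r * dt) * cash.mean()

price = lsm_american_put()
```

The quantum LSM in the paper replaces the path simulation and the Monte Carlo averaging with quantum subroutines; the regression-based backward induction above is the classical skeleton it builds on.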
arXiv Detail & Related papers (2021-11-30T12:21:41Z) - Finite-Bit Quantization For Distributed Algorithms With Linear
Convergence [6.293059137498172]
We study distributed algorithms for (strongly convex) composite optimization problems over mesh networks subject to quantized communications.
A new quantizer coupled with a communication-efficient encoding scheme is proposed, which efficiently implements the Biased Compression (BC-)rule.
Numerical results validate our theoretical findings and show that distributed algorithms equipped with the proposed quantizer have more favorable communication complexity than algorithms using existing quantization rules.
arXiv Detail & Related papers (2021-07-23T15:31:31Z) - Sampling Permutations for Shapley Value Estimation [1.0323063834827415]
Game-theoretic attribution techniques based on Shapley values are used extensively to interpret machine learning models.
As the computation of Shapley values can be expressed as a summation over a set of permutations, a common approach is to sample a subset of these permutations for approximation.
Unfortunately, standard Monte Carlo sampling methods can exhibit slow convergence, and more sophisticated quasi-Monte Carlo methods are not well defined on the space of permutations.
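The permutation-sampling estimator this entry describes is standard: draw random orderings of the players and average each player's marginal contribution when it joins the coalition of its predecessors. A minimal sketch (the three-player additive game is a hypothetical example; for an additive game the Shapley value of each player is exactly its weight, which makes the estimator easy to sanity-check):

```python
import random

def shapley_permutation_mc(value, players, n_samples, seed=0):
    """Estimate Shapley values by averaging marginal contributions
    over uniformly sampled player orderings (plain Monte Carlo)."""
    rng = random.Random(seed)
    players = list(players)
    phi = {p: 0.0 for p in players}
    for _ in range(n_samples):
        order = players[:]
        rng.shuffle(order)
        coalition = set()
        prev = value(coalition)
        for p in order:
            coalition.add(p)
            cur = value(coalition)
            phi[p] += cur - prev  # marginal contribution of p
            prev = cur
    return {p: phi[p] / n_samples for p in players}

# Hypothetical additive game: v(S) = sum of weights in S.
weights = {"a": 1.0, "b": 2.0, "c": 3.0}
v = lambda S: sum(weights[p] for p in S)
est = shapley_permutation_mc(v, weights, n_samples=200)
```

For non-additive games the estimate is only approximate, and the slow $O(1/\sqrt{n})$ convergence of this plain estimator is exactly the limitation the paper addresses.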
arXiv Detail & Related papers (2021-04-25T16:44:18Z) - Quantum-Inspired Algorithms from Randomized Numerical Linear Algebra [53.46106569419296]
We create classical (non-quantum) dynamic data structures supporting queries for recommender systems and least-squares regression.
We argue that the previous quantum-inspired algorithms for these problems are doing leverage or ridge-leverage score sampling in disguise.
arXiv Detail & Related papers (2020-11-09T01:13:07Z) - Subgroup-based Rank-1 Lattice Quasi-Monte Carlo [51.877197074344586]
We propose a simple closed-form rank-1 lattice construction method based on group theory.
Our method achieves superior approximation performance on benchmark integration test problems and kernel approximation problems.
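A rank-1 lattice rule, once a generating vector $z$ is chosen, produces the deterministic point set $x_i = \{i\,z/n\}$ in $[0,1)^d$. The sketch below shows only that generic construction and a QMC integral estimate on it; the generating vector and integrand are hand-picked illustrations, not the paper's group-theoretic closed-form construction.

```python
def rank1_lattice(n, z):
    """n-point rank-1 lattice in [0,1)^d from generating vector z:
    the i-th point is the fractional part of i * z / n."""
    return [[(i * zj / n) % 1.0 for zj in z] for i in range(n)]

# QMC estimate of the integral of f(x, y) = x * y over the unit
# square (exact value 1/4), using a hand-picked generating vector.
n, z = 64, (1, 19)
points = rank1_lattice(n, z)
qmc_estimate = sum(x * y for x, y in points) / n
```

The quality of such a rule depends entirely on the choice of $z$; the paper's contribution is a closed-form, group-theory-based way of making that choice.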
arXiv Detail & Related papers (2020-10-29T03:42:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.