Grover's Algorithm with Diffusion and Amplitude Steering
- URL: http://arxiv.org/abs/2110.11163v1
- Date: Thu, 21 Oct 2021 14:15:32 GMT
- Title: Grover's Algorithm with Diffusion and Amplitude Steering
- Authors: Robert L Singleton Jr, Michael L Rogers, and David L Ostby
- Abstract summary: We present a generalization of Grover's algorithm that searches an arbitrary subspace of the multi-dimensional Hilbert space.
We also outline a generalized Grover's algorithm that takes into account higher level correlations that could exist between database elements.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We review the basic theoretical underpinnings of the Grover algorithm,
providing a rigorous and well motivated derivation. We then present a
generalization of Grover's algorithm that searches an arbitrary subspace of the
multi-dimensional Hilbert space using a diffusion operation and an amplitude
amplification procedure that has been biased by unitary steering
operators. We also outline a generalized Grover's algorithm that takes into
account higher level correlations that could exist between database elements.
In the traditional Grover algorithm, the Hadamard gate selects a uniform sample
of computational basis elements when performing the phase selection and
diffusion. In contrast, steered operators bias the selection process, thereby
providing more flexibility in selecting the target state. Our method is a
generalization of the recently proposed pattern matching algorithm of Hiroyuki
et al.
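The abstract's central objects, the oracle's phase selection and the diffusion about a reference state, can be sketched in a few lines of numpy. The snippet below is an illustrative statevector simulation, not the paper's construction: a uniform reference state gives textbook Grover, while passing a non-uniform `s` loosely mimics the idea of biasing the selection toward particular basis elements (the paper's unitary steering operators are a more general mechanism).

```python
import numpy as np

def grover_search(n, marked, iterations, s=None):
    """Run Grover iterations on an n-element search space.

    s is the reference state used both for initialization and for the
    diffusion operator 2|s><s| - I. A uniform s reproduces the standard
    algorithm; a non-uniform s crudely illustrates biased selection
    (illustrative sketch only, not the paper's steering operators).
    """
    if s is None:
        s = np.full(n, 1 / np.sqrt(n))        # uniform superposition (Hadamards)
    psi = s.astype(complex).copy()
    oracle = np.ones(n)
    oracle[marked] = -1.0                     # phase-flip the marked element
    for _ in range(iterations):
        psi = oracle * psi                    # oracle: sign flip on target
        psi = 2 * s * (s.conj() @ psi) - psi  # diffusion about |s>
    return np.abs(psi) ** 2                   # measurement probabilities

N = 16
probs = grover_search(N, marked=5, iterations=3)  # ~ floor(pi/4 * sqrt(N)) steps
print(probs[5])
```

After roughly pi/4 * sqrt(N) iterations the marked element's probability is close to 1, which is the quadratic speedup over the N/2 classical queries.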
Related papers
- Distributed exact multi-objective quantum search algorithm [9.09986332387569]
Grover's algorithm achieves a quadratic speedup in multi-objective search.
The iterated operator in Grover's algorithm is a key element and plays an important role in amplitude amplification.
arXiv Detail & Related papers (2024-09-06T06:16:53Z)
- Bregman-divergence-based Arimoto-Blahut algorithm [53.64687146666141]
We generalize the Arimoto-Blahut algorithm to a general function defined over Bregman-divergence system.
We propose a convex-optimization-free algorithm that can be applied to classical and quantum rate-distortion theory.
arXiv Detail & Related papers (2024-08-10T06:16:24Z)
- Amplitude Amplification for Optimization via Subdivided Phase Oracle [0.9176056742068814]
We propose an algorithm using a modified variant of amplitude amplification to solve optimization problems.
We show via numerical simulation that for normal, skew normal, and exponential distribution of objective values, the algorithm can be used to amplify the probability of measuring the optimal solution.
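The idea of a subdivided phase oracle can be made concrete with a toy numpy sketch. This is an illustrative simplification, not the paper's exact construction: instead of a binary oracle that flips only marked states, each basis element receives a phase proportional to its (made-up) objective value, and a single uniform diffusion step then concentrates probability on the optimum.

```python
import numpy as np

# Illustrative sketch: phase proportional to objective value, followed
# by the usual diffusion about the uniform state. The objective values
# below are invented for demonstration.
f = np.array([0.1, 0.2, 0.1, 0.3, 0.2, 0.1, 0.2, 1.0])
N = len(f)
u = np.full(N, 1 / np.sqrt(N))                 # uniform superposition
psi = u.astype(complex)
psi = np.exp(1j * np.pi * f / f.max()) * psi   # subdivided phase: pi * f/f_max
psi = 2 * u * (u @ psi) - psi                  # diffusion about |u>
probs = np.abs(psi) ** 2
best = int(np.argmax(f))
print(probs[best] > 1 / N)                     # optimum amplified above uniform
```

Because the maximizer picks up the full pi phase while low-objective elements barely rotate, the diffusion step interferes constructively at the optimum, raising its measurement probability well above the uniform 1/N.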
arXiv Detail & Related papers (2022-05-02T01:14:27Z)
- Grover search revisited; application to image pattern matching [0.8367938108534343]
We propose a quantum algorithm that approximately executes the entire Grover database search or pattern matching algorithm.
The key idea is to use the recently proposed approximate amplitude encoding method on a shallow quantum circuit.
arXiv Detail & Related papers (2021-08-24T17:30:41Z)
- Determinantal Beam Search [75.84501052642361]
Beam search is a go-to strategy for decoding neural sequence models.
In use-cases that call for multiple solutions, a diverse or representative set is often desired.
By posing iterations in beam search as a series of subdeterminant problems, we can turn the algorithm into a diverse subset selection process.
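The "subdeterminant problem" flavor of diverse subset selection can be sketched with a greedy determinant maximizer over a similarity kernel. This is a minimal illustration of the underlying selection principle, not the paper's decoding procedure; the kernel values below are invented.

```python
import numpy as np

def greedy_subdet(K, k):
    """Greedily pick k items maximizing det(K[S, S]) over a similarity
    kernel K -- the diverse-subset selection that determinantal methods
    build on (illustrative sketch)."""
    selected = []
    for _ in range(k):
        best, best_det = None, -np.inf
        for i in range(len(K)):
            if i in selected:
                continue
            S = selected + [i]
            d = np.linalg.det(K[np.ix_(S, S)])
            if d > best_det:
                best, best_det = i, d
        selected.append(best)
    return selected

# Items 0 and 1 are near-duplicates; item 2 is dissimilar:
K = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
print(greedy_subdet(K, 2))  # picks two dissimilar items
```

The determinant of a submatrix shrinks when rows are nearly parallel, so maximizing it automatically penalizes redundant picks, which is why it serves as a diversity objective.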
arXiv Detail & Related papers (2021-06-14T13:01:46Z)
- Quantum Search with Prior Knowledge [15.384459603233978]
We propose a new generalization of Grover's search algorithm which performs better than the standard Grover algorithm on average under this setting.
We prove that our new algorithm achieves the optimal expected success probability of finding the solution if the number of queries is fixed.
arXiv Detail & Related papers (2020-09-18T09:50:33Z)
- Accelerated Message Passing for Entropy-Regularized MAP Inference [89.15658822319928]
Maximum a posteriori (MAP) inference in discrete-valued random fields is a fundamental problem in machine learning.
Due to the difficulty of this problem, linear programming (LP) relaxations are commonly used to derive specialized message passing algorithms.
We present randomized methods for accelerating these algorithms by leveraging techniques that underlie classical accelerated gradient.
arXiv Detail & Related papers (2020-07-01T18:43:32Z)
- An Empirical Process Approach to the Union Bound: Practical Algorithms for Combinatorial and Linear Bandits [34.06611065493047]
This paper proposes near-optimal algorithms for the pure-exploration linear bandit problem in the fixed confidence and fixed budget settings.
We provide an algorithm whose sample complexity scales with the geometry of the instance and avoids an explicit union bound over the number of arms.
We also propose the first algorithm for linear bandits in the fixed budget setting.
arXiv Detail & Related papers (2020-06-21T00:56:33Z)
- Nearly Linear Row Sampling Algorithm for Quantile Regression [54.75919082407094]
We give a row sampling algorithm for the quantile loss function with sample complexity nearly linear in the dimensionality of the data.
Based upon our row sampling algorithm, we give the fastest known algorithm for quantile regression and a graph sparsification algorithm for balanced directed graphs.
arXiv Detail & Related papers (2020-06-15T13:40:07Z)
- Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
arXiv Detail & Related papers (2020-02-21T17:45:32Z)
- Optimal Iterative Sketching with the Subsampled Randomized Hadamard Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction.
arXiv Detail & Related papers (2020-02-03T16:17:50Z)
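Iterative sketching for least squares can be sketched in numpy. The toy below uses a plain Gaussian sketch in place of the subsampled randomized Hadamard transform (an assumption made for brevity; the data and sizes are invented): the Gram matrix of the sketched design S A preconditions cheap exact-gradient steps, and the iterates converge to the least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 10, 200            # rows, cols, sketch size
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Iterative (Hessian) sketching: approximate A^T A by (SA)^T (SA) and
# use it to precondition exact gradient steps. A Gaussian sketch stands
# in here for the subsampled randomized Hadamard transform.
S = rng.standard_normal((m, n)) / np.sqrt(m)
SA = S @ A
H = SA.T @ SA                      # sketched Hessian, d x d

x = np.zeros(d)
for _ in range(20):
    grad = A.T @ (A @ x - b)       # exact gradient, O(nd) per step
    x = x - np.linalg.solve(H, grad)

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x - x_star))  # small: iterates reach the LS solution
```

Each iteration contracts the error by roughly the relative spectral distance between H and A^T A, so a sketch size of a few times d already gives fast linear convergence without ever factoring the full n x d system.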
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.