On the Depth between Beam Search and Exhaustive Search for Text Generation
- URL: http://arxiv.org/abs/2308.13696v1
- Date: Fri, 25 Aug 2023 22:57:53 GMT
- Title: On the Depth between Beam Search and Exhaustive Search for Text Generation
- Authors: Yuu Jinnai, Tetsuro Morimura, Ukyo Honda
- Abstract summary: Beam search is limited in both search width and depth, whereas exhaustive search is a global search that has no such limitations.
We introduce Lookahead Beam Search (LBS), a multi-step lookahead search that optimizes the objective considering a fixed number of future steps.
LBS outperforms beam search overall on machine translation tasks.
- Score: 3.726173629675064
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Beam search and exhaustive search are two extreme ends of text decoding
algorithms with respect to the search depth. Beam search is limited in both
search width and depth, whereas exhaustive search is a global search that has
no such limitations. Surprisingly, beam search is not only computationally
cheaper but also performs better than exhaustive search despite its higher
search error. Plenty of research has investigated a range of beam widths, from
small to large, and reported that a beam width that is neither too large nor
too small is desirable. However, in terms of search depth, only the two extreme
ends, beam search and exhaustive search are studied intensively. In this paper,
we examine a range of search depths between the two extremes to discover the
desirable search depth. To this end, we introduce Lookahead Beam Search (LBS),
a multi-step lookahead search that optimizes the objective considering a fixed
number of future steps. Beam search and exhaustive search are special cases of
LBS where the lookahead depth is set to $0$ and $\infty$, respectively. We
empirically evaluate the performance of LBS and find that it outperforms beam
search overall on machine translation tasks. The result suggests there is room
for improvement in beam search by searching deeper. Inspired by the analysis,
we propose Lookbehind Heuristic Beam Search, a computationally feasible search
algorithm that heuristically simulates LBS with 1-step lookahead. The empirical
results show that the proposed method outperforms vanilla beam search on
machine translation and text summarization tasks.
Related papers
- Rectangle Search: An Anytime Beam Search (Extended Version) [9.59799149404787]
Anytime search algorithms try to find a (potentially suboptimal) solution as quickly as possible.
We propose a new algorithm, rectangle search, that is instead based on beam search.
arXiv Detail & Related papers (2023-12-19T19:50:45Z)
- Hybrid Search for Efficient Planning with Completeness Guarantees [63.02803974708516]
We propose an efficient approach to augment a subgoal search method to achieve completeness in discrete action spaces.
This solution achieves the best of both worlds: the practical efficiency of high-level search and the completeness of low-level search.
arXiv Detail & Related papers (2023-10-19T15:16:43Z)
- A Call for Clarity in Beam Search: How It Works and When It Stops [125.55175954381991]
We introduce a patience factor, a simple modification to this beam decoding implementation, that generalizes the stopping criterion and provides flexibility to the depth of search.
Empirical results demonstrate that adjusting this patience factor improves decoding performance of strong pretrained models on news text summarization and machine translation over diverse language pairs.
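The patience factor itself is easy to state: beam search commonly stops once `beam_width` finished hypotheses have been collected, and the factor simply scales that threshold. A minimal sketch of such a stopping check (the function name is illustrative, not from the paper's code):

```python
import math

def should_stop(num_finished, beam_width, patience=1.0):
    # patience == 1.0 recovers the common stopping criterion; values
    # above 1.0 keep the search running deeper before terminating.
    return num_finished >= math.ceil(patience * beam_width)
```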
arXiv Detail & Related papers (2022-04-11T22:03:44Z)
- Beam Search: Faster and Monotonic [15.20931404997906]
We show how to make beam search monotonic; that is, we provide a new variant that guarantees non-increasing solution cost as the beam width is increased.
We also show how using distance-to-go estimates can allow beam search to find better solutions more quickly in domains with non-uniform costs.
arXiv Detail & Related papers (2022-04-06T16:40:13Z)
- CrossBeam: Learning to Search in Bottom-Up Program Synthesis [51.37514793318815]
We propose training a neural model to learn a hands-on search policy for bottom-up synthesis.
Our approach, called CrossBeam, uses the neural model to choose how to combine previously-explored programs into new programs.
We observe that CrossBeam learns to search efficiently, exploring much smaller portions of the program space compared to the state-of-the-art.
arXiv Detail & Related papers (2022-03-20T04:41:05Z)
- Exploring Complicated Search Spaces with Interleaving-Free Sampling [127.07551427957362]
In this paper, we build the search algorithm upon a complicated search space with long-distance connections.
We present a simple yet effective algorithm named IF-NAS, where we perform a periodic sampling strategy to construct different sub-networks.
In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by a significant margin.
arXiv Detail & Related papers (2021-12-05T06:42:48Z)
- What Do You Get When You Cross Beam Search with Nucleus Sampling? [23.5360917879657]
We create two deterministic nucleus search algorithms for natural language generation.
Experiments on machine translation and summarization benchmarks show that both algorithms reach the same performance levels as standard beam search.
arXiv Detail & Related papers (2021-07-20T18:59:14Z)
- If beam search is the answer, what was the question? [78.71330480725668]
We find that beam search enforces uniform information density in text, a property motivated by cognitive science.
We suggest a set of decoding objectives that explicitly enforce this property and find that exact decoding with these objectives alleviates the problems encountered when decoding poorly calibrated language generation models.
arXiv Detail & Related papers (2020-10-06T11:57:03Z)
- Searching for a Search Method: Benchmarking Search Algorithms for Generating NLP Adversarial Examples [10.993342896547691]
We study the behavior of several black-box search algorithms used for generating adversarial examples for natural language processing (NLP) tasks.
We perform a fine-grained analysis of three elements relevant to search: search algorithm, search space, and search budget.
arXiv Detail & Related papers (2020-09-09T17:04:42Z)
- Best-First Beam Search [78.71330480725668]
We show that the standard implementation of beam search can be made up to 10x faster in practice.
We propose a memory-reduced variant of Best-First Beam Search, which has a similar beneficial search bias in terms of downstream performance.
arXiv Detail & Related papers (2020-07-08T05:56:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.