A Systematic Characterization of Sampling Algorithms for Open-ended
Language Generation
- URL: http://arxiv.org/abs/2009.07243v1
- Date: Tue, 15 Sep 2020 17:28:42 GMT
- Title: A Systematic Characterization of Sampling Algorithms for Open-ended
Language Generation
- Authors: Moin Nadeem, Tianxing He, Kyunghyun Cho, James Glass
- Abstract summary: We study the widely adopted ancestral sampling algorithms for auto-regressive language models.
We identify three key properties that are shared among them: entropy reduction, order preservation, and slope preservation.
We find that the set of sampling algorithms that satisfies these properties performs on par with the existing sampling algorithms.
- Score: 71.31905141672529
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work studies the widely adopted ancestral sampling algorithms for
auto-regressive language models, which have not been widely studied in the literature.
We use the quality-diversity (Q-D) trade-off to investigate three popular
sampling algorithms (top-k, nucleus and tempered sampling). We focus on the
task of open-ended language generation. We first show that the existing
sampling algorithms have similar performance. After carefully inspecting the
transformations defined by different sampling algorithms, we identify three key
properties that are shared among them: entropy reduction, order preservation,
and slope preservation. To validate the importance of the identified
properties, we design two sets of new sampling algorithms: one set in which
each algorithm satisfies all three properties, and one set in which each
algorithm violates at least one of the properties. We compare their performance
with existing sampling algorithms, and find that violating the identified
properties could lead to drastic performance degradation, as measured by the
Q-D trade-off. On the other hand, we find that the set of sampling algorithms
that satisfies these properties performs on par with the existing sampling
algorithms. Our data and code are available at
https://github.com/moinnadeem/characterizing-sampling-algorithms
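For illustration, here is a minimal sketch (independent of the linked repository) of how the three sampling transformations compared above reshape a next-token probability distribution; the function names and hyperparameter values (k=40, p=0.9, t=0.7) are chosen only for this example.

# Minimal sketch (not the authors' implementation) of the three ancestral
# sampling transformations studied in the paper, applied to a next-token
# probability distribution. Hyperparameter values are illustrative.
import numpy as np

def top_k(probs: np.ndarray, k: int = 40) -> np.ndarray:
    """Keep the k most probable tokens and renormalize."""
    out = np.zeros_like(probs)
    idx = np.argsort(probs)[-k:]          # indices of the k largest probabilities
    out[idx] = probs[idx]
    return out / out.sum()

def nucleus(probs: np.ndarray, p: float = 0.9) -> np.ndarray:
    """Keep the smallest set of tokens whose cumulative probability exceeds p."""
    order = np.argsort(probs)[::-1]       # tokens sorted by descending probability
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1  # number of tokens kept
    out = np.zeros_like(probs)
    kept = order[:cutoff]
    out[kept] = probs[kept]
    return out / out.sum()

def tempered(probs: np.ndarray, t: float = 0.7) -> np.ndarray:
    """Rescale log-probabilities by 1/t and renormalize (t < 1 sharpens)."""
    logits = np.log(probs) / t
    exp = np.exp(logits - logits.max())   # subtract max for numerical stability
    return exp / exp.sum()

# Example: transform a toy distribution and draw one token from each variant.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(100))
for transform in (top_k, nucleus, tempered):
    q = transform(probs)
    token = rng.choice(len(q), p=q)

Each of these transformations reduces the entropy of the original distribution and preserves the relative ordering of token probabilities, two of the three shared properties identified in the abstract.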
Related papers
- A Three-Stage Algorithm for the Closest String Problem on Artificial and Real Gene Sequences [39.58317527488534]
The Closest String Problem is an NP-hard problem that seeks a string whose maximum distance to every string in a given set is minimized.
In this paper, we introduce a three-stage algorithm: first, we apply a novel alphabet pruning method to reduce the search space and effectively identify promising search regions.
Second, a variant of beam search is employed to find a solution, using a newly developed guiding function based on an expected distance score of partial solutions.
arXiv Detail & Related papers (2024-07-17T21:26:27Z)
- From Decoding to Meta-Generation: Inference-time Algorithms for Large Language Models [63.188607839223046]
This survey focuses on the benefits of scaling compute during inference.
We explore three areas under a unified mathematical formalism: token-level generation algorithms, meta-generation algorithms, and efficient generation.
arXiv Detail & Related papers (2024-06-24T17:45:59Z)
- On Universally Optimal Algorithms for A/B Testing [49.429419538826444]
We study the problem of best-arm identification with fixed budget in multi-armed bandits with Bernoulli rewards.
For the problem with two arms, also known as the A/B testing problem, we prove that no algorithm dominates the algorithm sampling each arm equally: none matches it on every instance while strictly outperforming it on at least one.
arXiv Detail & Related papers (2023-08-23T08:38:53Z)
- Comparative study of subset selection methods for rapid prototyping of 3D object detection algorithms [0.0]
Prototyping 3D object detection algorithms is time-consuming and costly in terms of energy and environmental impact.
We present a comparison of three algorithms for selecting such a data subset: random sampling, random per-class sampling, and our proposed MONSPeC (a minimal sketch of per-class sampling appears after this list).
We provide empirical evidence for the superior effectiveness of random per-class sampling and MONSPeC over basic random sampling.
arXiv Detail & Related papers (2023-06-30T11:09:20Z)
- A Sequential Deep Learning Algorithm for Sampled Mixed-integer Optimisation Problems [0.3867363075280544]
We introduce and analyse two efficient algorithms for mixed-integer optimisation problems.
We show that both algorithms exhibit finite-time convergence towards the optimal solution.
We establish quantitatively the efficacy of these algorithms by means of three numerical tests.
arXiv Detail & Related papers (2023-01-25T17:10:52Z)
- Machine Learning for Online Algorithm Selection under Censored Feedback [71.6879432974126]
In online algorithm selection (OAS), instances of an algorithmic problem class are presented to an agent one after another, and the agent has to quickly select a presumably best algorithm from a fixed set of candidate algorithms.
For decision problems such as satisfiability (SAT), quality typically refers to the algorithm's runtime.
In this work, we revisit multi-armed bandit algorithms for OAS and discuss their capability of dealing with the problem.
We adapt them towards runtime-oriented losses, allowing for partially censored data while keeping a space- and time-complexity independent of the time horizon.
arXiv Detail & Related papers (2021-09-13T18:10:52Z)
- Adaptive Sampling for Best Policy Identification in Markov Decision Processes [79.4957965474334]
We investigate the problem of best-policy identification in discounted Markov Decision Processes (MDPs) when the learner has access to a generative model.
The advantages of state-of-the-art algorithms are discussed and illustrated.
arXiv Detail & Related papers (2020-09-28T15:22:24Z)
- A new heuristic algorithm for fast k-segmentation [0.0]
Exact and approximate methods for $k$-segmentation exist in the literature.
A novel algorithm is proposed in this paper to improve upon existing methods.
It is empirically found to provide accuracies competitive with exact methods at a fraction of the computational expense.
arXiv Detail & Related papers (2020-09-02T04:50:17Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
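As flagged in the subset-selection entry above, the following is a minimal sketch of random per-class subset sampling; the data layout (frames annotated with object-class labels), function names, and parameters are assumptions made for illustration, and MONSPeC itself is not reproduced here.

# Minimal sketch of random per-class subset sampling, referenced in the
# subset-selection entry above. The frame/label layout is an assumption
# made for illustration; MONSPeC is not reproduced here.
import random
from collections import defaultdict

def random_per_class_sample(frames, labels, per_class, seed=0):
    """Pick up to `per_class` frames for each object class, then deduplicate.

    frames: list of frame identifiers
    labels: list of sets; labels[i] holds the object classes present in frames[i]
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for frame, classes in zip(frames, labels):
        for c in classes:
            by_class[c].append(frame)
    chosen = set()
    for c, candidates in by_class.items():
        k = min(per_class, len(candidates))
        chosen.update(rng.sample(candidates, k))  # stratified pick within class c
    return sorted(chosen)

# Example: 6 frames with class annotations; request 2 frames per class.
frames = [0, 1, 2, 3, 4, 5]
labels = [{"car"}, {"car", "pedestrian"}, {"cyclist"}, {"car"}, {"pedestrian"}, {"cyclist"}]
subset = random_per_class_sample(frames, labels, per_class=2)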
This list is automatically generated from the titles and abstracts of the papers on this site.