Flatness-Aware Prompt Selection Improves Accuracy and Sample Efficiency
- URL: http://arxiv.org/abs/2305.10713v2
- Date: Mon, 23 Oct 2023 01:22:44 GMT
- Title: Flatness-Aware Prompt Selection Improves Accuracy and Sample Efficiency
- Authors: Lingfeng Shen, Weiting Tan, Boyuan Zheng, Daniel Khashabi
- Abstract summary: We introduce prompt flatness, a new metric to quantify the expected utility of a language prompt.
We show that combining prompt flatness with existing metrics improves both performance and sample efficiency.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With growing capabilities of large language models, prompting them has become
the dominant way to access them. This has motivated the development of
strategies for automatically selecting effective language prompts. In this
paper, we introduce prompt flatness, a new metric to quantify the expected
utility of a language prompt. This metric is inspired by flatness
regularization in statistical learning, which quantifies a model's robustness
to perturbations of its parameters. We provide theoretical foundations
for this metric and its relationship with other prompt selection metrics,
providing a comprehensive understanding of existing methods. Empirically, we
show that combining prompt flatness with existing metrics improves both
performance and sample efficiency. Our metric outperforms the previous prompt
selection metrics with an average increase of 5% in accuracy and 10% in Pearson
correlation across 6 classification benchmarks.
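
The abstract's core idea, flatness as robustness to parameter perturbations, can be illustrated with a minimal sketch. This is not the paper's actual estimator; the function name `prompt_flatness` and the parameters `sigma` and `n_samples` are hypothetical, and a toy quadratic loss stands in for a real language model's loss under a given prompt:

```python
import numpy as np

def prompt_flatness(loss_fn, params, sigma=0.01, n_samples=8, seed=0):
    """Estimate flatness as the mean loss increase under Gaussian
    parameter perturbations (a smaller increase means a flatter,
    presumably more robust, loss surface)."""
    rng = np.random.default_rng(seed)
    base = loss_fn(params)
    increases = []
    for _ in range(n_samples):
        noise = rng.normal(0.0, sigma, size=params.shape)
        increases.append(loss_fn(params + noise) - base)
    return float(np.mean(increases))

# Toy comparison: two quadratic losses that differ only in curvature.
# The sharper surface shows a larger loss increase under perturbation.
flat_score = prompt_flatness(lambda w: 1.0 * np.sum(w**2), np.zeros(4))
sharp_score = prompt_flatness(lambda w: 100.0 * np.sum(w**2), np.zeros(4))
```

In this toy setting the flatter surface scores lower, matching the intuition that a flatness score could be combined with an accuracy-style metric to rank prompts.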