Few-Shot Table-to-Text Generation with Prompt Planning and Knowledge
Memorization
- URL: http://arxiv.org/abs/2302.04415v3
- Date: Thu, 17 Aug 2023 01:51:48 GMT
- Title: Few-Shot Table-to-Text Generation with Prompt Planning and Knowledge
Memorization
- Authors: Zhixin Guo, Minxuan Yan, Jiexing Qi, Jianping Zhou, Ziwei He, Zhouhan
Lin, Guanjie Zheng and Xinbing Wang
- Abstract summary: We suggest a new framework: PromptMize, which targets table-to-text generation under few-shot settings.
The design of our framework consists of two aspects: a prompt planner and a knowledge adapter.
Our model achieves remarkable generation quality as judged by human and automatic evaluations.
- Score: 41.20314472839442
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Pre-trained language models (PLMs) have achieved remarkable advancement in
table-to-text generation tasks. However, the lack of labeled domain-specific
knowledge and the topology gap between tabular data and text make it difficult
for PLMs to yield faithful text. Low-resource generation likewise faces unique
challenges in this domain. Inspired by how humans describe tabular data with
prior knowledge, we suggest a new framework: PromptMize, which targets
table-to-text generation under few-shot settings. The design of our framework
consists of two aspects: a prompt planner and a knowledge adapter. The prompt
planner aims to generate a prompt signal that provides instance guidance for
PLMs to bridge the topology gap between tabular data and text. Moreover, the
knowledge adapter memorizes domain-specific knowledge from the unlabeled
corpus to supply essential information during generation. Extensive experiments
and analyses are conducted on three open-domain few-shot NLG datasets:
human, song, and book. Compared with previous state-of-the-art approaches, our
model achieves remarkable generation quality as judged by human
and automatic evaluations.
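As a rough illustration of the two components the abstract describes, here is a minimal Python sketch: a prompt planner that linearizes a table into an instance-level prompt, and a knowledge adapter that retrieves domain snippets memorized from an unlabeled corpus. All names (TableInstance, PromptPlanner, KnowledgeAdapter) and the toy retrieval scoring are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the PromptMize flow described in the abstract;
# all names and the toy scoring are illustrative, not the authors' code.
from dataclasses import dataclass


@dataclass
class TableInstance:
    attributes: dict  # e.g. {"name": "Walter Extra", "occupation": "pilot"}


class PromptPlanner:
    """Builds a prompt signal that linearizes the table, bridging the
    topology gap between tabular input and linear text."""

    def plan(self, table: TableInstance) -> str:
        slots = [f"{key} : {value}" for key, value in table.attributes.items()]
        return "Describe the entity given " + " ; ".join(slots) + " ."


class KnowledgeAdapter:
    """Stands in for the knowledge adapter: a toy in-memory store of
    sentences from an unlabeled domain corpus."""

    def __init__(self, corpus: list[str]):
        self.corpus = corpus

    def retrieve(self, table: TableInstance, k: int = 2) -> list[str]:
        # Toy relevance: count table-value tokens that appear in a sentence.
        values = " ".join(map(str, table.attributes.values())).lower().split()
        ranked = sorted(self.corpus,
                        key=lambda s: -sum(tok in s.lower() for tok in values))
        return ranked[:k]


def build_model_input(table, planner, adapter) -> str:
    # A real system would feed this assembled input to a PLM decoder;
    # here we only show the prompt plus retrieved knowledge.
    prompt = planner.plan(table)
    knowledge = adapter.retrieve(table)
    return prompt + " [KNOWLEDGE] " + " | ".join(knowledge)


if __name__ == "__main__":
    table = TableInstance({"name": "Walter Extra",
                           "occupation": "aircraft designer"})
    adapter = KnowledgeAdapter([
        "Walter Extra is a German aircraft designer.",
        "A song is usually released as part of an album.",
    ])
    print(build_model_input(table, PromptPlanner(), adapter))
```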
Related papers
- DIVKNOWQA: Assessing the Reasoning Ability of LLMs via Open-Domain
Question Answering over Knowledge Base and Text [73.68051228972024]
Large Language Models (LLMs) have exhibited impressive generation capabilities, but they suffer from hallucinations when relying on their internal knowledge.
Retrieval-augmented LLMs have emerged as a potential solution to ground LLMs in external knowledge.
arXiv Detail & Related papers (2023-10-31T04:37:57Z)
- Adapting Knowledge for Few-shot Table-to-Text Generation [35.59842534346997]
We propose a novel framework: Adapt-Knowledge-to-Generate (AKG).
AKG adapts unlabeled domain-specific knowledge into the model, which brings at least three benefits.
Our model achieves superior performance in terms of both fluency and accuracy as judged by human and automatic evaluations.
arXiv Detail & Related papers (2023-02-24T05:48:53Z)
- TegTok: Augmenting Text Generation via Task-specific and Open-world
Knowledge [83.55215993730326]
We propose augmenting TExt Generation via Task-specific and Open-world Knowledge (TegTok) in a unified framework.
Our model selects knowledge entries from two types of knowledge sources through dense retrieval and then injects them into the input encoding and output decoding stages, respectively (a minimal sketch of this retrieve-and-inject pattern appears after this list).
arXiv Detail & Related papers (2022-03-16T10:37:59Z)
- Attend, Memorize and Generate: Towards Faithful Table-to-Text Generation
in Few Shots [58.404516361586325]
Few-shot table-to-text generation is the task of composing fluent and faithful sentences to convey table content using limited data.
This paper proposes a novel approach, Attend, Memorize and Generate (called AMG), inspired by the text generation process of humans.
arXiv Detail & Related papers (2022-03-01T20:37:20Z)
- Data-to-text Generation with Macro Planning [61.265321323312286]
We propose a neural model with a macro planning stage followed by a generation stage reminiscent of traditional methods.
Our approach outperforms competitive baselines in terms of automatic and human evaluation.
arXiv Detail & Related papers (2021-02-04T16:32:57Z)
- Towards Faithful Neural Table-to-Text Generation with Content-Matching
Constraints [63.84063384518667]
We propose a novel Transformer-based generation framework to achieve faithful table-to-text generation.
Core techniques in our method to enforce faithfulness include a new table-text optimal-transport matching loss.
To evaluate faithfulness, we propose a new automatic metric specialized to the table-to-text generation problem.
arXiv Detail & Related papers (2020-05-03T02:54:26Z)
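The content-matching entry above hinges on a table-text optimal-transport matching loss. Below is a minimal numpy sketch of such a loss, using an entropy-regularized Sinkhorn solver over a cosine cost between table-cell and generated-token embeddings; the solver, cost, and uniform marginals are standard choices here, not necessarily the paper's exact formulation.

```python
# Minimal sketch of a table-text optimal-transport matching loss;
# a standard Sinkhorn/cosine setup, not the paper's exact formulation.
import numpy as np


def cosine_cost(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Pairwise cost: 1 - cosine similarity (low cost = matching content)."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
    return 1.0 - Xn @ Yn.T


def sinkhorn_ot(cost: np.ndarray, eps: float = 0.1, iters: int = 200) -> float:
    """Entropy-regularized OT distance with uniform marginals."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / eps)
    u = np.ones(n)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]  # transport plan
    return float(np.sum(plan * cost))   # the matching loss


rng = np.random.default_rng(0)
table_emb = rng.standard_normal((5, 32))   # one vector per table cell
token_emb = rng.standard_normal((12, 32))  # one vector per generated token
print("OT matching loss:", sinkhorn_ot(cosine_cost(table_emb, token_emb)))
```

Minimizing this quantity during training pushes the generated tokens to stay close, in embedding space, to the table content, which is the intuition behind the faithfulness constraint.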
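For the TegTok entry above (see the forward reference there), here is a minimal sketch of the retrieve-then-inject pattern: dense retrieval over two knowledge sources, with retrieved entries attached at the input-encoding and output-decoding stages. The hash-based embedding and the [TASK-K] marker are toy assumptions, not TegTok's code.

```python
# Minimal sketch of dense retrieval over two knowledge sources with
# injection at the encoding and decoding stages; all names are toy
# assumptions, not TegTok's implementation.
import hashlib

import numpy as np


def embed(text: str, dim: int = 64) -> np.ndarray:
    # Deterministic toy "dense" embedding; a real system would use a
    # trained bi-encoder.
    seed = int(hashlib.md5(text.encode()).hexdigest()[:8], 16)
    v = np.random.default_rng(seed).standard_normal(dim)
    return v / np.linalg.norm(v)


def dense_retrieve(query: str, entries: list[str], k: int = 1) -> list[str]:
    q = embed(query)
    scores = np.array([q @ embed(e) for e in entries])
    return [entries[i] for i in np.argsort(scores)[::-1][:k]]


task_knowledge = ["Table-to-text pairs collected for this task."]
open_knowledge = ["Wikipedia: Walter Extra is a German aircraft designer."]

query = "Walter Extra | occupation | aircraft designer"
# Injected at the input-encoding stage:
encoder_input = query + " [TASK-K] " + " ".join(dense_retrieve(query, task_knowledge))
# Injected at the output-decoding stage (e.g. as extra cross-attention context):
decoder_context = " ".join(dense_retrieve(query, open_knowledge))
print(encoder_input)
print(decoder_context)
```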