Explicit Syntactic Guidance for Neural Text Generation
- URL: http://arxiv.org/abs/2306.11485v2
- Date: Sun, 25 Jun 2023 13:32:14 GMT
- Title: Explicit Syntactic Guidance for Neural Text Generation
- Authors: Yafu Li, Leyang Cui, Jianhao Yan, Yongjing Yin, Wei Bi, Shuming Shi,
Yue Zhang
- Abstract summary: Generative Grammar suggests that humans generate natural language texts by learning language grammar.
We propose a syntax-guided generation schema, which generates the sequence guided by a constituency parse tree in a top-down direction.
Experiments on paraphrase generation and machine translation show that the proposed method outperforms autoregressive baselines.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most existing text generation models follow the sequence-to-sequence
paradigm. Generative Grammar suggests that humans generate natural language
texts by learning language grammar. We propose a syntax-guided generation
schema, which generates the sequence guided by a constituency parse tree in a
top-down direction. The decoding process can be decomposed into two parts: (1)
predicting the infilling texts for each constituent in the lexicalized syntax
context given the source sentence; (2) mapping and expanding each constituent
to construct the next-level syntax context. Accordingly, we propose a
structural beam search method to find possible syntax structures
hierarchically. Experiments on paraphrase generation and machine translation
show that the proposed method outperforms autoregressive baselines, while also
demonstrating effectiveness in terms of interpretability, controllability, and
diversity.
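To make the schema concrete, here is a minimal runnable sketch of the level-by-level decoding loop: step (1) fills every open constituent with candidate infilling texts, step (2) expands filled constituents into the next-level syntax context, and a structural beam prunes hypotheses as each slot is filled. The functions `propose_infills` and `expand` are hypothetical stand-ins (a real system would query the trained model for infills and predict child constituent labels); only the search skeleton follows the abstract.

```python
import heapq

def propose_infills(source, label, context, k=2):
    """Hypothetical stand-in for the infilling model: return up to k
    (text, logprob) candidates that could fill constituent `label` given
    the source sentence and the current lexicalized syntax context."""
    words = source.split()
    return [(" ".join(words[: i + 1]), -0.5 * i) for i in range(min(k, len(words)))]

def expand(constituents):
    """Toy stand-in for step (2): a filled multi-word constituent opens
    two child slots at the next level; single words become terminals.
    A real system would predict child labels from the lexicalized context."""
    nxt = []
    for label, text in constituents:
        if text is not None and len(text.split()) > 1:
            nxt += [("L", None), ("R", None)]
        else:
            nxt.append((label, text))
    return nxt

def structural_beam_search(source, beam_size=2, depth=3):
    """Search syntax structures hierarchically; a hypothesis is a pair
    (cumulative logprob, list of (label, text-or-None) constituents)."""
    beam = [(0.0, [("S", None)])]  # start from the root constituent
    for d in range(depth):
        # Step (1): infill open constituents left to right, pruning to
        # `beam_size` hypotheses after every slot.
        for i in range(max(len(c) for _, c in beam)):
            grown = []
            for score, consts in beam:
                if i >= len(consts) or consts[i][1] is not None:
                    grown.append((score, consts))
                    continue
                label = consts[i][0]
                for text, lp in propose_infills(source, label, consts):
                    grown.append((score + lp, consts[:i] + [(label, text)] + consts[i + 1:]))
            beam = heapq.nlargest(beam_size, grown, key=lambda h: h[0])
        # Step (2): expand the filled level into the next syntax context.
        if d < depth - 1:
            beam = [(s, expand(c)) for s, c in beam]
    _, best = max(beam, key=lambda h: h[0])
    return " ".join(text for _, text in best if text is not None)

print(structural_beam_search("the cat sat on the mat"))
```

Because pruning happens per constituent slot rather than per token, the beam compares complete syntactic alternatives at the same tree level, which is what distinguishes this search from ordinary token-level beam search.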
Related papers
- On Eliciting Syntax from Language Models via Hashing
Unsupervised parsing aims to infer syntactic structure from raw text.
In this paper, we explore the possibility of leveraging pre-trained language models to deduce parsing trees from raw text.
We show that our method is effective and efficient enough to acquire high-quality parsing trees from pre-trained language models at a low cost.
arXiv Detail & Related papers (2024-10-05T08:06:19Z)
- Instruct-SCTG: Guiding Sequential Controlled Text Generation through Instructions
Instruct-SCTG is a sequential framework that harnesses instruction-tuned language models to generate structurally coherent text.
Our framework generates articles in a section-by-section manner, aligned with the desired human structure using natural language instructions.
arXiv Detail & Related papers (2023-12-19T16:20:49Z)
- Efficient Guided Generation for Large Language Models
We show how the problem of neural text generation can be constructively reformulated in terms of transitions between the states of a finite-state machine.
This framework leads to an efficient approach to guiding text generation with regular expressions and context-free grammars (a toy sketch of the idea appears after this list).
arXiv Detail & Related papers (2023-07-19T01:14:49Z)
- Sequentially Controlled Text Generation
While GPT-2 generates sentences that are remarkably human-like, longer documents can ramble and do not follow human-like writing structure.
We study the problem of imposing structure on long-range text.
We develop a sequential controlled text generation pipeline with generation and editing.
arXiv Detail & Related papers (2023-01-05T21:23:51Z)
- The Whole Truth and Nothing But the Truth: Faithful and Controllable Dialogue Response Generation with Dataflow Transduction and Constrained Decoding
We describe a hybrid architecture for dialogue response generation that combines the strengths of neural language modeling and rule-based generation.
Our experiments show that this system outperforms both rule-based and learned approaches in human evaluations of fluency, relevance, and truthfulness.
arXiv Detail & Related papers (2022-09-16T09:00:49Z)
- Example-Based Machine Translation from Text to a Hierarchical Representation of Sign Language
This article presents an original method for Text-to-Sign Translation.
It compensates for data scarcity by using a domain-specific parallel corpus of alignments between text and hierarchical formal descriptions of Sign Language videos in AZee.
Based on the detection of similarities present in the source text, the proposed algorithm exploits matches and substitutions of aligned segments to build multiple candidate translations.
The resulting translations are in the form of AZee expressions, designed to be used as input to avatar systems.
arXiv Detail & Related papers (2022-05-06T15:48:43Z)
- Text Revision by On-the-Fly Representation Optimization
Current state-of-the-art methods formulate text revision tasks as sequence-to-sequence learning problems.
We present an iterative in-place editing approach for text revision, which requires no parallel data.
It achieves competitive and even better performance than state-of-the-art supervised methods on text simplification.
arXiv Detail & Related papers (2022-04-15T07:38:08Z)
- Long Text Generation by Modeling Sentence-Level and Discourse-Level Coherence
We propose a long text generation model that represents the prefix sentences at both the sentence level and the discourse level during decoding.
Our model can generate more coherent texts than state-of-the-art baselines.
arXiv Detail & Related papers (2021-05-19T07:29:08Z)
- Compositional Generalization via Semantic Tagging
We propose a new decoding framework that preserves the expressivity and generality of sequence-to-sequence models.
We show that the proposed approach consistently improves compositional generalization across model architectures, domains, and semantic formalisms.
arXiv Detail & Related papers (2020-10-22T15:55:15Z)
- Syntax-driven Iterative Expansion Language Models for Controllable Text Generation
We propose a new paradigm for introducing a syntactic inductive bias into neural text generation.
Our experiments show that this paradigm is effective at text generation, with quality between that of LSTMs and Transformers, and comparable diversity.
arXiv Detail & Related papers (2020-04-05T14:29:40Z)
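As a toy companion to the "Efficient Guided Generation for Large Language Models" entry above, the sketch below constrains greedy decoding so that every emitted token keeps a small regular-expression automaton alive. The hand-written DFA (for the pattern [0-9]+), the vocabulary, and the scores are all illustrative assumptions; the paper itself additionally precomputes an index from FSM states to allowed tokens so that masking is a lookup rather than a per-step scan over the vocabulary, which is not reproduced here.

```python
def dfa_step(state, ch):
    """DFA for the toy pattern [0-9]+ : any digit moves to the single
    accepting state 1; any other character kills the match."""
    return 1 if ch.isdigit() else None

def advance(state, token):
    """Run the DFA across every character of a (possibly multi-char) token;
    return the resulting state, or None if the token is not allowed."""
    for ch in token:
        state = dfa_step(state, ch)
        if state is None:
            return None
    return state

def guided_decode(vocab, scores, max_steps=4):
    """Greedy decoding with an FSM mask: at each step, drop tokens that
    would leave the automaton in a dead state, then pick the
    highest-scoring survivor and advance the automaton state."""
    state, out = 0, []
    for _ in range(max_steps):
        allowed = [(t, scores[t], advance(state, t)) for t in vocab]
        allowed = [x for x in allowed if x[2] is not None]
        if not allowed:
            break
        token, _, state = max(allowed, key=lambda x: x[1])
        out.append(token)
    return "".join(out)

vocab = ["1", "23", "a", "4b", "7"]
scores = {"1": 0.1, "23": 0.9, "a": 0.8, "4b": 0.7, "7": 0.3}
print(guided_decode(vocab, scores))  # "23232323": non-digit tokens are masked out
```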
This list is automatically generated from the titles and abstracts of the papers on this site.