Topic-to-Essay Generation with Comprehensive Knowledge Enhancement
- URL: http://arxiv.org/abs/2106.15142v1
- Date: Tue, 29 Jun 2021 08:01:42 GMT
- Title: Topic-to-Essay Generation with Comprehensive Knowledge Enhancement
- Authors: Zhiyue Liu, Jiahai Wang, Zhenghong Li
- Abstract summary: This paper aims to improve essay generation by extracting information from both internal and external knowledge.
For internal knowledge enhancement, both topics and related essays are fed to a teacher network as source information.
For external knowledge enhancement, a topic knowledge graph encoder is proposed.
- Score: 8.373915325503289
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generating high-quality and diverse essays with a set of topics is a
challenging task in natural language generation. Since several given topics
provide only limited source information, utilizing various topic-related
knowledge is essential for improving essay generation performance. However,
previous works fail to sufficiently exploit such knowledge to facilitate the
generation procedure. This paper aims to improve essay generation by extracting
information from both internal and external knowledge. Thus, a topic-to-essay
generation model with comprehensive knowledge enhancement, named TEGKE, is
proposed. For internal knowledge enhancement, both topics and related essays
are fed to a teacher network as source information. Then, informative features
would be obtained from the teacher network and transferred to a student network
which takes only topics as input yet provides information comparable to the
teacher network's. For external knowledge enhancement, a topic knowledge
graph encoder is proposed. Unlike previous works, which use only the nearest
neighbors of topics in the commonsense knowledge base, our topic knowledge
graph encoder exploits more of the structural and semantic information in the
commonsense knowledge graph to facilitate essay generation. Moreover,
adversarial training based on the Wasserstein distance is proposed to improve
generation quality. Experimental results demonstrate that TEGKE achieves
state-of-the-art performance on both automatic and human evaluation.
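The two training signals described above can be sketched in a few lines. This is an illustrative NumPy sketch, not the paper's implementation: a knowledge-transfer loss pulls the student's topic-only features toward the teacher's topic-plus-essay features, and a WGAN-style critic loss realizes adversarial training based on the Wasserstein distance. The feature vectors and scores below are toy values.

```python
import numpy as np

def transfer_loss(teacher_feat, student_feat):
    """Mean squared error pulling student features toward teacher features
    (internal knowledge transfer)."""
    return float(np.mean((teacher_feat - student_feat) ** 2))

def critic_loss(real_scores, fake_scores):
    """Wasserstein critic objective, written as a loss to minimize:
    E[D(fake)] - E[D(real)]."""
    return float(np.mean(fake_scores) - np.mean(real_scores))

# Toy check: a student that copies the teacher has zero transfer loss,
# and a critic that scores real essays higher gets a negative loss.
t = np.array([0.2, -1.0, 0.5])
print(transfer_loss(t, t))                              # 0.0
print(critic_loss(np.array([1.0]), np.array([-1.0])))   # -2.0
```

In the full method these losses would be combined with the usual generation objective; the sketch only shows the shape of each term.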
Related papers
- Artificial Intelligence Driven Course Generation: A Case Study Using ChatGPT [0.0]
The study aims to elaborate on using ChatGPT to create course materials.
The main objective is to assess the efficiency, quality, and impact of AI-driven course generation.
The study highlights the potential of AI to revolutionize educational content creation.
arXiv Detail & Related papers (2024-11-02T21:59:02Z)
- Towards Knowledge-Grounded Natural Language Understanding and Generation [1.450405446885067]
This thesis investigates how natural language understanding and generation with transformer models can benefit from grounding the models with knowledge representations.
Studies in this thesis find that incorporating relevant and up-to-date knowledge of entities benefits fake news detection.
It is established that other general forms of knowledge, such as parametric and distilled knowledge, enhance multimodal and multilingual knowledge-intensive tasks.
arXiv Detail & Related papers (2024-03-22T17:32:43Z)
- Beyond Factuality: A Comprehensive Evaluation of Large Language Models as Knowledge Generators [78.63553017938911]
Large language models (LLMs) outperform information retrieval techniques for downstream knowledge-intensive tasks.
However, community concerns abound regarding the factuality and potential implications of using this uncensored knowledge.
We introduce CONNER, designed to evaluate generated knowledge from six important perspectives.
arXiv Detail & Related papers (2023-10-11T08:22:37Z)
- Informative Text Generation from Knowledge Triples [56.939571343797304]
We propose a novel memory-augmented generator that employs a memory network to memorize useful knowledge learned during training.
We derive a dataset from WebNLG for our new setting and conduct extensive experiments to investigate the effectiveness of our model.
arXiv Detail & Related papers (2022-09-26T14:35:57Z)
- TegTok: Augmenting Text Generation via Task-specific and Open-world Knowledge [83.55215993730326]
We propose augmenting TExt Generation via Task-specific and Open-world Knowledge (TegTok) in a unified framework.
Our model selects knowledge entries from two types of knowledge sources through dense retrieval and then injects them into the input encoding and output decoding stages respectively.
arXiv Detail & Related papers (2022-03-16T10:37:59Z)
- Knowledge-Grounded Dialogue Generation with a Unified Knowledge Representation [78.85622982191522]
Existing systems perform poorly on unseen topics due to limited topics covered in the training data.
We present PLUG, a language model that homogenizes different knowledge sources to a unified knowledge representation.
It can achieve comparable performance with state-of-the-art methods under a fully-supervised setting.
arXiv Detail & Related papers (2021-12-15T07:11:02Z)
- A Sentiment-Controllable Topic-to-Essay Generator with Topic Knowledge Graph [44.00244549852883]
We propose a novel Sentiment-Controllable topic-to-essay generator with a Topic Knowledge Graph enhanced decoder.
We first inject sentiment information into the generator to control the sentiment of each sentence, which leads to more varied generated essays.
Unlike existing models that use knowledge entities separately, our model treats the knowledge graph as a whole and encodes more structured, connected semantic information in the graph to generate a more relevant essay.
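Treating the knowledge graph "as a whole" rather than as isolated entities can be illustrated with one round of neighborhood aggregation, so each node's vector mixes in its neighbors' features. This is a hedged NumPy sketch of the general idea, not the cited model's actual encoder; the tiny three-node graph and one-hot features are purely illustrative.

```python
import numpy as np

def message_pass(features, adj):
    """One aggregation step: average each node's features with its
    neighbors' (a self-loop is counted in the degree)."""
    deg = adj.sum(axis=1, keepdims=True) + 1.0  # +1 for the self-loop
    return (features + adj @ features) / deg

# Toy graph: node 0 - node 1 - node 2 (a path), one-hot node features.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3)
print(message_pass(feats, adj))
```

After one step, node 1's vector already blends information from both of its neighbors, which is the structural signal a per-entity lookup would miss.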
arXiv Detail & Related papers (2020-10-12T08:06:12Z)
- A Survey of Knowledge-Enhanced Text Generation [81.24633231919137]
The goal of text generation is to make machines express in human language.
Various neural encoder-decoder models have been proposed to achieve the goal by learning to map input text to output text. However, the input text alone often provides limited knowledge for generating the desired output.
To address this issue, researchers have considered incorporating various forms of knowledge beyond the input text into the generation models.
arXiv Detail & Related papers (2020-10-09T06:46:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.