Towards Unsupervised Language Understanding and Generation by Joint Dual
Learning
- URL: http://arxiv.org/abs/2004.14710v1
- Date: Thu, 30 Apr 2020 12:02:33 GMT
- Title: Towards Unsupervised Language Understanding and Generation by Joint Dual
Learning
- Authors: Shang-Yu Su, Chao-Wei Huang, Yun-Nung Chen
- Abstract summary: In modular dialogue systems, natural language understanding (NLU) and natural language generation (NLG) are critical components.
This paper introduces a general learning framework to effectively exploit the duality between the two tasks.
The proposed approach is capable of boosting the performance of both NLU and NLG.
- Score: 40.730699588561805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In modular dialogue systems, natural language understanding (NLU) and natural
language generation (NLG) are two critical components: NLU extracts the
semantics from given texts, while NLG constructs natural language sentences
from input semantic representations. However, the dual property between
understanding and generation has rarely been explored. Prior work made the
first attempt to utilize the duality between NLU and NLG to improve
performance via a dual supervised learning framework, but it still trained
both components in a supervised manner. Instead, this paper introduces a
general learning framework that effectively exploits this duality, providing
the flexibility to incorporate both supervised and unsupervised learning
algorithms to train language understanding and generation models jointly. The
benchmark experiments demonstrate that the proposed approach is capable of
boosting the performance of both NLU and NLG.
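Below is a minimal sketch of the joint dual learning idea, under stated assumptions: the two toy modules stand in for NLU (text-to-semantics) and NLG (semantics-to-text) models, and the cycle-consistency terms illustrate how the duality can supply a training signal that needs no labeled pairs. The module names, dimensions, and loss weighting are hypothetical placeholders, not the architecture or objective used in the paper.

```python
# Illustrative sketch only: joint dual training of toy NLU and NLG modules.
# All names, sizes, and the 0.5 cycle weight are hypothetical placeholders.
import torch
import torch.nn as nn

SENT_DIM, SEM_DIM = 32, 8  # toy sentence-embedding and semantic-frame sizes

nlu = nn.Sequential(nn.Linear(SENT_DIM, 64), nn.ReLU(), nn.Linear(64, SEM_DIM))
nlg = nn.Sequential(nn.Linear(SEM_DIM, 64), nn.ReLU(), nn.Linear(64, SENT_DIM))
opt = torch.optim.Adam(list(nlu.parameters()) + list(nlg.parameters()), lr=1e-3)

def dual_step(sent_emb, sem_frame, labeled=True):
    """One joint update; the cycle terms require no aligned labels."""
    sup = torch.tensor(0.0)
    if labeled:
        # Supervised terms: each direction fits its annotated target.
        sup = nn.functional.mse_loss(nlu(sent_emb), sem_frame) \
            + nn.functional.mse_loss(nlg(sem_frame), sent_emb)
    # Duality (cycle-consistency) terms:
    #   sentence -> NLU -> NLG -> sentence, and frame -> NLG -> NLU -> frame.
    cycle = nn.functional.mse_loss(nlg(nlu(sent_emb)), sent_emb) \
          + nn.functional.mse_loss(nlu(nlg(sem_frame)), sem_frame)
    loss = sup + 0.5 * cycle  # arbitrary illustrative weighting
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with random "sentence embeddings" and "semantic frames".
x, y = torch.randn(16, SENT_DIM), torch.randn(16, SEM_DIM)
print(dual_step(x, y, labeled=True), dual_step(x, y, labeled=False))
```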
Related papers
- ChatABL: Abductive Learning via Natural Language Interaction with
ChatGPT [72.83383437501577]
Large language models (LLMs) have recently demonstrated significant mathematical abilities.
LLMs currently have difficulty in bridging perception, language understanding and reasoning capabilities.
This paper presents a novel method for integrating LLMs into the abductive learning framework.
arXiv Detail & Related papers (2023-04-21T16:23:47Z) - A Survey of Knowledge Enhanced Pre-trained Language Models [78.56931125512295]
We present a comprehensive review of Knowledge Enhanced Pre-trained Language Models (KE-PLMs).
For NLU, we divide the types of knowledge into four categories: linguistic knowledge, text knowledge, knowledge graph (KG) and rule knowledge.
The KE-PLMs for NLG are categorized into KG-based and retrieval-based methods.
arXiv Detail & Related papers (2022-11-11T04:29:02Z) - Towards More Robust Natural Language Understanding [0.0]
Natural Language Understanding (NLU) is a branch of Natural Language Processing (NLP).
Recent years have witnessed notable progress across various NLU tasks with deep learning techniques.
It is worth noting that the human ability to understand natural language is flexible and robust.
arXiv Detail & Related papers (2021-12-01T17:27:19Z) - ERICA: Improving Entity and Relation Understanding for Pre-trained
Language Models via Contrastive Learning [97.10875695679499]
We propose a novel contrastive learning framework named ERICA for the pre-training phase to obtain a deeper understanding of the entities and their relations in text.
Experimental results demonstrate that our proposed ERICA framework achieves consistent improvements on several document-level language understanding tasks.
arXiv Detail & Related papers (2020-12-30T03:35:22Z) - SLM: Learning a Discourse Language Representation with Sentence
Unshuffling [53.42814722621715]
We introduce Sentence-level Language Modeling, a new pre-training objective for learning a discourse language representation.
We show that this feature of our model improves the performance of the original BERT by large margins.
arXiv Detail & Related papers (2020-10-30T13:33:41Z) - Dual Inference for Improving Language Understanding and Generation [35.251935231914366]
Natural language understanding (NLU) and natural language generation (NLG) tasks hold a strong dual relationship.
NLU aims at predicting semantic labels based on natural language utterances and NLG does the opposite.
This paper proposes to leverage the duality in the inference stage without the need for retraining.
arXiv Detail & Related papers (2020-10-08T20:14:41Z) - A Generative Model for Joint Natural Language Understanding and
Generation [9.810053382574017]
We propose a generative model which couples NLU and NLG through a shared latent variable.
Our model achieves state-of-the-art performance on two dialogue datasets with both flat and tree-structured formal representations.
We also show that the model can be trained in a semi-supervised fashion by utilising unlabelled data to boost its performance.
arXiv Detail & Related papers (2020-06-12T22:38:55Z) - Dual Learning for Semi-Supervised Natural Language Understanding [29.692288627633374]
Natural language understanding (NLU) converts sentences into structured semantic forms.
We introduce a dual task of NLU, semantic-to-sentence generation (SSG).
We propose a new framework for semi-supervised NLU with the corresponding dual model.
arXiv Detail & Related papers (2020-04-26T07:17:48Z) - Logical Natural Language Generation from Open-Domain Tables [107.04385677577862]
We propose a new task where a model is tasked with generating natural language statements that can be logically entailed by the facts.
To facilitate the study of the proposed logical NLG problem, we use the existing TabFact dataset (Chen et al., 2019), which features a wide range of logical/symbolic inferences.
The new task poses challenges to the existing monotonic generation frameworks due to the mismatch between sequence order and logical order.
arXiv Detail & Related papers (2020-04-22T06:03:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.