DiscoPrompt: Path Prediction Prompt Tuning for Implicit Discourse
Relation Recognition
- URL: http://arxiv.org/abs/2305.03973v1
- Date: Sat, 6 May 2023 08:16:07 GMT
- Title: DiscoPrompt: Path Prediction Prompt Tuning for Implicit Discourse
Relation Recognition
- Authors: Chunkit Chan, Xin Liu, Jiayang Cheng, Zihan Li, Yangqiu Song, Ginny Y.
Wong, Simon See
- Abstract summary: We propose a prompt-based path prediction method to utilize the interactive information and intrinsic senses among the hierarchy in IDRR.
This is the first work that injects such structure information into pre-trained language models via prompt tuning.
- Score: 27.977742959064916
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Implicit Discourse Relation Recognition (IDRR) is a sophisticated and
challenging task of recognizing the discourse relations between arguments in
the absence of discourse connectives. The sense labels for each discourse
relation follow a hierarchical classification scheme in the annotation process
(Prasad et al., 2008), forming a hierarchy structure. Most existing works do
not well incorporate the hierarchy structure but focus on the syntax features
and the prior knowledge of connectives in the manner of pure text
classification. We argue that it is more effective to predict the paths inside
the hierarchical tree (e.g., "Comparison -> Contrast -> however") rather than
flat labels (e.g., Contrast) or connectives (e.g., however). We propose a
prompt-based path prediction method to utilize the interactive information and
intrinsic senses among the hierarchy in IDRR. This is the first work that
injects such structure information into pre-trained language models via prompt
tuning, and the performance of our solution shows significant and consistent
improvement against competitive baselines.
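The core idea of path prediction can be illustrated with a toy sketch (this is not the authors' implementation; the hierarchy fragment and the per-node probabilities below are hypothetical, standing in for scores a prompt-tuned language model would produce):

```python
# Toy sketch of path prediction over a label hierarchy: instead of scoring
# flat sense labels, score full root-to-leaf paths such as
# "Comparison -> Contrast -> however" and pick the best joint path.

# Hypothetical fragment of a PDTB-style hierarchy:
# (top-level sense, second-level sense, connective)
PATHS = [
    ("Comparison", "Contrast", "however"),
    ("Comparison", "Concession", "although"),
    ("Contingency", "Cause", "because"),
]

def path_score(node_probs, path):
    """Joint score of a path = product of its node probabilities."""
    score = 1.0
    for node in path:
        score *= node_probs.get(node, 1e-9)
    return score

def predict_path(node_probs):
    """Return the hierarchy path with the highest joint score."""
    return max(PATHS, key=lambda p: path_score(node_probs, p))

# Hypothetical node probabilities from some classifier:
probs = {"Comparison": 0.7, "Contrast": 0.6, "however": 0.5,
         "Concession": 0.2, "although": 0.1,
         "Contingency": 0.2, "Cause": 0.3, "because": 0.4}

print(predict_path(probs))
```

Scoring whole paths lets evidence at one level (e.g., a likely connective) reinforce or veto choices at another, which flat label classification cannot do.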
Related papers
- Integrating Hierarchical Semantic into Iterative Generation Model for Entailment Tree Explanation [7.5496857647335585]
We propose an architecture of integrating the Hierarchical Semantics of sentences under the framework of Controller-Generator (HiSCG) to explain answers.
The proposed method achieves comparable performance on all three settings of the EntailmentBank dataset.
arXiv Detail & Related papers (2024-09-26T11:46:58Z)
- Coreference Graph Guidance for Mind-Map Generation [5.289044688419791]
Recently, a state-of-the-art method encodes the sentences of a document sequentially and converts them to a relation graph via sequence-to-graph.
We propose a coreference-guided mind-map generation network (CMGN) to incorporate external structure knowledge.
arXiv Detail & Related papers (2023-12-19T09:39:27Z)
- Prompt-based Logical Semantics Enhancement for Implicit Discourse Relation Recognition [4.7938839332508945]
We propose a Prompt-based Logical Semantics Enhancement (PLSE) method for Implicit Discourse Relation Recognition (IDRR).
Our method seamlessly injects knowledge relevant to discourse relation into pre-trained language models through prompt-based connective prediction.
Experimental results on PDTB 2.0 and CoNLL16 datasets demonstrate that our method achieves outstanding and consistent performance against the current state-of-the-art models.
arXiv Detail & Related papers (2023-11-01T08:38:08Z)
- Structured Dialogue Discourse Parsing [79.37200787463917]
Discourse parsing aims to uncover the internal structure of a multi-participant conversation.
We propose a principled method that improves upon previous work from two perspectives: encoding and decoding.
Experiments show that our method achieves new state-of-the-art, surpassing the previous model by 2.3 on STAC and 1.5 on Molweni.
arXiv Detail & Related papers (2023-06-26T22:51:01Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Conversational Semantic Parsing using Dynamic Context Graphs [68.72121830563906]
We consider the task of conversational semantic parsing over general purpose knowledge graphs (KGs) with millions of entities, and thousands of relation-types.
We focus on models which are capable of interactively mapping user utterances into executable logical forms.
arXiv Detail & Related papers (2023-05-04T16:04:41Z)
- Global and Local Hierarchy-aware Contrastive Framework for Implicit Discourse Relation Recognition [8.143877598684528]
Implicit discourse relation recognition (IDRR) is a challenging task in discourse analysis.
Recent methods tend to integrate the whole hierarchical information of senses into discourse relation representations.
We propose a novel GlObal and Local Hierarchy-aware Contrastive Framework (GOLF), to model two kinds of hierarchies.
arXiv Detail & Related papers (2022-11-25T03:19:03Z)
- Supporting Vision-Language Model Inference with Confounder-pruning Knowledge Prompt [71.77504700496004]
Vision-language models are pre-trained by aligning image-text pairs in a common space to deal with open-set visual concepts.
To boost the transferability of the pre-trained models, recent works adopt fixed or learnable prompts.
However, how and what prompts can improve inference performance remains unclear.
arXiv Detail & Related papers (2022-05-23T07:51:15Z)
- KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction [111.74812895391672]
We propose KnowPrompt, a Knowledge-aware Prompt-tuning approach with synergistic optimization.
We inject latent knowledge contained in relation labels into prompt construction with learnable virtual type words and answer words.
arXiv Detail & Related papers (2021-04-15T17:57:43Z)
- Exploring the Hierarchy in Relation Labels for Scene Graph Generation [75.88758055269948]
Experiments show that the proposed simple yet effective method can improve several state-of-the-art baselines by a large margin (up to 33% relative gain) in terms of Recall@50.
arXiv Detail & Related papers (2020-09-12T17:36:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.