An AMR-based Link Prediction Approach for Document-level Event Argument
Extraction
- URL: http://arxiv.org/abs/2305.19162v1
- Date: Tue, 30 May 2023 16:07:48 GMT
- Title: An AMR-based Link Prediction Approach for Document-level Event Argument
Extraction
- Authors: Yuqing Yang, Qipeng Guo, Xiangkun Hu, Yue Zhang, Xipeng Qiu, Zheng
Zhang
- Abstract summary: Recent works have introduced Abstract Meaning Representation (AMR) for Document-level Event Argument Extraction (Doc-level EAE).
This work reformulates EAE as a link prediction problem on AMR graphs.
We propose a novel graph structure, Tailored AMR Graph (TAG), which compresses less informative subgraphs and edge types, integrates span information, and highlights surrounding events in the same document.
- Score: 51.77733454436013
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent works have introduced Abstract Meaning Representation (AMR) for
Document-level Event Argument Extraction (Doc-level EAE), since AMR provides a
useful interpretation of complex semantic structures and helps to capture
long-distance dependency. However, in these works AMR is used only implicitly,
for instance, as additional features or training signals. Motivated by the fact
that all event structures can be inferred from AMR, this work reformulates EAE
as a link prediction problem on AMR graphs. Since AMR is a generic structure
and does not perfectly suit EAE, we propose a novel graph structure, Tailored
AMR Graph (TAG), which compresses less informative subgraphs and edge types,
integrates span information, and highlights surrounding events in the same
document. With TAG, we further propose a novel method using graph neural
networks as a link prediction model to find event arguments. Our extensive
experiments on WikiEvents and RAMS show that this simpler approach outperforms
the state-of-the-art models by 3.63pt and 2.33pt F1, respectively, and does so
with 56% less inference time. The code is available at
https://github.com/ayyyq/TARA.
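To make the reformulation concrete, the sketch below illustrates the generic idea of treating argument extraction as link prediction on a document-level graph: run a few message-passing steps over node features, then classify (trigger, candidate) node pairs into argument roles or "no link". The hand-rolled GCN layer, the pair scorer, and all tensors are hypothetical illustrations of the general technique under simplified assumptions, not the authors' TAG construction or TARA implementation (see the linked repository for that).

```python
# Minimal, illustrative sketch (NOT the authors' implementation): score candidate
# links between an event-trigger node and candidate nodes on a semantic graph.
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One plain graph-convolution step: average neighbor features, then project."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, node_feats, adj):
        # adj: (N, N) adjacency with self-loops; row-normalize before mixing.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        mixed = (adj / deg) @ node_feats
        return torch.relu(self.proj(mixed))

class LinkScorer(nn.Module):
    """Scores (trigger, candidate) node pairs over argument roles plus 'no link'."""
    def __init__(self, dim, num_roles):
        super().__init__()
        self.gnn = nn.ModuleList([SimpleGCNLayer(dim) for _ in range(2)])
        self.classifier = nn.Linear(2 * dim, num_roles + 1)  # +1 for "no link"

    def forward(self, node_feats, adj, trigger_idx, cand_idx):
        h = node_feats
        for layer in self.gnn:
            h = layer(h, adj)
        pairs = torch.cat([h[trigger_idx].expand(len(cand_idx), -1), h[cand_idx]], dim=-1)
        return self.classifier(pairs)  # (num_candidates, num_roles + 1)

# Toy usage: 6 nodes with random features, a few edges, 4 hypothetical roles.
N, D = 6, 32
adj = torch.eye(N)
adj[0, 1] = adj[1, 0] = adj[1, 2] = adj[2, 1] = 1.0
scorer = LinkScorer(D, num_roles=4)
logits = scorer(torch.randn(N, D), adj, trigger_idx=0, cand_idx=torch.tensor([2, 3, 4]))
print(logits.shape)  # torch.Size([3, 5])
```

In the actual model, node features would come from a document encoder aligned with the tailored AMR graph and candidates would be span nodes; the toy adjacency and random features here only exercise the shapes.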
Related papers
- AMR Parsing with Causal Hierarchical Attention and Pointers [54.382865897298046]
We introduce new target forms for AMR parsing and a novel model, CHAP, which is equipped with causal hierarchical attention and a pointer mechanism.
Experiments show that our model outperforms baseline models on four out of five benchmarks in the setting of no additional data.
arXiv Detail & Related papers (2023-10-18T13:44:26Z) - Leveraging Denoised Abstract Meaning Representation for Grammatical
Error Correction [53.55440811942249]
Grammatical Error Correction (GEC) is the task of correcting erroneous sentences into grammatically correct, semantically consistent, and coherent sentences.
We propose the AMR-GEC, a seq-to-seq model that incorporates denoised AMR as additional knowledge.
arXiv Detail & Related papers (2023-07-05T09:06:56Z) - Incorporating Graph Information in Transformer-based AMR Parsing [34.461828101932184]
LeakDistill is a model and training method that modifies the Transformer architecture.
We show how, by employing word-to-node alignment to embed graph structural information into the encoder at training time, we can obtain state-of-the-art AMR parsing.
arXiv Detail & Related papers (2023-06-23T12:12:08Z) - AMRs Assemble! Learning to Ensemble with Autoregressive Models for AMR
Parsing [38.731641198934646]
We show how ensemble models can exploit weaknesses of the SMATCH metric to obtain higher scores, but sometimes produce corrupted graphs.
We propose two novel ensemble strategies based on Transformer models, improving robustness to structural constraints, while also reducing computational time.
arXiv Detail & Related papers (2023-06-19T08:58:47Z) - Scientific Paper Extractive Summarization Enhanced by Citation Graphs [50.19266650000948]
We focus on leveraging citation graphs to improve scientific paper extractive summarization under different settings.
Preliminary results demonstrate that the citation graph is helpful even in a simple unsupervised framework.
Motivated by this, we propose a Graph-based Supervised Summarization model (GSS) to achieve more accurate results on the task when large-scale labeled data are available.
arXiv Detail & Related papers (2022-12-08T11:53:12Z) - A Two-Stream AMR-enhanced Model for Document-level Event Argument
Extraction [32.54105023345553]
We propose a Two-Stream Abstract Meaning Representation enhanced extraction model (TSAR).
TSAR encodes the document from different perspectives by a two-stream encoding module.
An AMR-guided interaction module captures both intra-sentential and inter-sentential features.
arXiv Detail & Related papers (2022-04-30T11:17:26Z) - Dynamic Semantic Graph Construction and Reasoning for Explainable
Multi-hop Science Question Answering [50.546622625151926]
We propose a new framework to exploit more valid facts while obtaining explainability for multi-hop QA.
Our framework contains three new ideas: (a) AMR-SG, an AMR-based Semantic Graph constructed from candidate fact AMRs to uncover any-hop relations among the question, answer, and multiple facts; (b) a novel path-based fact analytics approach that exploits AMR-SG to extract active facts from a large fact pool to answer questions; and (c) fact-level relation modeling with a graph convolutional network (GCN) to guide the reasoning process.
arXiv Detail & Related papers (2021-05-25T09:14:55Z) - Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text
Generation [56.73834525802723]
We propose Lightweight Dynamic Graph Convolutional Networks (LDGCNs), which capture richer non-local interactions by synthesizing higher-order information from the input graphs.
We develop two novel parameter-saving strategies, based on group graph convolutions and weight-tied convolutions, to reduce memory usage and model complexity (a minimal weight-tying sketch follows after this list).
arXiv Detail & Related papers (2020-10-09T06:03:46Z)
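The LDGCN entry above mentions weight-tied convolutions as one parameter-saving strategy. The sketch below shows only the generic weight-tying idea, a single projection reused at every propagation step so the parameter count does not grow with depth; it assumes a plain normalized-adjacency GCN update and is not the authors' LDGCN layer.

```python
# Minimal sketch of weight tying across graph-convolution steps (an assumption
# about the general technique, not the LDGCN implementation): one projection
# matrix is shared by all propagation steps.
import torch
import torch.nn as nn

class TiedGraphConv(nn.Module):
    def __init__(self, dim, steps=4):
        super().__init__()
        self.steps = steps
        self.proj = nn.Linear(dim, dim)  # shared across all steps

    def forward(self, node_feats, adj):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        norm_adj = adj / deg
        h = node_feats
        for _ in range(self.steps):  # deeper propagation, same weights
            h = torch.relu(self.proj(norm_adj @ h))
        return h

# Toy usage: 5 nodes, 16-dim features.
adj = torch.eye(5)
adj[0, 1] = adj[1, 0] = 1.0
out = TiedGraphConv(dim=16, steps=4)(torch.randn(5, 16), adj)
print(out.shape)  # torch.Size([5, 16])
```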