Cross-domain Generalization for AMR Parsing
- URL: http://arxiv.org/abs/2210.12445v1
- Date: Sat, 22 Oct 2022 13:24:13 GMT
- Title: Cross-domain Generalization for AMR Parsing
- Authors: Xuefeng Bai, Seng Yang, Leyang Cui, Linfeng Song and Yue Zhang
- Abstract summary: We evaluate five representative AMR parsers on five domains and analyze challenges to cross-domain AMR parsing.
Based on our observation, we investigate two approaches to reduce the domain distribution divergence of text and AMR features.
- Score: 30.34105706152887
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Abstract Meaning Representation (AMR) parsing aims to predict an AMR graph
from textual input. Recently, there has been notable growth in AMR parsing
performance. However, most existing work focuses on improving performance
in a specific domain, ignoring the potential domain dependence of AMR parsing
systems. To address this, we extensively evaluate five representative AMR
parsers on five domains and analyze challenges to cross-domain AMR parsing. We
observe that challenges to cross-domain AMR parsing mainly arise from the
distribution shift of words and AMR concepts. Based on our observation, we
investigate two approaches to reduce the domain distribution divergence of text
and AMR features, respectively. Experimental results on two out-of-domain test
sets show the superiority of our method.
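The abstract attributes cross-domain difficulty to distribution shift of words and AMR concepts. As an illustrative sketch only (the paper does not specify this exact metric), one common way to quantify such a shift is the Jensen-Shannon divergence between token or concept frequency distributions of two domains; the corpora and helper below are hypothetical:

```python
import math
from collections import Counter

def js_divergence(counts_a, counts_b):
    """Jensen-Shannon divergence (base 2, in [0, 1]) between two
    frequency distributions given as Counters of raw counts."""
    vocab = set(counts_a) | set(counts_b)
    total_a = sum(counts_a.values())
    total_b = sum(counts_b.values())
    jsd = 0.0
    for w in vocab:
        p = counts_a.get(w, 0) / total_a
        q = counts_b.get(w, 0) / total_b
        m = (p + q) / 2  # mixture distribution
        if p > 0:
            jsd += 0.5 * p * math.log2(p / m)
        if q > 0:
            jsd += 0.5 * q * math.log2(q / m)
    return jsd

# Toy corpora standing in for in-domain and out-of-domain text.
news = Counter("the senate passed the bill on trade".split())
bio = Counter("the protein binds the receptor on cells".split())
print(js_divergence(news, news))  # identical domains -> 0.0
print(js_divergence(news, bio))   # shifted domains -> positive
```

The same computation applies to AMR concept labels instead of surface tokens; a larger divergence between training and test domains would be consistent with the degradation the paper reports.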
Related papers
- AMR Parsing is Far from Solved: GrAPES, the Granular AMR Parsing Evaluation Suite [18.674172788583967]
We present the Granular AMR Parsing Evaluation Suite (GrAPES).
GrAPES reveals in depth the abilities and shortcomings of current AMR parsers.
arXiv Detail & Related papers (2023-12-06T13:19:56Z)
- Retrofitting Multilingual Sentence Embeddings with Abstract Meaning Representation [70.58243648754507]
We introduce a new method to improve existing multilingual sentence embeddings with Abstract Meaning Representation (AMR)
Compared with the original textual input, AMR is a structured semantic representation that presents the core concepts and relations in a sentence explicitly and unambiguously.
Experimental results show that retrofitting multilingual sentence embeddings with AMR leads to improved state-of-the-art performance on both semantic similarity and transfer tasks.
arXiv Detail & Related papers (2022-10-18T11:37:36Z)
- A Survey: Neural Networks for AMR-to-Text [2.3924114046608627]
AMR-to-Text is one of the key techniques in the NLP community that aims at generating sentences from Abstract Meaning Representation (AMR) graphs.
Since AMR was proposed in 2013, the study of AMR-to-Text has become increasingly prevalent as an essential branch of structured-data-to-text generation.
arXiv Detail & Related papers (2022-06-15T07:20:28Z)
- ATP: AMRize Then Parse! Enhancing AMR Parsing with PseudoAMRs [34.55175412186001]
Auxiliary tasks that are semantically or formally related can better enhance AMR parsing.
From an empirical perspective, we propose a principled method for incorporating auxiliary tasks to boost AMR parsing.
arXiv Detail & Related papers (2022-04-19T13:15:59Z)
- Vector-Decomposed Disentanglement for Domain-Invariant Object Detection [75.64299762397268]
We try to disentangle domain-invariant representations from domain-specific representations.
In the experiment, we evaluate our method on the single- and compound-target case.
arXiv Detail & Related papers (2021-08-15T07:58:59Z)
- Probabilistic, Structure-Aware Algorithms for Improved Variety, Accuracy, and Coverage of AMR Alignments [9.74672460306765]
We present algorithms for aligning components of Abstract Meaning Representation (AMR) graphs to spans in English sentences.
We leverage unsupervised learning in combination with graphs, taking the best of both worlds from previous AMR aligners.
Our approach covers a wider variety of AMR substructures than previously considered, achieves higher coverage of nodes and edges, and does so with higher accuracy.
arXiv Detail & Related papers (2021-06-10T18:46:32Z)
- Heuristic Domain Adaptation [105.59792285047536]
Heuristic Domain Adaptation Network (HDAN) explicitly learns the domain-invariant and domain-specific representations.
Heuristic Domain Adaptation Network (HDAN) has exceeded state-of-the-art on unsupervised DA, multi-source DA and semi-supervised DA.
arXiv Detail & Related papers (2020-11-30T04:21:35Z)
- Pushing the Limits of AMR Parsing with Self-Learning [24.998016423211375]
We show how trained models can be applied to improve AMR parsing performance.
We show that, without any additional human annotations, these techniques improve an already performant parser and achieve state-of-the-art results.
arXiv Detail & Related papers (2020-10-20T23:45:04Z)
- Domain2Vec: Domain Embedding for Unsupervised Domain Adaptation [56.94873619509414]
Conventional unsupervised domain adaptation studies the knowledge transfer between a limited number of domains.
We propose a novel Domain2Vec model to provide vectorial representations of visual domains based on joint learning of feature disentanglement and Gram matrix.
We demonstrate that our embedding is capable of predicting domain similarities that match our intuition about visual relations between different domains.
arXiv Detail & Related papers (2020-07-17T22:05:09Z)
- Normalizing Compositional Structures Across Graphbanks [67.7047900945161]
We present a methodology for normalizing discrepancies between MRs at the compositional level.
Our work significantly increases the match in compositional structure between MRs and improves multi-task learning.
arXiv Detail & Related papers (2020-04-29T14:35:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.