Learning Informative Representations of Biomedical Relations with Latent
Variable Models
- URL: http://arxiv.org/abs/2011.10285v1
- Date: Fri, 20 Nov 2020 08:56:31 GMT
- Authors: Harshil Shah and Julien Fauqueur
- Abstract summary: We propose a latent variable model with an arbitrarily flexible distribution to represent the relation between an entity pair.
We demonstrate that our model achieves results competitive with strong baselines for both tasks while having fewer parameters and being significantly faster to train.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Extracting biomedical relations from large corpora of scientific documents is
a challenging natural language processing task. Existing approaches usually
focus on identifying a relation either in a single sentence (mention-level) or
across an entire corpus (pair-level). In both cases, recent methods have
achieved strong results by learning a point estimate to represent the relation;
this is then used as the input to a relation classifier. However, the relation
expressed in text between a pair of biomedical entities is often more complex
than can be captured by a point estimate. To address this issue, we propose a
latent variable model with an arbitrarily flexible distribution to represent
the relation between an entity pair. Additionally, our model provides a unified
architecture for both mention-level and pair-level relation extraction. We
demonstrate that our model achieves results competitive with strong baselines
for both tasks while having fewer parameters and being significantly faster to
train. We make our code publicly available.
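As a rough illustration of the core idea above (representing the relation between an entity pair as a flexible distribution over latent vectors, rather than a point estimate), here is a minimal toy sketch using the reparameterization trick. The `encode` function, feature values, and classifier weights are illustrative stand-ins, not the paper's actual architecture.

```python
import math
import random

random.seed(0)

def encode(features):
    # Hypothetical encoder: in a real model this would be a neural network
    # producing the parameters of the latent relation distribution.
    mu = [f * 0.5 for f in features]
    log_var = [-1.0 for _ in features]
    return mu, log_var

def sample_relation(mu, log_var):
    # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, 1),
    # so the relation representation is a sample, not a point estimate.
    return [m + math.exp(0.5 * lv) * random.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

def classify(z, class_weights):
    # Linear scores over the sampled relation representation,
    # followed by a softmax over relation labels.
    scores = [sum(w * zi for w, zi in zip(row, z)) for row in class_weights]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy mention features for an entity pair, and two candidate relation labels.
features = [0.2, -0.4, 1.0]
weights = [[0.1, 0.3, -0.2], [-0.5, 0.2, 0.4]]
mu, log_var = encode(features)
z = sample_relation(mu, log_var)
probs = classify(z, weights)
```

In a trained model, averaging the classifier output over several samples of `z` would reflect the uncertainty the distribution captures, which a single point estimate cannot.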
Related papers
- Entity or Relation Embeddings? An Analysis of Encoding Strategies for Relation Extraction [19.019881161010474]
Relation extraction is essentially a text classification problem, which can be tackled by fine-tuning a pre-trained language model (LM).
Existing approaches therefore solve the problem in an indirect way: they fine-tune an LM to learn embeddings of the head and tail entities, and then predict the relationship from these entity embeddings.
Our hypothesis in this paper is that relation extraction models can be improved by capturing relationships in a more direct way.
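The indirect strategy described above can be sketched in a few lines: pool the token vectors at the head and tail entity positions, concatenate them, and classify the relation from the resulting pair embedding. The token vectors and classifier weights below are toy stand-ins for a fine-tuned LM's outputs.

```python
def entity_pair_embedding(token_vecs, head_idx, tail_idx):
    # Concatenate the head- and tail-entity token vectors
    # into one pair representation.
    return token_vecs[head_idx] + token_vecs[tail_idx]

def predict_relation(pair_vec, weight_rows):
    # Score each relation label with a dot product and take the argmax.
    scores = [sum(w * x for w, x in zip(row, pair_vec)) for row in weight_rows]
    return max(range(len(scores)), key=scores.__getitem__)

# Toy contextual token vectors for a four-token sentence.
tokens = [[0.1, 0.2], [0.9, -0.3], [0.0, 0.5], [0.4, 0.4]]
pair = entity_pair_embedding(tokens, head_idx=1, tail_idx=3)  # -> [0.9, -0.3, 0.4, 0.4]
label = predict_relation(pair, [[1, 0, 0, 0], [0, 0, 1, 1]])  # -> 0
```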
arXiv Detail & Related papers (2023-12-18T09:58:19Z)
- Comparative Analysis of Contextual Relation Extraction based on Deep Learning Models [0.0]
An efficient and accurate contextual relation extraction (CRE) system is essential for creating domain knowledge in the biomedical industry.
Deep learning techniques have been used to identify the appropriate semantic relation based on the context from multiple sentences.
This paper explores the analysis of various deep learning models that are used for relation extraction.
arXiv Detail & Related papers (2023-09-13T09:05:09Z)
- More than Classification: A Unified Framework for Event Temporal Relation Extraction [61.44799147458621]
Event temporal relation extraction (ETRE) is usually formulated as a multi-label classification task.
We observe that all relations can be interpreted using the start and end time points of events.
We propose a unified event temporal relation extraction framework, which transforms temporal relations into logical expressions of time points.
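One way to read "relations as logical expressions of time points" is that each temporal relation between two events reduces to comparisons of their start and end points. The sketch below encodes events as `(start, end)` tuples; this mapping is an illustrative convention, not the paper's exact formulation.

```python
def temporal_relation(e1, e2):
    # Each relation label is a logical expression over the
    # start/end time points of the two events.
    s1, t1 = e1
    s2, t2 = e2
    if t1 < s2:
        return "BEFORE"   # e1 ends before e2 starts
    if t2 < s1:
        return "AFTER"    # e2 ends before e1 starts
    if s1 == s2 and t1 == t2:
        return "EQUAL"    # identical intervals
    return "OVERLAP"      # intervals share at least one time point

rel = temporal_relation((0, 2), (3, 5))  # -> "BEFORE"
```

Framing the labels this way makes them mutually consistent by construction, which a flat multi-label classifier does not guarantee.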
arXiv Detail & Related papers (2023-05-28T02:09:08Z)
- Relational Sentence Embedding for Flexible Semantic Matching [86.21393054423355]
We present Relational Sentence Embedding (RSE), a new paradigm for further exploring the potential of sentence embeddings.
RSE is effective and flexible in modeling sentence relations and outperforms a series of state-of-the-art embedding methods.
arXiv Detail & Related papers (2022-12-17T05:25:17Z)
- An Empirical Study on Relation Extraction in the Biomedical Domain [0.0]
We consider both sentence-level and document-level relation extraction, and run a few state-of-the-art methods on several benchmark datasets.
Our results show that (1) current document-level relation extraction methods have strong generalization ability; (2) existing methods require a large amount of labeled data for model fine-tuning in biomedicine.
arXiv Detail & Related papers (2021-12-11T03:36:38Z)
- Discovering Latent Representations of Relations for Interacting Systems [2.2844557930775484]
We propose the DiScovering Latent Relation (DSLR) model, which is flexibly applicable even if the number of relations is unknown or many types of relations exist.
The flexibility of our DSLR model comes from the design concept of our encoder that represents the relation between entities in a latent space.
The experiments show that the proposed method is suitable for analyzing dynamic graphs with an unknown number of complex relations.
arXiv Detail & Related papers (2021-11-10T03:32:09Z)
- Prototypical Representation Learning for Relation Extraction [56.501332067073065]
This paper aims to learn predictive, interpretable, and robust relation representations from distantly-labeled data.
We learn prototypes for each relation from contextual information to best explore the intrinsic semantics of relations.
Results on several relation learning tasks show that our model significantly outperforms the previous state-of-the-art relational models.
arXiv Detail & Related papers (2021-03-22T08:11:43Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z)
- Relation-Guided Representation Learning [53.60351496449232]
We propose a new representation learning method that explicitly models and leverages sample relations.
Our framework well preserves the relations between samples.
By seeking to embed samples into a subspace, we show that our method can address the large-scale and out-of-sample problems.
arXiv Detail & Related papers (2020-07-11T10:57:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.