Exploring the Suitability of Semantic Spaces as Word Association Models
for the Extraction of Semantic Relationships
- URL: http://arxiv.org/abs/2004.14265v1
- Date: Wed, 29 Apr 2020 15:25:28 GMT
- Title: Exploring the Suitability of Semantic Spaces as Word Association Models
for the Extraction of Semantic Relationships
- Authors: Epaminondas Kapetanios, Vijayan Sugumaran, and Anastassia Angelopoulou
- Abstract summary: We propose the novel idea of using classical semantic spaces and models, e.g., word embeddings, for extracting word associations.
The goal is to use these word association models to reinforce current relation extraction approaches.
- Score: 1.8352113484137629
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Given the recent advances and progress in Natural Language Processing (NLP),
extraction of semantic relationships has been at the top of the research agenda
in the last few years. This work has been mainly motivated by the fact that
building knowledge graphs (KG) and bases (KB), as a key ingredient of
intelligent applications, is a never-ending challenge, since new knowledge
needs to be harvested while old knowledge needs to be revised. Currently,
approaches towards relation extraction from text are dominated by neural models
practicing some sort of distant (weak) supervision in machine learning from
large corpora, with or without consulting external knowledge sources. In this
paper, we empirically study and explore the potential of a novel idea of using
classical semantic spaces and models, e.g., Word Embedding, generated for
extracting word association, in conjunction with relation extraction
approaches. The goal is to use these word association models to reinforce
current relation extraction approaches. We believe that this is a first attempt
of this kind and the results of the study should shed some light on the extent
to which these word association models can be used as well as the most
promising types of relationships to be considered for extraction.
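The core idea of the abstract can be sketched as follows: a word association model derived from a semantic space scores candidate entity pairs by embedding similarity, and that score reinforces a relation extractor's confidence. A minimal, hypothetical illustration (the toy vectors, the `reinforced_score` helper, and the linear mixing weight are assumptions for demonstration, not the paper's actual method; any pre-trained word embedding model could stand in for the toy table):

```python
import math

# Toy word embeddings standing in for a pre-trained semantic space
# (e.g., word2vec or GloVe vectors); the values are illustrative only.
EMBEDDINGS = {
    "paris":  [0.9, 0.1, 0.3],
    "france": [0.8, 0.2, 0.4],
    "banana": [0.1, 0.9, 0.0],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def association(w1, w2):
    """Word association strength read off the semantic space."""
    return cosine(EMBEDDINGS[w1], EMBEDDINGS[w2])

def reinforced_score(extractor_conf, w1, w2, weight=0.5):
    """Blend a relation extractor's confidence with the
    embedding-based association score (simple linear mixing)."""
    return (1 - weight) * extractor_conf + weight * association(w1, w2)

# A strongly associated pair ("paris", "france") lifts a weak
# extractor signal more than an unrelated pair ("paris", "banana").
print(reinforced_score(0.6, "paris", "france"))
print(reinforced_score(0.6, "paris", "banana"))
```

The mixing step is the simplest possible reinforcement scheme; the study's question of *which* relationship types benefit would correspond to varying the pair vocabulary and inspecting where the association signal actually separates related from unrelated pairs.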
Related papers
- Ontological Relations from Word Embeddings [2.384873896423002]
It has been reliably shown that the similarity of word embeddings obtained from popular neural models such as BERT effectively approximates a form of semantic similarity between the meanings of those words.
We show that a simple feed-forward architecture on top of those embeddings can achieve promising accuracies, with varying generalisation abilities depending on the input data.
arXiv Detail & Related papers (2024-08-01T10:31:32Z) - Retrieval-Enhanced Machine Learning: Synthesis and Opportunities [60.34182805429511]
Retrieval enhancement can be extended to a broader spectrum of machine learning (ML) problems.
This work introduces a formal framework for this paradigm, Retrieval-Enhanced Machine Learning (REML), by synthesizing the literature across various domains of ML with consistent notation, which is missing from the current literature.
The goal of this work is to equip researchers across various disciplines with a comprehensive, formally structured framework of retrieval-enhanced models, thereby fostering interdisciplinary future research.
arXiv Detail & Related papers (2024-07-17T20:01:21Z) - Comparative Analysis of Contextual Relation Extraction based on Deep
Learning Models [0.0]
An efficient and accurate CRE system is essential for creating domain knowledge in the biomedical industry.
Deep learning techniques have been used to identify the appropriate semantic relation based on the context from multiple sentences.
This paper explores the analysis of various deep learning models that are used for relation extraction.
arXiv Detail & Related papers (2023-09-13T09:05:09Z) - Commonsense Knowledge Transfer for Pre-trained Language Models [83.01121484432801]
We introduce commonsense knowledge transfer, a framework to transfer the commonsense knowledge stored in a neural commonsense knowledge model to a general-purpose pre-trained language model.
It first exploits general texts to form queries for extracting commonsense knowledge from the neural commonsense knowledge model.
It then refines the language model with two self-supervised objectives: commonsense mask infilling and commonsense relation prediction.
arXiv Detail & Related papers (2023-06-04T15:44:51Z) - Multimodal Relation Extraction with Cross-Modal Retrieval and Synthesis [89.04041100520881]
This research proposes to retrieve textual and visual evidence based on the object, sentence, and whole image.
We develop a novel approach to synthesize the object-level, image-level, and sentence-level information for better reasoning between the same and different modalities.
arXiv Detail & Related papers (2023-05-25T15:26:13Z) - Learning Attention-based Representations from Multiple Patterns for
Relation Prediction in Knowledge Graphs [2.4028383570062606]
AEMP is a novel model for learning contextualized representations by acquiring entities' context information.
AEMP either outperforms or competes with state-of-the-art relation prediction methods.
arXiv Detail & Related papers (2022-06-07T10:53:35Z) - Learning Relation Prototype from Unlabeled Texts for Long-tail Relation
Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z) - Introducing Syntactic Structures into Target Opinion Word Extraction
with Deep Learning [89.64620296557177]
We propose to incorporate the syntactic structures of the sentences into the deep learning models for targeted opinion word extraction.
We also introduce a novel regularization technique to improve the performance of the deep learning models.
The proposed model is extensively analyzed and achieves the state-of-the-art performance on four benchmark datasets.
arXiv Detail & Related papers (2020-10-26T07:13:17Z) - Extracting Semantic Concepts and Relations from Scientific Publications
by Using Deep Learning [0.0]
The aim of this paper is to propose an approach for automatically extracting semantic concepts and relations from scientific publications.
This paper suggests new types of semantic relations and proposes using deep learning (DL) models for semantic relation extraction.
arXiv Detail & Related papers (2020-09-01T10:19:18Z) - A Dependency Syntactic Knowledge Augmented Interactive Architecture for
End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model is capable of fully exploiting the syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-04T14:59:32Z) - Distributional semantic modeling: a revised technique to train term/word
vector space models applying the ontology-related approach [36.248702416150124]
We design a new technique for distributional semantic modeling with a neural network-based approach to learning distributed term representations (or term embeddings).
Vec2graph is a Python library for visualizing word embeddings (term embeddings in our case) as dynamic and interactive graphs.
arXiv Detail & Related papers (2020-03-06T18:27:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.