Entity Context Graph: Learning Entity Representations from Semi-Structured Textual Sources on the Web
- URL: http://arxiv.org/abs/2103.15950v1
- Date: Mon, 29 Mar 2021 20:52:14 GMT
- Title: Entity Context Graph: Learning Entity Representations from Semi-Structured Textual Sources on the Web
- Authors: Kalpa Gunaratna, Yu Wang, Hongxia Jin
- Abstract summary: We propose an approach that processes entity-centric textual knowledge sources to learn entity embeddings.
We show that the embeddings learned from our approach are: (i) of high quality, comparable to known knowledge graph-based embeddings, and able to improve them further.
- Score: 44.92858943475407
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge is captured in the form of entities and their relationships and stored in knowledge graphs. Knowledge graphs enhance the capabilities of applications in many different areas, including Web search, recommendation, and natural language understanding, mainly because entities enable machines to understand things that go beyond simple tokens. Many modern algorithms use entity embeddings learned from these structured representations. However, building a knowledge graph takes time and effort, and is therefore costly and nontrivial. On the other hand, many Web sources describe entities in some structured format, so finding ways to turn them into useful entity knowledge is advantageous. We propose an approach that processes entity-centric textual knowledge sources to learn entity embeddings and, in turn, avoids the need for a traditional knowledge graph. We first extract triples into a new representation format that does not rely on traditional, complex triple extraction methods defined by pre-determined relationship labels. Then we learn entity embeddings from this new type of triples. We show that the embeddings learned with our approach are: (i) of high quality, comparable to known knowledge graph-based embeddings, and able to improve them further; (ii) better than contextual language model-based entity embeddings; and (iii) easy to compute and versatile in domain-specific applications where a knowledge graph is not readily available.
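The abstract describes a two-step pipeline: flatten entity-centric, semi-structured sources into label-free context triples, then learn embeddings over those triples. The snippet below is a minimal, hypothetical sketch of that pipeline; the `hasContext` relation, the infobox-style input, and the TransE-style margin loss are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: turn entity-centric, semi-structured blocks into label-free
# "context triples" and learn embeddings with a simple TransE-style margin loss.
import numpy as np

CONTEXT = "hasContext"  # single generic relation instead of curated relation labels

def extract_context_triples(entity, semi_structured_block):
    """Flatten an infobox-like dict into (entity, hasContext, value) triples."""
    return [(entity, CONTEXT, f"{key}:{value}") for key, value in semi_structured_block.items()]

def train_transe(triples, dim=16, epochs=200, lr=0.05, margin=1.0, seed=0):
    rng = np.random.default_rng(seed)
    nodes = sorted({t[0] for t in triples} | {t[2] for t in triples})
    idx = {n: i for i, n in enumerate(nodes)}
    E = rng.normal(scale=0.1, size=(len(nodes), dim))   # entity/value embeddings
    r = rng.normal(scale=0.1, size=dim)                 # single context relation vector
    for _ in range(epochs):
        for h, _, t in triples:
            h_i, t_i = idx[h], idx[t]
            neg_i = rng.integers(len(nodes))             # corrupt the tail at random
            pos = E[h_i] + r - E[t_i]
            neg = E[h_i] + r - E[neg_i]
            if margin + np.linalg.norm(pos) - np.linalg.norm(neg) > 0:
                # hinge loss is active: push the positive pair closer than the negative one
                g_pos = pos / (np.linalg.norm(pos) + 1e-9)
                g_neg = neg / (np.linalg.norm(neg) + 1e-9)
                E[h_i] -= lr * (g_pos - g_neg)
                E[t_i] += lr * g_pos
                E[neg_i] -= lr * g_neg
                r -= lr * (g_pos - g_neg)
    return {n: E[idx[n]] for n in nodes}

# toy usage on two entity-centric blocks
triples = (extract_context_triples("Marie_Curie", {"field": "physics", "award": "Nobel Prize"})
           + extract_context_triples("Albert_Einstein", {"field": "physics", "award": "Nobel Prize"}))
emb = train_transe(triples)
print(emb["Marie_Curie"][:4])
```

In this toy run the two entities share their context values, so the margin loss pulls their embeddings toward each other, which is the kind of signal the paper exploits without any curated relation labels.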
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
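As a hedged illustration of the subword idea summarized above: embeddings are composed from a shared table of subword units rather than stored per entity or relation, so parameters are reused across names (weight tying). The greedy longest-match segmenter and the toy vocabulary below stand in for a real byte-pair encoding tokenizer, and the mean-pooling composition is an assumption.

```python
# Illustrative only: compose entity/relation embeddings from shared subword units.
import numpy as np

SUBWORDS = ["berlin", "paris", "capital", "of", "germany", "france", "_"]  # toy stand-in for BPE merges
rng = np.random.default_rng(0)
table = {s: rng.normal(scale=0.1, size=8) for s in SUBWORDS}  # shared subword embedding table

def segment(name):
    """Greedy longest-match segmentation into known subword units (stand-in for BPE)."""
    name, pieces = name.lower(), []
    while name:
        match = next((s for s in sorted(SUBWORDS, key=len, reverse=True) if name.startswith(s)), None)
        if match is None:        # unknown character: skip it (a real tokenizer would back off to bytes)
            name = name[1:]
            continue
        pieces.append(match)
        name = name[len(match):]
    return pieces

def embed(name):
    """Entity or relation embedding = mean of its subword embeddings (weight tying)."""
    pieces = segment(name) or ["_"]
    return np.mean([table[p] for p in pieces], axis=0)

# the same subword rows serve every entity and relation that mentions them
print(segment("capital_of_germany"))                      # ['capital', '_', 'of', '_', 'germany']
print(embed("Berlin").shape, embed("capital_of_France").shape)
```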
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Universal Knowledge Graph Embeddings [4.322134229203427]
We propose to learn universal knowledge graph embeddings from large-scale knowledge sources.
We instantiate our idea by computing universal embeddings based on DBpedia and Wikidata for about 180 million entities, 15 thousand relations, and 1.2 billion triples.
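The summary does not say how the two sources are fused. One hypothetical preprocessing step, assuming sameAs-style alignment links between Wikidata and DBpedia identifiers, could merge their triples into a single training set before any embedding model is run:

```python
# Hypothetical fusion of two knowledge sources via made-up alignment links.
dbpedia = [("dbr:Berlin", "dbo:capitalOf", "dbr:Germany")]
wikidata = [("wd:Q64", "wdt:P1376", "wd:Q183")]
same_as = {"wd:Q64": "dbr:Berlin", "wd:Q183": "dbr:Germany"}   # illustrative alignment links

def canonical(node):
    """Map a Wikidata identifier onto its aligned DBpedia identifier when one exists."""
    return same_as.get(node, node)

universal_triples = dbpedia + [(canonical(h), r, canonical(t)) for h, r, t in wikidata]
print(universal_triples)
# [('dbr:Berlin', 'dbo:capitalOf', 'dbr:Germany'), ('dbr:Berlin', 'wdt:P1376', 'dbr:Germany')]
```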
arXiv Detail & Related papers (2023-10-23T13:07:46Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- One-shot Scene Graph Generation [130.57405850346836]
We propose Multiple Structured Knowledge (Relational Knowledge and Commonsense Knowledge) for the one-shot scene graph generation task.
Our method outperforms existing state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2022-02-22T11:32:59Z)
- Taxonomy Enrichment with Text and Graph Vector Representations [61.814256012166794]
We address the problem of taxonomy enrichment, which aims at adding new words to an existing taxonomy.
We present a new method that achieves strong results on this task with little effort.
We achieve state-of-the-art results across different datasets and provide an in-depth error analysis of mistakes.
arXiv Detail & Related papers (2022-01-21T09:01:12Z)
- KI-BERT: Infusing Knowledge Context for Better Language and Domain Understanding [0.0]
We propose a technique to infuse knowledge context from knowledge graphs for conceptual and ambiguous entities into models based on transformer architecture.
Our novel technique projects knowledge graph embeddings into a homogeneous vector space, introduces new token types for entities, aligns entity position ids, and adds a selective attention mechanism.
We take BERT as a baseline model and implement "Knowledge-Infused BERT" by infusing knowledge context from ConceptNet and WordNet.
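A schematic of how the components named above could fit together; the projection layer, the token-type id 2, the reused position ids, and the selective-attention rule (an entity token attends only to itself and its aligned word tokens) are illustrative guesses, not the KI-BERT code.

```python
# Schematic only: append projected knowledge tokens to a transformer's input.
import torch

hidden, kg_dim, seq_len = 32, 16, 6                     # toy sizes
word_embeddings = torch.randn(seq_len, hidden)          # output of the usual embedding layer
concept_vectors = torch.randn(2, kg_dim)                # pretrained KG embeddings for 2 ambiguous entities
align = {0: [1], 1: [3, 4]}                             # entity k is aligned to these word positions

project = torch.nn.Linear(kg_dim, hidden)               # map KG space into the model's vector space
entity_embeddings = project(concept_vectors)            # (2, hidden)

inputs = torch.cat([word_embeddings, entity_embeddings], dim=0)     # words first, then entity tokens
token_type_ids = torch.tensor([0] * seq_len + [2] * 2)              # new token type marks knowledge tokens
position_ids = torch.tensor(list(range(seq_len)) + [min(align[0]), min(align[1])])  # reuse aligned positions

# selective attention: word tokens attend to words; each entity token sees itself + its aligned words
n = inputs.size(0)
mask = torch.zeros(n, n, dtype=torch.bool)
mask[:seq_len, :seq_len] = True
for k, positions in align.items():
    row = seq_len + k
    mask[row, row] = True
    mask[row, positions] = True
    mask[positions, row] = True          # aligned words may also look at their knowledge token

print(inputs.shape, token_type_ids.shape, position_ids.shape, mask.shape)
```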
arXiv Detail & Related papers (2021-04-09T16:15:31Z)
- Learning semantic Image attributes using Image recognition and knowledge graph embeddings [0.3222802562733786]
We propose a shared learning approach to learn semantic attributes of images by combining a knowledge graph embedding model with the recognized attributes of images.
The proposed approach is a step towards bridging the gap between frameworks which learn from large amounts of data and frameworks which use a limited set of predicates to infer new knowledge.
arXiv Detail & Related papers (2020-09-12T15:18:48Z)
- ENT-DESC: Entity Description Generation by Exploring Knowledge Graph [53.03778194567752]
In practice, the input knowledge is often more than necessary, since the output description may cover only the most significant knowledge.
We introduce a large-scale and challenging dataset to facilitate the study of such a practical scenario in KG-to-text.
We propose a multi-graph structure that is able to represent the original graph information more comprehensively.
arXiv Detail & Related papers (2020-04-30T14:16:19Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
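A minimal sketch of an entity masking scheme of this kind, assuming a list of KG surface forms and a made-up mask rate; whole entity mentions are masked as a unit instead of random subwords.

```python
# Illustrative entity-level masking for pre-training data preparation.
import random

KG_ENTITIES = {"marie curie", "nobel prize", "paris"}   # surface forms known to the KG (made up)

def mask_entities(tokens, mask_rate=0.5, mask_token="[MASK]", seed=0):
    """Mask full entity spans (longest match against KG surface forms) as single units."""
    rng = random.Random(seed)
    out, i = list(tokens), 0
    while i < len(tokens):
        matched = None
        for span in (3, 2, 1):                                   # try longer spans first
            if " ".join(tokens[i:i + span]).lower() in KG_ENTITIES:
                matched = span
                break
        if matched and rng.random() < mask_rate:
            out[i:i + matched] = [mask_token] * matched          # mask the whole entity span
            i += matched
        else:
            i += matched or 1
    return out

print(mask_entities("Marie Curie received the Nobel Prize in Paris".split()))
```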
arXiv Detail & Related papers (2020-04-29T14:22:42Z)