Textbook to triples: Creating knowledge graph in the form of triples from AI TextBook
- URL: http://arxiv.org/abs/2111.10692v1
- Date: Sat, 20 Nov 2021 22:28:23 GMT
- Title: Textbook to triples: Creating knowledge graph in the form of triples from AI TextBook
- Authors: Aman Kumar, Swathi Dinakaran
- Abstract summary: This paper develops a system that converts the text of a given textbook into triples that can be visualized as a knowledge graph and used in further applications.
The initial assessment and evaluation gave promising results with an F1 score of 82%.
- Score: 0.8832969171530054
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: A knowledge graph is an essential and trending technology with great applications in entity recognition, search, and question answering. There are a plethora of methods in natural language processing for performing named entity recognition; however, very few can produce triples from domain-specific text. In this paper, an effort has been made towards developing a system that converts the text of a given textbook into triples that can be visualized as a knowledge graph and used in further applications. The initial assessment and evaluation gave promising results with an F1 score of 82%.
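The core task the abstract describes, turning textbook sentences into (subject, predicate, object) triples, can be illustrated with a minimal rule-based sketch. The paper's actual pipeline is not reproduced here; the regex pattern, function name, and example sentences below are assumptions for illustration only, covering just simple "X is a Y" statements:

```python
import re

# Toy pattern for sentences of the form "X is a/an Y".
# Real systems typically rely on dependency parsing or learned
# extractors; this heuristic only illustrates the triple format.
IS_A = re.compile(r"([A-Z][\w ]*?)\s+is\s+an?\s+([\w ]+?)[.,;]")

def extract_is_a_triples(text):
    """Return (subject, 'is_a', object) triples for simple
    definitional sentences found in the given text."""
    return [(subj.strip(), "is_a", obj.strip())
            for subj, obj in IS_A.findall(text)]

text = ("A knowledge graph is a structured representation of facts. "
        "An agent is a system that perceives its environment.")
triples = extract_is_a_triples(text)
# Each triple is one candidate edge of the knowledge graph.
```

A rule-based extractor like this recovers only a narrow class of relations; the visualization step would then treat each subject and object as a node and each predicate as a labeled edge.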
Related papers
- Graphusion: Leveraging Large Language Models for Scientific Knowledge Graph Fusion and Construction in NLP Education [14.368011453534596]
We introduce Graphusion, a zero-shot knowledge graph framework from free text.
The core fusion module provides a global view of triplets, incorporating entity merging, conflict resolution, and novel triplet discovery.
Our evaluation demonstrates that Graphusion surpasses supervised baselines by up to 10% in accuracy on link prediction.
arXiv Detail & Related papers (2024-07-15T15:13:49Z)
- Exploring Large Language Models for Knowledge Graph Completion [17.139056629060626]
We consider triples in knowledge graphs as text sequences and introduce an innovative framework called Knowledge Graph LLM.
Our technique employs entity and relation descriptions of a triple as prompts and utilizes the response for predictions.
Experiments on various benchmark knowledge graphs demonstrate that our method attains state-of-the-art performance in tasks such as triple classification and relation prediction.
arXiv Detail & Related papers (2023-08-26T16:51:17Z)
- Text-Augmented Open Knowledge Graph Completion via Pre-Trained Language Models [53.09723678623779]
We propose TAGREAL to automatically generate quality query prompts and retrieve support information from large text corpora.
The results show that TAGREAL achieves state-of-the-art performance on two benchmark datasets.
We find that TAGREAL has superb performance even with limited training data, outperforming existing embedding-based, graph-based, and PLM-based methods.
arXiv Detail & Related papers (2023-05-24T22:09:35Z)
- Informative Text Generation from Knowledge Triples [56.939571343797304]
We propose a novel memory augmented generator that employs a memory network to memorize the useful knowledge learned during the training.
We derive a dataset from WebNLG for our new setting and conduct extensive experiments to investigate the effectiveness of our model.
arXiv Detail & Related papers (2022-09-26T14:35:57Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- Entity Context Graph: Learning Entity Representations from Semi-Structured Textual Sources on the Web [44.92858943475407]
We propose an approach that processes entity centric textual knowledge sources to learn entity embeddings.
We show that the embeddings learned from our approach are high quality, comparable to known knowledge graph-based embeddings, and can be used to improve them further.
arXiv Detail & Related papers (2021-03-29T20:52:14Z)
- Commonsense Knowledge Mining from Term Definitions [0.20305676256390934]
We investigate a few machine learning approaches to mining commonsense knowledge triples using dictionary term definitions as inputs.
Our experiments show that term definitions contain some valid and novel commonsense knowledge triples for some semantic relations.
arXiv Detail & Related papers (2021-02-01T05:54:02Z)
- ENT-DESC: Entity Description Generation by Exploring Knowledge Graph [53.03778194567752]
In practice, the input knowledge could be more than enough, since the output description may only cover the most significant knowledge.
We introduce a large-scale and challenging dataset to facilitate the study of such a practical scenario in KG-to-text.
We propose a multi-graph structure that is able to represent the original graph information more comprehensively.
arXiv Detail & Related papers (2020-04-30T14:16:19Z)
- Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, motivating automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings.
Textual encoding approaches, e.g., KG-BERT, resort to graph triples' text and triple-level contextualized representations.
arXiv Detail & Related papers (2020-04-30T13:50:34Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- Knowledge-graph based Proactive Dialogue Generation with Improved Meta-Learning [0.0]
We propose a knowledge graph based proactive dialogue generation model (KgDg) with three components.
For knowledge triplets embedding and selection, we formulate it as a problem of sentence embedding to better capture semantic information.
Our improved MAML algorithm is capable of learning general features from a limited number of knowledge graphs.
arXiv Detail & Related papers (2020-04-19T08:41:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.