Comprehensive Event Representations using Event Knowledge Graphs and
Natural Language Processing
- URL: http://arxiv.org/abs/2303.04794v1
- Date: Wed, 8 Mar 2023 18:43:39 GMT
- Authors: Tin Kuculo
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent work has utilised knowledge-aware approaches to natural language
understanding, question answering, recommendation systems, and other tasks.
These approaches rely on well-constructed and large-scale knowledge graphs that
can be useful for many downstream applications and empower knowledge-aware
models with commonsense reasoning. Such knowledge graphs are constructed
through knowledge acquisition tasks such as relation extraction and knowledge
graph completion. This work seeks to utilise and build on the growing body of
work that uses findings from the field of natural language processing (NLP) to
extract knowledge from text and build knowledge graphs. The focus of this
research project is on how we can use transformer-based approaches to extract
and contextualise event information, matching it to existing ontologies, to
build comprehensive knowledge-graph-based event representations.
Specifically, sub-event extraction is used as a way of creating sub-event-aware
event representations. These event representations are then further enriched
through fine-grained location extraction and contextualised through the
alignment of historically relevant quotes.
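The sub-event-aware event representations described above can be illustrated with a minimal sketch. The `Event` and `Quote` classes, their field names, and the example data are illustrative assumptions for this summary, not the thesis's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Quote:
    """A historically relevant quote aligned to an event."""
    text: str
    speaker: str

@dataclass
class Event:
    """A node in an event knowledge graph."""
    name: str
    sub_events: list = field(default_factory=list)   # sub-event-aware structure
    locations: list = field(default_factory=list)    # fine-grained locations
    quotes: list = field(default_factory=list)       # aligned quotes

    def all_locations(self):
        """Collect locations from this event and, recursively, its sub-events."""
        locs = list(self.locations)
        for sub in self.sub_events:
            locs.extend(sub.all_locations())
        return locs

# Illustrative example: a top-level event enriched through a sub-event,
# fine-grained locations, and an aligned quote.
landing = Event("Apollo 11 Moon landing",
                locations=["Sea of Tranquility"],
                quotes=[Quote("That's one small step for man...",
                              "Neil Armstrong")])
mission = Event("Apollo 11 mission", sub_events=[landing],
                locations=["Kennedy Space Center"])
print(mission.all_locations())  # → ['Kennedy Space Center', 'Sea of Tranquility']
```

The nesting of `sub_events` is what makes the representation sub-event-aware: querying the top-level event can recover information attached only to its parts.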
Related papers
- Knowledge Graphs and Pre-trained Language Models enhanced Representation Learning for Conversational Recommender Systems
We introduce the Knowledge-Enhanced Entity Representation Learning (KERL) framework, which uses a knowledge graph and a pre-trained language model to improve the semantic understanding of entities in conversational recommender systems.
KERL achieves state-of-the-art results in both recommendation and response generation tasks.
arXiv Detail & Related papers (2023-12-18T06:41:23Z)
- Exploring In-Context Learning Capabilities of Foundation Models for Generating Knowledge Graphs from Text
This paper aims to improve the state of the art of automatic construction and completion of knowledge graphs from text.
In this context, one emerging paradigm is in-context learning, where a language model is used as-is, guided only by a prompt.
arXiv Detail & Related papers (2023-05-15T17:10:19Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- FabKG: A Knowledge graph of Manufacturing Science domain utilizing structured and unconventional unstructured knowledge source
We develop knowledge graphs based upon entity and relation data for both commercial and educational uses.
We propose a novel crowdsourcing method for KG creation by leveraging student notes.
We have created a knowledge graph containing 65000+ triples using all data sources.
arXiv Detail & Related papers (2022-05-24T02:32:04Z)
- TegTok: Augmenting Text Generation via Task-specific and Open-world Knowledge
We propose augmenting TExt Generation via Task-specific and Open-world Knowledge (TegTok) in a unified framework.
Our model selects knowledge entries from two types of knowledge sources through dense retrieval and then injects them into the input encoding and output decoding stages respectively.
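The knowledge-selection step described above can be sketched generically. This is a toy sketch of dense retrieval only: the `embed` function below is a bag-of-words stand-in for a learned dense encoder, and all names and data are illustrative, not TegTok's actual implementation:

```python
import math

def embed(text):
    # Toy stand-in for a learned dense encoder: a bag-of-words count vector.
    # A real retriever would use a trained neural text encoder.
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(c * b.get(w, 0) for w, c in a.items())
    na = math.sqrt(sum(c * c for c in a.values()))
    nb = math.sqrt(sum(c * c for c in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, knowledge_entries, top_k=2):
    """Rank knowledge entries by embedding similarity to the query."""
    q = embed(query)
    return sorted(knowledge_entries,
                  key=lambda e: cosine(q, embed(e)), reverse=True)[:top_k]

entries = ["knowledge graphs store facts as triples",
           "dense retrieval ranks entries by embedding similarity",
           "bananas are yellow"]
top = retrieve("retrieve knowledge by embedding similarity", entries)
print(top[0])  # → dense retrieval ranks entries by embedding similarity
```

In the full model, the retrieved entries would then be injected into the input encoding and output decoding stages respectively.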
arXiv Detail & Related papers (2022-03-16T10:37:59Z)
- Knowledge Graph Enhanced Event Extraction in Financial Documents
We propose the first event extraction framework that embeds a knowledge graph through a graph neural network.
For extracting events from Chinese financial announcements, our method outperforms the state-of-the-art method by 5.3% in F1-score.
arXiv Detail & Related papers (2021-09-06T16:35:15Z)
- CoLAKE: Contextualized Language and Knowledge Embedding
We propose Contextualized Language and Knowledge Embedding (CoLAKE), which jointly learns contextualized representations for both language and knowledge under an extended training objective.
We conduct experiments on knowledge-driven tasks, knowledge probing tasks, and language understanding tasks.
arXiv Detail & Related papers (2020-10-01T11:39:32Z)
- ENT-DESC: Entity Description Generation by Exploring Knowledge Graph
In practice, the input knowledge is often more than sufficient, since the output description may need to cover only the most significant facts.
We introduce a large-scale and challenging dataset to facilitate the study of such a practical scenario in KG-to-text.
We propose a multi-graph structure that is able to represent the original graph information more comprehensively.
arXiv Detail & Related papers (2020-04-30T14:16:19Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
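The entity masking scheme mentioned in the last entry can be sketched generically: instead of masking random subword tokens, whole entity mentions are masked. This toy sketch matches single-token mentions against a fixed entity set; in a real system the entities would come from linking the text against a knowledge graph, and all names here are illustrative:

```python
import random

MASK = "[MASK]"

def entity_mask(tokens, entities, mask_prob=1.0, seed=0):
    """Mask whole entity mentions rather than random subword tokens.

    `entities` is a set of known entity surface forms; `mask_prob`
    controls how often a recognised entity is masked.
    """
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        if tok in entities and rng.random() < mask_prob:
            out.append(MASK)
        else:
            out.append(tok)  # non-entity tokens are left intact
    return out

tokens = ["Einstein", "developed", "relativity", "in", "Bern"]
entities = {"Einstein", "Bern"}
print(entity_mask(tokens, entities))
# → ['[MASK]', 'developed', 'relativity', 'in', '[MASK]']
```

Predicting the masked entities forces the model to use the surrounding context plus structured knowledge, which is why the approach needs the knowledge graph only during pre-training.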
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.