Knowledge Graph informed Fake News Classification via Heterogeneous Representation Ensembles
- URL: http://arxiv.org/abs/2110.10457v1
- Date: Wed, 20 Oct 2021 09:41:14 GMT
- Title: Knowledge Graph informed Fake News Classification via Heterogeneous Representation Ensembles
- Authors: Boshko Koloski, Timen Stepišnik-Perdih, Marko Robnik-Šikonja, Senja Pollak, Blaž Škrlj
- Abstract summary: We show how different document representations can be used for efficient fake news identification.
One of the key contributions is a set of novel document representation learning methods based solely on knowledge graphs.
We demonstrate that knowledge graph-based representations achieve performance competitive with conventionally accepted representation learners.
- Score: 1.8374319565577157
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Increasing amounts of freely available data, in both textual and relational form, enable the exploration of richer document representations, potentially improving model performance and robustness. An emerging problem of the modern era is fake news detection: many easily accessible pieces of information are not necessarily factually correct and can lead to wrong conclusions or be used for manipulation. In this work, we explore how different document representations, ranging from simple symbolic bag-of-words to contextual, neural language-model-based ones, can be used for efficient fake news identification. One of the key contributions is a set of novel document representation learning methods based solely on knowledge graphs, i.e., extensive collections of (grounded) subject-predicate-object triplets. We demonstrate that knowledge graph-based representations achieve performance competitive with conventionally accepted representation learners. Furthermore, when combined with existing contextual representations, knowledge graph-based document representations can achieve state-of-the-art performance. To our knowledge, this is the first large-scale evaluation of how knowledge graph-based representations can be systematically incorporated into the process of fake news classification.
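The approach described above combines heterogeneous document representations before classification. Below is a minimal sketch of that general idea in Python; the toy corpus, labels, dimensions, and the random matrices standing in for the contextual and knowledge graph-based embeddings are illustrative placeholders, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus; labels: 1 = fake, 0 = real (illustrative only).
docs = [
    "miracle cure discovered, doctors hate it",
    "central bank raises interest rates by 25 basis points",
    "celebrity secretly replaced by clone, insiders claim",
    "city council approves new public transport budget",
]
labels = np.array([1, 0, 1, 0])

# (i) Symbolic representation: TF-IDF bag-of-words.
bow = TfidfVectorizer().fit_transform(docs).toarray()

# (ii) Contextual representation: placeholder for a neural language-model
# embedding of each document (random vectors stand in here).
rng = np.random.default_rng(0)
contextual = rng.normal(size=(len(docs), 32))

# (iii) Knowledge graph-based representation: placeholder for an aggregate of
# embeddings of (subject, predicate, object) triplets grounded in each document.
kg_based = rng.normal(size=(len(docs), 16))

# Ensemble by concatenating the heterogeneous representations,
# then train a single linear classifier on the joint feature space.
features = np.hstack([bow, contextual, kg_based])
clf = LogisticRegression(max_iter=1000).fit(features, labels)
print(clf.predict(features))
```

In the paper's setting, the placeholder matrices would be replaced by real outputs of a neural language model and by document embeddings derived from grounded subject-predicate-object triplets.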
Related papers
- From Latent to Lucid: Transforming Knowledge Graph Embeddings into Interpretable Structures [2.6451388057494283]
This paper introduces a post-hoc explainable AI method tailored for Knowledge Graph Embedding models.
Our approach directly decodes the latent representations encoded by Knowledge Graph Embedding models.
By identifying distinct structures within the subgraph neighborhoods of similarly embedded entities, our method translates these insights into human-understandable symbolic rules and facts.
arXiv Detail & Related papers (2024-06-03T19:54:11Z)
- Rule-Guided Joint Embedding Learning over Knowledge Graphs [6.831227021234669]
This paper introduces a novel model that incorporates both contextual and literal information into entity and relation embeddings.
For contextual information, we assess its significance through confidence and relatedness metrics.
We validate our model performance with thorough experiments on two established benchmark datasets.
arXiv Detail & Related papers (2023-12-01T19:58:31Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- Integrating Knowledge Graph embedding and pretrained Language Models in Hypercomplex Spaces [29.71047521165222]
We build on existing strong representations of single modalities and use hypercomplex algebra to represent both (i) single-modality embeddings and (ii) the interactions between different modalities.
More specifically, we suggest Dihedron and Quaternion representations of 4D hypercomplex numbers to integrate four modalities namely structural knowledge graph embedding, word-level representations and document-level representations.
Our unified vector representation scores the plausibility of labelled edges via Hamilton and Dihedron products, thus modeling pairwise interactions between different modalities.
arXiv Detail & Related papers (2022-08-04T16:18:16Z)
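The entry above scores labelled edges with Hamilton and Dihedron products of 4D hypercomplex numbers. As a reference point, here is a hedged sketch of the standard quaternion (Hamilton) product with a simple inner-product edge score; the shapes and the scoring form are illustrative and need not match the paper's exact formulation.

```python
import numpy as np

def hamilton(q1, q2):
    """Hamilton product of batched quaternions stored as (..., 4) arrays."""
    a1, b1, c1, d1 = np.moveaxis(q1, -1, 0)
    a2, b2, c2, d2 = np.moveaxis(q2, -1, 0)
    return np.stack([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,
    ], axis=-1)

# Illustrative plausibility score for a labelled edge (h, r, t):
# rotate the head by the relation, then take an inner product with the tail.
rng = np.random.default_rng(0)
h, r, t = rng.normal(size=(3, 8, 4))  # 8 quaternion components each (placeholder)
score = np.sum(hamilton(h, r) * t)
print(score)
```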
- Sparse Structure Learning via Graph Neural Networks for Inductive Document Classification [2.064612766965483]
We propose a novel GNN-based sparse structure learning model for inductive document classification.
Our model collects a set of trainable edges connecting disjoint words between sentences and employs structure learning to sparsely select edges with dynamic contextual dependencies.
Experiments on several real-world datasets demonstrate that the proposed model outperforms most state-of-the-art methods.
arXiv Detail & Related papers (2021-12-13T02:36:04Z)
- Integrating Semantics and Neighborhood Information with Graph-Driven Generative Models for Document Retrieval [51.823187647843945]
In this paper, we encode the neighborhood information with a graph-induced Gaussian distribution, and propose to integrate the two types of information with a graph-driven generative model.
Under the approximation, we prove that the training objective can be decomposed into terms involving only singleton or pairwise documents, enabling the model to be trained as efficiently as uncorrelated ones.
arXiv Detail & Related papers (2021-05-27T11:29:03Z)
- Be More with Less: Hypergraph Attention Networks for Inductive Text Classification [56.98218530073927]
Graph neural networks (GNNs) have received increasing attention in the research community and have demonstrated promising results on this canonical task.
Despite this success, their performance can be largely jeopardized in practice, since they are unable to capture high-order interactions between words.
We propose a principled model, hypergraph attention networks (HyperGAT), which obtains more expressive power with less computational cost for text representation learning.
arXiv Detail & Related papers (2020-11-01T00:21:59Z)
- Leveraging Graph to Improve Abstractive Multi-Document Summarization [50.62418656177642]
We develop a neural abstractive multi-document summarization (MDS) model which can leverage well-known graph representations of documents.
Our model utilizes graphs to encode documents in order to capture cross-document relations, which is crucial to summarizing long documents.
Our model can also take advantage of graphs to guide the summary generation process, which is beneficial for generating coherent and concise summaries.
arXiv Detail & Related papers (2020-05-20T13:39:47Z)
- Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, motivating their automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings; textual encoding approaches, e.g., KG-BERT, resort to the text of graph triples and triple-level contextualized representations.
arXiv Detail & Related papers (2020-04-30T13:50:34Z)
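The entry above contrasts graph embedding approaches such as TransE with textual encoders such as KG-BERT. For concreteness, a minimal sketch of the classic TransE scoring idea follows; the embedding dimension and the random vectors are placeholders.

```python
import numpy as np

# TransE intuition: a triple (head, relation, tail) is plausible when the
# translated head lands near the tail, i.e. h + r ≈ t.
rng = np.random.default_rng(0)
h, r, t = rng.normal(size=(3, 50))  # 50-dimensional embeddings (placeholder)
score = -np.linalg.norm(h + r - t)  # higher (less negative) = more plausible
print(score)
```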
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
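The last entry mentions an entity masking scheme built on entity-level masked language models. A toy sketch of the general idea, masking whole knowledge-graph-linked entity spans instead of individual random tokens, is given below; the sentence and the span annotations are invented for illustration, and the paper's actual scheme may differ.

```python
# Mask whole entity mentions rather than random subword tokens.
# The token spans below stand in for entity links produced by a KG linker.
text = "Marie Curie won the Nobel Prize in Physics in 1903 ."
tokens = text.split()
entity_spans = [(0, 2), (4, 8)]  # (start, end) token spans of linked entities

masked = list(tokens)
for start, end in entity_spans:
    for i in range(start, end):
        masked[i] = "[MASK]"
print(" ".join(masked))
# -> "[MASK] [MASK] won the [MASK] [MASK] [MASK] [MASK] in 1903 ."
```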
This list is automatically generated from the titles and abstracts of the papers on this site.