The Structure and Dynamics of Knowledge Graphs, with Superficiality
- URL: http://arxiv.org/abs/2305.08116v4
- Date: Mon, 10 Jun 2024 16:02:44 GMT
- Title: The Structure and Dynamics of Knowledge Graphs, with Superficiality
- Authors: Loïck Lhote, Béatrice Markhoff, Arnaud Soulet
- Abstract summary: This paper introduces the concept of superficiality, which controls the overlap between relationships whose facts are generated independently.
This yields the first model for the structure and dynamics of knowledge graphs.
- Score: 0.016385815610837167
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Large knowledge graphs combine human knowledge garnered from projects ranging from academia and institutions to enterprises and crowdsourcing. Within such graphs, each relationship between two nodes represents a basic fact involving these two entities. The diversity of the semantics of relationships constitutes the richness of knowledge graphs, leading to the emergence of singular topologies, sometimes chaotic in appearance. However, this complex characteristic can be modeled in a simple way by introducing the concept of superficiality, which controls the overlap between relationships whose facts are generated independently. With this model, superficiality also regulates the balance of the global distribution of knowledge by determining the proportion of misdescribed entities. This is the first model for the structure and dynamics of knowledge graphs. It leads to a better understanding of formal knowledge acquisition and organization.
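The abstract gives no equations, so the following is only a toy sketch of the stated idea, with all names and mechanics chosen for illustration: facts for each relationship are generated independently, and a single `superficiality` parameter (hypothetical name) controls how often a fact reuses an entity already described under another relationship rather than introducing a fresh one; entities left with very few facts are then counted as misdescribed.

```python
import random
from collections import defaultdict

def generate_kg(n_facts=10_000, n_relations=20, superficiality=0.3, seed=0):
    """Toy generator: facts per relationship are drawn independently;
    `superficiality` is the probability that a fact reuses an entity
    already seen anywhere in the graph (controls cross-relation overlap).
    This is an illustration of the abstract's idea, not the paper's model."""
    rng = random.Random(seed)
    next_entity = 0
    seen_by_rel = defaultdict(list)   # relation -> entities already used by it
    all_entities = []
    fact_count = defaultdict(int)     # entity -> number of facts mentioning it
    triples = []

    for _ in range(n_facts):
        r = rng.randrange(n_relations)
        pair = []
        for _ in range(2):            # pick subject and object
            if all_entities and rng.random() < superficiality:
                e = rng.choice(all_entities)        # overlap across relations
            elif seen_by_rel[r] and rng.random() < 0.8:
                e = rng.choice(seen_by_rel[r])      # preferential reuse in r
            else:
                e = next_entity                     # brand-new entity
                next_entity += 1
                all_entities.append(e)
            seen_by_rel[r].append(e)
            fact_count[e] += 1
            pair.append(e)
        triples.append((pair[0], r, pair[1]))

    misdescribed = sum(1 for c in fact_count.values() if c <= 1)
    return triples, misdescribed / len(fact_count)

triples, frac_misdescribed = generate_kg(superficiality=0.3)
print(f"{len(triples)} facts, misdescribed fraction = {frac_misdescribed:.2f}")
```

Varying the toy parameter shifts both the overlap between relationships and the fraction of sparsely described entities, which is the qualitative balance the abstract attributes to superficiality.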
Related papers
- Two-dimensional Taxonomy for N-ary Knowledge Representation Learning Methods [0.12289361708127876]
This survey provides a comprehensive review of methods handling n-ary relational data, covering both the knowledge hypergraph and hyper-relational knowledge graph literatures.
We propose a two-dimensional taxonomy: the first dimension categorises models based on their methodology, i.e., translation-based models, deep neural network-based models, logic rules-based models, and hyperedge expansion-based models.
The second dimension classifies models according to their awareness of entity roles and positions in n-ary relations, dividing them into aware-less, position-aware, and role-aware approaches.
arXiv Detail & Related papers (2025-06-05T22:59:39Z)
- A Graph Perspective to Probe Structural Patterns of Knowledge in Large Language Models [52.52824699861226]
Large language models have been extensively studied as neural knowledge bases for their knowledge access, editability, reasoning, and explainability.
We quantify the knowledge of LLMs at both the triplet and entity levels, and analyze how it relates to graph structural properties such as node degree.
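As a rough illustration of that kind of analysis (not the paper's protocol), one can probe a model on each triple, aggregate accuracy per entity, and correlate it with node degree; `llm_knows` below is a hypothetical stub standing in for an actual LLM query.

```python
from collections import defaultdict
import networkx as nx
from scipy.stats import spearmanr

def llm_knows(head, relation, tail):
    """Hypothetical stub: replace with a real LLM query that checks whether
    the model can recall or verify the fact (head, relation, tail)."""
    return hash((head, relation, tail)) % 3 != 0   # placeholder behaviour

def knowledge_vs_degree(triples):
    g = nx.MultiDiGraph()
    correct, total = defaultdict(int), defaultdict(int)
    for h, r, t in triples:
        g.add_edge(h, t, relation=r)
        known = llm_knows(h, r, t)          # triplet-level knowledge
        for e in (h, t):                    # aggregate to entity level
            total[e] += 1
            correct[e] += int(known)
    entities = list(total)
    accuracy = [correct[e] / total[e] for e in entities]
    degree = [g.degree(e) for e in entities]
    return spearmanr(degree, accuracy)      # structure vs. knowledge

triples = [("Paris", "capital_of", "France"), ("Lyon", "located_in", "France"),
           ("France", "member_of", "EU")]
print(knowledge_vs_degree(triples))
```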
arXiv Detail & Related papers (2025-05-25T19:34:15Z)
- Neural-Symbolic Reasoning over Knowledge Graphs: A Survey from a Query Perspective [55.79507207292647]
Knowledge graph reasoning is pivotal in various domains such as data mining, artificial intelligence, the Web, and social sciences.
The rise of neural-symbolic AI marks a significant advancement, merging the robustness of deep learning with the precision of symbolic reasoning.
The advent of large language models (LLMs) has opened new frontiers in knowledge graph reasoning.
arXiv Detail & Related papers (2024-11-30T18:54:08Z)
- Foundations and Frontiers of Graph Learning Theory [81.39078977407719]
Recent advancements in graph learning have revolutionized the way we understand and analyze data with complex structures.
Graph Neural Networks (GNNs), i.e. neural network architectures designed for learning graph representations, have become a popular paradigm.
This article provides a comprehensive summary of the theoretical foundations and breakthroughs concerning the approximation and learning behaviors intrinsic to prevalent graph learning models.
arXiv Detail & Related papers (2024-07-03T14:07:41Z)
- Federated Graph Semantic and Structural Learning [54.97668931176513]
This paper reveals that local client distortion arises from both node-level semantics and graph-level structure.
We postulate that a well-structured graph neural network produces similar representations for neighboring nodes, owing to their inherent adjacency relationships.
We transform the adjacency relationships into a similarity distribution and leverage the global model to distill this relational knowledge into the local model.
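A minimal sketch of that idea, under assumed details (cosine similarity, a temperature softmax, and a KL distillation loss are my choices, not necessarily the paper's): the global model's node embeddings define a similarity distribution over each node's neighbors, which the local model is trained to match.

```python
import torch
import torch.nn.functional as F

def neighbor_distribution(node_emb, neighbor_embs, temperature=1.0):
    """Adjacency -> distribution: softmax over cosine similarities between
    a node's embedding and the embeddings of its adjacent nodes."""
    node = F.normalize(node_emb, dim=-1)
    nbrs = F.normalize(neighbor_embs, dim=-1)
    sims = nbrs @ node / temperature          # shape: (num_neighbors,)
    return F.softmax(sims, dim=-1)

def relation_distillation_loss(local_node, local_nbrs, global_node, global_nbrs):
    """KL divergence pushing the local model's neighbor distribution toward
    the global model's distribution (treated as a fixed teacher)."""
    p_teacher = neighbor_distribution(global_node, global_nbrs).detach()
    p_student = neighbor_distribution(local_node, local_nbrs)
    return F.kl_div(p_student.log(), p_teacher, reduction="sum")

# toy usage: 4 neighbors, 16-dimensional embeddings
local_node, global_node = torch.randn(16), torch.randn(16)
local_nbrs, global_nbrs = torch.randn(4, 16), torch.randn(4, 16)
print(relation_distillation_loss(local_node, local_nbrs, global_node, global_nbrs))
```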
arXiv Detail & Related papers (2024-06-27T07:08:28Z)
- Accelerating Scientific Discovery with Generative Knowledge Extraction, Graph-Based Representation, and Multimodal Intelligent Graph Reasoning [0.0]
We have transformed a dataset comprising 1,000 scientific papers into an ontological knowledge graph.
We have calculated node degrees, identified communities and connectivities, and evaluated clustering coefficients and betweenness centrality of pivotal nodes.
The graph has an inherently scale-free nature, is highly connected, and can be used for graph reasoning.
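The statistics named above can be reproduced on any graph with standard tooling; a short networkx sketch (the random graph here is a stand-in, not the authors' ontological knowledge graph):

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Stand-in graph; the paper builds its graph from 1,000 scientific papers.
g = nx.barabasi_albert_graph(n=500, m=3, seed=42)

degrees = dict(g.degree())                      # node degrees
communities = greedy_modularity_communities(g)  # community structure
clustering = nx.average_clustering(g)           # clustering coefficient
betweenness = nx.betweenness_centrality(g)      # pivotal (bridging) nodes
pivotal = sorted(betweenness, key=betweenness.get, reverse=True)[:5]

print(f"{g.number_of_nodes()} nodes, {g.number_of_edges()} edges, "
      f"{len(communities)} communities, avg clustering {clustering:.3f}")
print("highest-betweenness nodes:", pivotal)
print("connected:", nx.is_connected(g))         # 'highly connected' check
```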
arXiv Detail & Related papers (2024-03-18T17:30:27Z)
- Rule-Guided Joint Embedding Learning over Knowledge Graphs [6.831227021234669]
This paper introduces a novel model that incorporates both contextual and literal information into entity and relation embeddings.
For contextual information, we assess its significance through confidence and relatedness metrics.
We validate our model performance with thorough experiments on two established benchmark datasets.
arXiv Detail & Related papers (2023-12-01T19:58:31Z)
- Self-organization Preserved Graph Structure Learning with Principle of Relevant Information [72.83485174169027]
PRI-GSL is a graph structure learning framework for identifying a graph's self-organization and revealing its hidden structure.
PRI-GSL learns a structure that contains the most relevant yet least redundant information quantified by von Neumann entropy and Quantum Jensen-Shannon divergence.
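For reference, the two quantities named above are standard graph-information measures; a small sketch of how they are commonly computed, treating the trace-normalized Laplacian as a density matrix (the paper's exact construction may differ):

```python
import numpy as np
import networkx as nx

def density_matrix(g):
    """Trace-normalized graph Laplacian, viewed as a quantum density matrix."""
    lap = nx.laplacian_matrix(g).toarray().astype(float)
    return lap / np.trace(lap)

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lambda_i * log(lambda_i) over nonzero eigenvalues."""
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]
    return float(-np.sum(eig * np.log(eig)))

def quantum_js_divergence(rho, sigma):
    """QJS(rho, sigma) = S((rho + sigma) / 2) - (S(rho) + S(sigma)) / 2."""
    mix = (rho + sigma) / 2
    return von_neumann_entropy(mix) - 0.5 * (von_neumann_entropy(rho)
                                             + von_neumann_entropy(sigma))

g1 = nx.cycle_graph(8)
g2 = nx.star_graph(7)   # same number of nodes, very different structure
r1, r2 = density_matrix(g1), density_matrix(g2)
print("entropies:", von_neumann_entropy(r1), von_neumann_entropy(r2))
print("QJS divergence:", quantum_js_divergence(r1, r2))
```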
arXiv Detail & Related papers (2022-12-30T16:02:02Z)
- Learning Representations of Entities and Relations [0.0]
This thesis focuses on improving knowledge graph representation with the aim of tackling the link prediction task.
The first contribution is HypER, a convolutional model that simplifies prior convolutional architectures and improves their link prediction performance.
The second contribution is TuckER, a relatively straightforward linear model, which, at the time of its introduction, obtained state-of-the-art link prediction performance.
The third contribution is MuRP, the first multi-relational graph representation model embedded in hyperbolic space.
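For intuition, TuckER's scoring function is a Tucker decomposition of the binary (subject, relation, object) tensor: a shared core tensor is contracted with a subject embedding, a relation embedding, and an object embedding. A minimal numpy sketch (dimensions and the random initialization are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations = 1000, 50
d_e, d_r = 200, 30                        # entity / relation embedding sizes

E = rng.normal(size=(n_entities, d_e))    # entity embeddings
R = rng.normal(size=(n_relations, d_r))   # relation embeddings
W = rng.normal(size=(d_e, d_r, d_e))      # shared core tensor

def tucker_score(s, r, o):
    """phi(e_s, w_r, e_o) = W x_1 e_s x_2 w_r x_3 e_o (Tucker contraction);
    in training, a sigmoid of this score gives the fact's probability."""
    return np.einsum("i,j,k,ijk->", E[s], R[r], E[o], W)

print(tucker_score(0, 3, 42))
```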
arXiv Detail & Related papers (2022-01-31T09:24:43Z)
- Knowledge Sheaves: A Sheaf-Theoretic Framework for Knowledge Graph Embedding [1.5469452301122175]
We show that knowledge graph embedding is naturally expressed in the topological and categorical language of cellular sheaves.
A knowledge graph embedding can be described as an approximate global section of an appropriate knowledge sheaf over the graph.
The resulting embeddings can be easily adapted for reasoning over composite relations without special training.
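A rough sketch of the idea (my own simplification, not the paper's code): each entity carries a vector, each relation carries a pair of linear restriction maps, and an embedding is close to a global section when the restricted head and tail vectors agree on every edge; the sheaf structure is what lets composite relations be handled without extra training.

```python
import numpy as np

rng = np.random.default_rng(1)
d_entity, d_edge = 32, 16                 # entity stalk / edge stalk dimensions

entity_vecs = {e: rng.normal(size=d_entity) for e in ["Paris", "France", "EU"]}
# one pair of restriction maps (head side, tail side) per relation
restriction = {r: (rng.normal(size=(d_edge, d_entity)),
                   rng.normal(size=(d_edge, d_entity)))
               for r in ["capital_of", "member_of"]}

def edge_discrepancy(h, r, t):
    """How far the edge (h, r, t) is from being consistent: for a true fact,
    the restricted head and tail vectors should coincide on the edge stalk."""
    A_head, A_tail = restriction[r]
    return float(np.linalg.norm(A_head @ entity_vecs[h] - A_tail @ entity_vecs[t]))

# an embedding is approximately a global section when the total discrepancy
# over all stated facts is small; training would minimize this quantity
facts = [("Paris", "capital_of", "France"), ("France", "member_of", "EU")]
print(sum(edge_discrepancy(*f) for f in facts))
```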
arXiv Detail & Related papers (2021-10-07T20:54:40Z)
- Is There More Pattern in Knowledge Graph? Exploring Proximity Pattern for Knowledge Graph Embedding [13.17623081024394]
We term this semantic phenomenon in knowledge graphs the proximity pattern.
Together with the original knowledge graph, we design a Chained couPle-GNN (CP-GNN) architecture to deeply merge the two patterns.
Being evaluated on FB15k-237 and WN18RR datasets, CP-GNN achieves state-of-the-art results for Knowledge Graph Completion task.
arXiv Detail & Related papers (2021-10-02T03:50:42Z)
- Compositional Processing Emerges in Neural Networks Solving Math Problems [100.80518350845668]
Recent progress in artificial neural networks has shown that when large models are trained on enough linguistic data, grammatical structure emerges in their representations.
We extend this work to the domain of mathematical reasoning, where it is possible to formulate precise hypotheses about how meanings should be composed.
Our work shows that neural networks are not only able to infer something about the structured relationships implicit in their training data, but can also deploy this knowledge to guide the composition of individual meanings into composite wholes.
arXiv Detail & Related papers (2021-05-19T07:24:42Z)
- Structural Landmarking and Interaction Modelling: on Resolution Dilemmas in Graph Classification [50.83222170524406]
We study the intrinsic difficulty in graph classification under the unified concept of "resolution dilemmas".
We propose SLIM, an inductive neural network model for Structural Landmarking and Interaction Modelling.
arXiv Detail & Related papers (2020-06-29T01:01:42Z)
- Neural-Symbolic Relational Reasoning on Graph Models: Effective Link Inference and Computation from Knowledge Bases [0.5669790037378094]
We propose a neural-symbolic graph model that learns over all the paths by feeding it the embedding of the minimal subnetwork of the knowledge graph containing those paths.
By learning to produce representations for entities and facts corresponding to word embeddings, we show how the model can be trained end-to-end to decode these representations and infer relations between entities in a relational approach.
arXiv Detail & Related papers (2020-05-05T22:46:39Z)
- A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.