A Short Review for Ontology Learning: Stride to Large Language Models Trend
- URL: http://arxiv.org/abs/2404.14991v2
- Date: Mon, 17 Jun 2024 05:48:22 GMT
- Title: A Short Review for Ontology Learning: Stride to Large Language Models Trend
- Authors: Rick Du, Huilong An, Keyu Wang, Weidong Liu
- Abstract summary: Ontologies provide a formal representation of knowledge shared within Semantic Web applications.
A new trend in these approaches relies on large language models (LLMs) to enhance ontology learning.
- Abstract: Ontologies provide a formal representation of knowledge shared within Semantic Web applications. Ontology learning is the construction of ontologies from a given corpus. In recent years, ontology learning has progressed through shallow-learning and deep-learning methodologies, each offering distinct advantages and limitations in the quest for knowledge extraction and representation. A new trend in these approaches relies on large language models (LLMs) to enhance ontology learning. This paper reviews the approaches and challenges of ontology learning. It analyzes the methodologies and limitations of shallow-learning-based and deep-learning-based techniques for ontology learning, and provides comprehensive background for frontier work on using LLMs to enhance ontology learning. In addition, it proposes several noteworthy future directions for further exploration of the integration of LLMs with ontology learning tasks.
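To make the shallow-learning side of this pipeline concrete, the following is a minimal sketch of the classic lexico-syntactic (Hearst-pattern) approach to taxonomy extraction, one of the shallow techniques the survey contrasts with LLM-based methods. The function name, regex, and example sentence are illustrative, not taken from the paper.

```python
import re

def hearst_pairs(text):
    """Extract candidate (hyponym, hypernym) is-a pairs using the
    'NP such as NP, NP and NP' Hearst pattern -- a shallow-learning
    technique long used for taxonomy extraction in ontology learning."""
    pairs = []
    # hypernym phrase (1-2 words), then "such as", then a list of hyponyms
    pattern = r"(\w+(?: \w+)?) such as ((?:\w+(?:, | and | or )?)+)"
    for m in re.finditer(pattern, text):
        hypernym = m.group(1)
        # split the enumerated hyponym list on its separators
        for hyponym in re.split(r", | and | or ", m.group(2)):
            if hyponym:
                pairs.append((hyponym.strip(), hypernym))
    return pairs

# Example: yields is-a edges for a toy ontology fragment
print(hearst_pairs("diseases such as influenza, measles and rubella"))
```

An LLM-based approach would replace the hand-written pattern with a prompted extraction step, trading this kind of precision-oriented brittleness for broader recall, which is the shift the survey examines.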
Related papers
- Knowledge Boundary of Large Language Models: A Survey [75.67848187449418]
Large language models (LLMs) store vast amounts of knowledge in their parameters, but they still have limitations in the memorization and utilization of certain knowledge.
This highlights the critical need to understand the knowledge boundary of LLMs, a concept that remains inadequately defined in existing research.
We propose a comprehensive definition of the LLM knowledge boundary and introduce a formalized taxonomy categorizing knowledge into four distinct types.
arXiv Detail & Related papers (2024-12-17T02:14:02Z) - Ontology Embedding: A Survey of Methods, Applications and Resources [54.3453925775069]
Ontologies are widely used for representing domain knowledge and metadata.
However, the logical reasoning they can directly support is quite limited in learning, approximation, and prediction.
One straightforward solution is to integrate statistical analysis and machine learning.
arXiv Detail & Related papers (2024-06-16T14:49:19Z) - Exploring the landscape of large language models: Foundations, techniques, and challenges [8.042562891309414]
The article sheds light on the mechanics of in-context learning and a spectrum of fine-tuning approaches.
It explores how LLMs can be more closely aligned with human preferences through innovative reinforcement learning frameworks.
The ethical dimensions of LLM deployment are discussed, underscoring the need for mindful and responsible application.
arXiv Detail & Related papers (2024-04-18T08:01:20Z) - Language Evolution with Deep Learning [49.879239655532324]
Computational modeling plays an essential role in the study of language emergence.
It aims to simulate the conditions and learning processes that could trigger the emergence of a structured language.
This chapter explores another class of computational models that have recently revolutionized the field of machine learning: deep learning models.
arXiv Detail & Related papers (2024-03-18T16:52:54Z) - Knowledge-augmented Deep Learning and Its Applications: A Survey [60.221292040710885]
Knowledge-augmented deep learning (KADL) aims to identify domain knowledge and integrate it into deep models for data-efficient, generalizable, and interpretable deep learning.
This survey subsumes existing works and offers a bird's-eye view of research in the general area of knowledge-augmented deep learning.
arXiv Detail & Related papers (2022-11-30T03:44:15Z) - Learning Description Logic Ontologies. Five Approaches. Where Do They
Stand? [14.650545418986058]
We highlight machine learning and data mining approaches that have been proposed for the creation of description logic (DL) ontologies.
These are based on association rule mining, formal concept analysis, inductive logic programming, computational learning theory, and neural networks.
arXiv Detail & Related papers (2021-04-02T18:36:45Z) - On the Complexity of Learning Description Logic Ontologies [14.650545418986058]
Ontologies are a popular way of representing domain knowledge, in particular, knowledge in domains related to life sciences.
We provide a formal specification of the exact and the probably correct learning models from learning theory.
arXiv Detail & Related papers (2021-03-25T09:18:12Z) - Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning [65.06445195580622]
Federated learning is a new paradigm that decouples data collection and model training via multi-party computation and model aggregation.
We conduct a focused survey of federated learning in conjunction with other learning algorithms.
arXiv Detail & Related papers (2021-02-25T15:18:13Z) - Online Structured Meta-learning [137.48138166279313]
Current online meta-learning algorithms are limited to learning a globally-shared meta-learner.
We propose an online structured meta-learning (OSML) framework to overcome this limitation.
Experiments on three datasets demonstrate the effectiveness and interpretability of our proposed framework.
arXiv Detail & Related papers (2020-10-22T09:10:31Z) - Meta-Learning in Neural Networks: A Survey [4.588028371034406]
This survey describes the contemporary meta-learning landscape.
We first discuss definitions of meta-learning and position it with respect to related fields.
We then propose a new taxonomy that provides a more comprehensive breakdown of the space of meta-learning methods.
arXiv Detail & Related papers (2020-04-11T16:34:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.