Semantic TrueLearn: Using Semantic Knowledge Graphs in Recommendation Systems
- URL: http://arxiv.org/abs/2112.04368v1
- Date: Wed, 8 Dec 2021 16:23:27 GMT
- Title: Semantic TrueLearn: Using Semantic Knowledge Graphs in Recommendation Systems
- Authors: Sahan Bulathwela, María Pérez-Ortiz, Emine Yilmaz, John Shawe-Taylor
- Abstract summary: This work aims to advance towards building a state-aware educational recommendation system that incorporates semantic relatedness.
We introduce a novel learner model that exploits this semantic relatedness between knowledge components in learning resources using the Wikipedia link graph.
Our experiments with a large dataset demonstrate that this new semantic version of the TrueLearn algorithm achieves statistically significant improvements in predictive performance.
- Score: 22.387120578306277
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In informational recommenders, many challenges arise from the need to handle
the semantic and hierarchical structure between knowledge areas. This work aims
to advance towards building a state-aware educational recommendation system
that incorporates semantic relatedness between knowledge topics, propagating
latent information across semantically related topics. We introduce a novel
learner model that exploits this semantic relatedness between knowledge
components in learning resources using the Wikipedia link graph, with the aim
of better predicting learner engagement and latent knowledge in a lifelong
learning scenario. In this sense, Semantic TrueLearn builds a humanly intuitive
knowledge representation while leveraging Bayesian machine learning to improve
the prediction of educational engagement. Our experiments with a large dataset
demonstrate that this new semantic version of the TrueLearn algorithm achieves
statistically significant improvements in predictive performance with a simple
extension that adds semantic awareness to the model.
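The abstract describes propagating latent knowledge across semantically related Wikipedia topics within a Bayesian learner model. The Python sketch below is a hypothetical illustration of that idea, not the authors' implementation: topic relatedness is approximated by the Jaccard overlap of outgoing Wikipedia links, a simple single-skill Gaussian update stands in for TrueLearn's TrueSkill-style factor-graph update, and the names (`relatedness`, `observe_with_propagation`), the damping factor `rho`, and the toy link graph are assumptions introduced for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Skill:
    """Gaussian belief over a learner's knowledge of one Wikipedia topic."""
    mean: float = 0.0
    var: float = 1.0

def relatedness(links_a: set[str], links_b: set[str]) -> float:
    """Toy semantic relatedness between two topics: Jaccard overlap of their
    outgoing Wikipedia links (a stand-in for a link-graph-based measure)."""
    if not links_a or not links_b:
        return 0.0
    return len(links_a & links_b) / len(links_a | links_b)

def observe(skill: Skill, engaged: bool, obs_var: float = 1.0) -> Skill:
    """Simple Gaussian update of a single skill from an engagement signal
    (a stand-in for TrueLearn's TrueSkill-style update)."""
    y = 1.0 if engaged else -1.0
    k = skill.var / (skill.var + obs_var)  # Kalman-style gain
    return Skill(mean=skill.mean + k * (y - skill.mean),
                 var=(1.0 - k) * skill.var)

def observe_with_propagation(skills, topic, engaged, link_graph, rho=0.5):
    """Update the observed topic, then propagate a damped share of the mean
    shift to semantically related topics (the 'semantic awareness' idea).
    rho is an assumed damping factor."""
    old = skills[topic]
    new = observe(old, engaged)
    shift = new.mean - old.mean
    skills[topic] = new
    for other, other_skill in skills.items():
        if other == topic:
            continue
        w = relatedness(link_graph[topic], link_graph[other])
        if w > 0.0:
            skills[other] = Skill(mean=other_skill.mean + rho * w * shift,
                                  var=other_skill.var)
    return skills

if __name__ == "__main__":
    # Hypothetical Wikipedia link graph: topic -> set of linked pages.
    link_graph = {
        "Bayesian_inference": {"Probability", "Statistics", "Bayes_theorem"},
        "Kalman_filter":      {"Probability", "Statistics", "Control_theory"},
        "Impressionism":      {"Painting", "Claude_Monet"},
    }
    skills = {t: Skill() for t in link_graph}
    observe_with_propagation(skills, "Bayesian_inference", engaged=True,
                             link_graph=link_graph)
    for t, s in skills.items():
        print(f"{t:20s} mean={s.mean:+.3f} var={s.var:.3f}")
```

In this toy run, engagement with "Bayesian_inference" also raises the belief for the related "Kalman_filter" topic but leaves the unrelated "Impressionism" topic untouched, which is the qualitative behaviour the abstract attributes to the semantic extension.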
Related papers
- Exploiting the Semantic Knowledge of Pre-trained Text-Encoders for Continual Learning [70.64617500380287]
Continual learning allows models to learn from new data while retaining previously learned knowledge.
The semantic knowledge available in the label information of the images, offers important semantic information that can be related with previously acquired knowledge of semantic classes.
We propose integrating semantic guidance within and across tasks by capturing semantic similarity using text embeddings.
arXiv Detail & Related papers (2024-08-02T07:51:44Z)
- Informed Meta-Learning [55.2480439325792]
Meta-learning and informed ML stand out as two approaches for incorporating prior knowledge into ML pipelines.
We formalise a hybrid paradigm, informed meta-learning, facilitating the incorporation of priors from unstructured knowledge representations.
We demonstrate the potential benefits of informed meta-learning in improving data efficiency, robustness to observational noise and task distribution shifts.
arXiv Detail & Related papers (2024-02-25T15:08:37Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- Ontology-enhanced Prompt-tuning for Few-shot Learning [41.51144427728086]
Few-shot Learning is aimed to make predictions based on a limited number of samples.
Structured data such as knowledge graphs and ontology libraries have been leveraged to benefit the few-shot setting in various tasks.
arXiv Detail & Related papers (2022-01-27T05:41:36Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- On the Effects of Knowledge-Augmented Data in Word Embeddings [0.6749750044497732]
We propose a novel approach for linguistic knowledge injection through data augmentation to learn word embeddings.
We show our knowledge augmentation approach improves the intrinsic characteristics of the learned embeddings while not significantly altering their results on a downstream text classification task.
arXiv Detail & Related papers (2020-10-05T02:14:13Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.