Contrastive Cross-Course Knowledge Tracing via Concept Graph Guided Knowledge Transfer
- URL: http://arxiv.org/abs/2505.13489v1
- Date: Wed, 14 May 2025 10:38:30 GMT
- Title: Contrastive Cross-Course Knowledge Tracing via Concept Graph Guided Knowledge Transfer
- Authors: Wenkang Han, Wang Lin, Liya Hu, Zhenlong Dai, Yiyun Zhou, Mengze Li, Zemin Liu, Chang Yao, Jingyuan Chen
- Abstract summary: We propose TransKT, a contrastive cross-course knowledge tracing method. It builds on concept graph guided knowledge transfer to model the relationships between learning behaviors across different courses. TransKT employs a contrastive objective that aligns single-course and cross-course knowledge states.
- Score: 12.34590941832835
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge tracing (KT) aims to predict learners' future performance based on historical learning interactions. However, existing KT models predominantly focus on data from a single course, limiting their ability to capture a comprehensive understanding of learners' knowledge states. In this paper, we propose TransKT, a contrastive cross-course knowledge tracing method that leverages concept graph guided knowledge transfer to model the relationships between learning behaviors across different courses, thereby enhancing knowledge state estimation. Specifically, TransKT constructs a cross-course concept graph by leveraging zero-shot Large Language Model (LLM) prompts to establish implicit links between related concepts across different courses. This graph serves as the foundation for knowledge transfer, enabling the model to integrate and enhance the semantic features of learners' interactions across courses. Furthermore, TransKT includes an LLM-to-LM pipeline for incorporating summarized semantic features, which significantly improves the performance of Graph Convolutional Networks (GCNs) used for knowledge transfer. Additionally, TransKT employs a contrastive objective that aligns single-course and cross-course knowledge states, thereby refining the model's ability to provide a more robust and accurate representation of learners' overall knowledge states.
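The contrastive objective described above can be illustrated with an InfoNCE-style alignment: each learner's single-course knowledge state is pulled toward that same learner's cross-course state, while other learners in the batch act as negatives. The following is a minimal hypothetical sketch in NumPy, not the paper's implementation; the state vectors, batch size, and temperature are illustrative assumptions.

```python
import numpy as np

def contrastive_alignment_loss(single, cross, temperature=0.1):
    """InfoNCE-style loss aligning each learner's single-course state
    (rows of `single`) with their cross-course state (rows of `cross`);
    other rows in the batch serve as negatives. Illustrative sketch only."""
    # L2-normalize so dot products are cosine similarities
    s = single / np.linalg.norm(single, axis=1, keepdims=True)
    c = cross / np.linalg.norm(cross, axis=1, keepdims=True)
    logits = s @ c.T / temperature               # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Diagonal entries are the matched (positive) pairs
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
states = rng.normal(size=(8, 16))
# Aligned pairs: cross-course states close to the single-course states
aligned = contrastive_alignment_loss(states, states + 0.01 * rng.normal(size=(8, 16)))
# Mismatched pairs: unrelated cross-course states
shuffled = contrastive_alignment_loss(states, rng.normal(size=(8, 16)))
```

Minimizing this loss drives the two views of each learner's knowledge state together, which is the intuition behind the robustness claim: `aligned` comes out well below `shuffled`.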
Related papers
- TLSQKT: A Question-Aware Dual-Channel Transformer for Literacy Tracing from Learning Sequences [4.119581464024065]
Knowledge tracing supports personalized learning by modeling how students' knowledge states evolve over time. We instantiate this paradigm with a Transformer-based model, TLSQKT (Transformer for Learning Sequences with Question-Aware Knowledge Tracing). TLSQKT employs a dual-channel design that jointly encodes student responses and item semantics, while question-aware interaction and self-attention capture long-range dependencies in learners' evolving states.
arXiv Detail & Related papers (2025-10-26T02:07:45Z) - Language Guided Concept Bottleneck Models for Interpretable Continual Learning [62.09201360376577]
Continual learning aims to enable learning systems to acquire new knowledge constantly without forgetting previously learned information. Most existing CL methods focus primarily on preserving learned knowledge to improve model performance. We introduce a novel framework that integrates language-guided Concept Bottleneck Models to address both challenges.
arXiv Detail & Related papers (2025-03-30T02:41:55Z) - Knowledge Graph Enhanced Generative Multi-modal Models for Class-Incremental Learning [51.0864247376786]
We introduce a Knowledge Graph Enhanced Generative Multi-modal model (KG-GMM) that builds an evolving knowledge graph throughout the learning process. During testing, we propose a Knowledge Graph Augmented Inference method that locates specific categories by analyzing relationships within the generated text.
arXiv Detail & Related papers (2025-03-24T07:20:43Z) - Exploiting the Semantic Knowledge of Pre-trained Text-Encoders for Continual Learning [70.64617500380287]
Continual learning allows models to learn from new data while retaining previously learned knowledge.
The label information of images carries semantic knowledge that can be related to previously acquired knowledge of semantic classes.
We propose integrating semantic guidance within and across tasks by capturing semantic similarity using text embeddings.
arXiv Detail & Related papers (2024-08-02T07:51:44Z) - SINKT: A Structure-Aware Inductive Knowledge Tracing Model with Large Language Model [64.92472567841105]
Knowledge Tracing (KT) aims to determine whether students will respond correctly to the next question.
We propose a Structure-aware Inductive Knowledge Tracing model with a large language model (dubbed SINKT).
SINKT predicts the student's response to the target question by interacting with the student's knowledge state and the question representation.
arXiv Detail & Related papers (2024-07-01T12:44:52Z) - A Condensed Transition Graph Framework for Zero-shot Link Prediction with Large Language Models [20.220781775335645]
We introduce a Condensed Transition Graph Framework for Zero-Shot Link Prediction (CTLP).
CTLP encodes all the paths' information in linear time complexity to predict unseen relations between entities.
Our proposed CTLP method achieves state-of-the-art performance on three standard ZSLP datasets.
arXiv Detail & Related papers (2024-02-16T16:02:33Z) - Faithful Path Language Modeling for Explainable Recommendation over Knowledge Graph [15.40937702266105]
We introduce PEARLM (Path-based Explainable-Accurate Recommender based on Language Modelling), which innovates with a Knowledge Graph Constraint Decoding (KGCD) mechanism.
This mechanism ensures zero incidence of corrupted paths by enforcing adherence to valid KG connections at the decoding level.
We validate the effectiveness of our approach through a rigorous empirical assessment, employing a newly proposed metric that quantifies the integrity of explanation paths.
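The decoding-level constraint described above can be pictured as masking out any next token that is not a valid knowledge-graph neighbor of the current entity, so generated paths can never leave the graph. The toy graph, vocabulary, and scores below are purely illustrative assumptions, not PEARLM's actual code.

```python
import numpy as np

# Hypothetical toy KG: each node maps to the set of nodes reachable by one edge
kg_edges = {"user": {"watched"}, "watched": {"movie"}, "movie": {"genre"}}
vocab = ["user", "watched", "movie", "genre"]

def constrained_next(logits, current):
    """Pick the highest-scoring token among valid KG neighbors of `current`,
    masking all invalid continuations to -inf (KGCD-style constraint)."""
    allowed = kg_edges.get(current, set())
    masked = np.where([tok in allowed for tok in vocab], logits, -np.inf)
    return vocab[int(np.argmax(masked))]

path = ["user"]
logits = np.array([3.0, 1.0, 2.5, 0.5])  # raw model scores (illustrative)
while path[-1] in kg_edges:
    path.append(constrained_next(logits, path[-1]))
print(path)  # → ['user', 'watched', 'movie', 'genre']
```

Because invalid continuations are masked before the argmax, every hop in the decoded path corresponds to a real edge, which is how corrupted (hallucinated) paths are ruled out by construction.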
arXiv Detail & Related papers (2023-10-25T08:14:49Z) - Knowledge Distillation via Token-level Relationship Graph [12.356770685214498]
We propose a novel method called Knowledge Distillation with Token-level Relationship Graph (TRG).
By employing TRG, the student model can effectively emulate higher-level semantic information from the teacher model.
We conduct experiments to evaluate the effectiveness of the proposed method against several state-of-the-art approaches.
arXiv Detail & Related papers (2023-06-20T08:16:37Z) - Recognizing Unseen Objects via Multimodal Intensive Knowledge Graph Propagation [68.13453771001522]
We propose a multimodal intensive ZSL framework that matches regions of images with corresponding semantic embeddings.
We conduct extensive experiments and evaluate our model on large-scale real-world data.
arXiv Detail & Related papers (2023-06-14T13:07:48Z) - Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment [9.701081498310165]
Entity alignment aims at integrating heterogeneous knowledge from different knowledge graphs.
Recent studies employ embedding-based methods by first learning representation of Knowledge Graphs and then performing entity alignment.
We propose a Graph Convolutional Network (GCN) model equipped with knowledge distillation for entity alignment.
arXiv Detail & Related papers (2022-01-25T02:47:14Z) - Semantic TrueLearn: Using Semantic Knowledge Graphs in Recommendation Systems [22.387120578306277]
This work aims to advance towards building a state-aware educational recommendation system that incorporates semantic relatedness.
We introduce a novel learner model that exploits this semantic relatedness between knowledge components in learning resources using the Wikipedia link graph.
Our experiments with a large dataset demonstrate that this new semantic version of TrueLearn algorithm achieves statistically significant improvements in terms of predictive performance.
arXiv Detail & Related papers (2021-12-08T16:23:27Z) - KEML: A Knowledge-Enriched Meta-Learning Framework for Lexical Relation Classification [37.2106265998237]
Lexical relations describe how concepts are semantically related, in the form of relation triples.
We propose the Knowledge-Enriched Meta-Learning framework to address the task of lexical relation classification.
arXiv Detail & Related papers (2020-02-25T14:43:56Z) - Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel zero-shot learning formulation to dispense with this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed here and is not responsible for any consequences of its use.