A Survey of Knowledge Tracing: Models, Variants, and Applications
- URL: http://arxiv.org/abs/2105.15106v4
- Date: Thu, 11 Apr 2024 02:57:17 GMT
- Title: A Survey of Knowledge Tracing: Models, Variants, and Applications
- Authors: Shuanghong Shen, Qi Liu, Zhenya Huang, Yonghe Zheng, Minghao Yin, Minjuan Wang, Enhong Chen
- Abstract summary: Knowledge Tracing is one of the fundamental tasks for student behavioral data analysis.
We present three types of fundamental KT models with distinct technical routes.
We discuss potential directions for future research in this rapidly growing field.
- Score: 70.69281873057619
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern online education has the capacity to provide intelligent educational services by automatically analyzing substantial amounts of student behavioral data. Knowledge Tracing (KT) is one of the fundamental tasks for student behavioral data analysis, aiming to monitor students' evolving knowledge state during their problem-solving process. In recent years, a substantial number of studies have concentrated on this rapidly growing field, significantly contributing to its advancements. In this survey, we will conduct a thorough investigation of these progressions. Firstly, we present three types of fundamental KT models with distinct technical routes. Subsequently, we review extensive variants of the fundamental KT models that consider more stringent learning assumptions. Moreover, the development of KT cannot be separated from its applications, thereby we present typical KT applications in various scenarios. To facilitate the work of researchers and practitioners in this field, we have developed two open-source algorithm libraries: EduData that enables the download and preprocessing of KT-related datasets, and EduKTM that provides an extensible and unified implementation of existing mainstream KT models. Finally, we discuss potential directions for future research in this rapidly growing field. We hope that the current survey will assist both researchers and practitioners in fostering the development of KT, thereby benefiting a broader range of students.
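The survey organizes KT models by technical route rather than prescribing a single formulation. As a concrete orientation, the sketch below shows a generic deep-knowledge-tracing style predictor in PyTorch: given a student's past interactions, it outputs per-skill probabilities of a correct next response. It is an illustrative assumption for this summary, not the survey's EduKTM implementation; the class name `DKTSketch` and all sizes are made up.

```python
# Minimal DKT-style sketch (PyTorch), illustrating the core KT task:
# from a student's interaction history, predict the probability of a
# correct response on each skill at the next step.
import torch
import torch.nn as nn

class DKTSketch(nn.Module):
    def __init__(self, num_skills: int, hidden_size: int = 64):
        super().__init__()
        # Each interaction is one of 2 * num_skills ids:
        # (skill, incorrect) or (skill, correct).
        self.embed = nn.Embedding(2 * num_skills, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, num_skills)

    def forward(self, interactions: torch.Tensor) -> torch.Tensor:
        # interactions: (batch, seq_len) integer interaction ids
        hidden, _ = self.rnn(self.embed(interactions))
        # Probability of a correct answer on each skill at the next step.
        return torch.sigmoid(self.out(hidden))

# Toy usage: 2 students, 5 logged interactions each, 10 skills.
model = DKTSketch(num_skills=10)
history = torch.randint(0, 20, (2, 5))
next_step_probs = model(history)  # shape: (2, 5, 10)
```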
Related papers
- SINKT: A Structure-Aware Inductive Knowledge Tracing Model with Large Language Model [64.92472567841105]
Knowledge Tracing (KT) aims to determine whether students will respond correctly to the next question.
The paper proposes a Structure-aware Inductive Knowledge Tracing model with a large language model (dubbed SINKT).
SINKT predicts the student's response to the target question by modeling the interaction between the student's knowledge state and the question representation.
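The summary above only states that SINKT couples the student's knowledge state with the question representation to make a prediction; one common way to realize such a coupling is a learned bilinear interaction followed by a sigmoid, sketched below. This is a generic illustration under that assumption, not SINKT's actual architecture.

```python
# Generic interaction-based predictor: score a question representation
# against a knowledge state to estimate the probability of a correct answer.
# Not SINKT's architecture; purely illustrative.
import torch
import torch.nn as nn

class InteractionPredictor(nn.Module):
    def __init__(self, dim: int = 64):
        super().__init__()
        self.scorer = nn.Bilinear(dim, dim, 1)  # learned interaction term

    def forward(self, knowledge_state: torch.Tensor, question_repr: torch.Tensor) -> torch.Tensor:
        # knowledge_state, question_repr: (batch, dim)
        return torch.sigmoid(self.scorer(knowledge_state, question_repr))

predictor = InteractionPredictor()
p_correct = predictor(torch.randn(4, 64), torch.randn(4, 64))  # shape: (4, 1)
```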
arXiv Detail & Related papers (2024-07-01T12:44:52Z) - Large Language Model Enhanced Knowledge Representation Learning: A Survey [15.602891714371342]
The integration of Large Language Models (LLMs) with Knowledge Representation Learning (KRL) marks a significant advancement in the field of artificial intelligence (AI).
Despite the increasing research on enhancing KRL with LLMs, a thorough survey that analyzes the processes of these enhanced models is conspicuously absent.
Our survey addresses this by categorizing these models based on three distinct Transformer architectures, and by analyzing experimental data from various KRL downstream tasks to evaluate the strengths and weaknesses of each approach.
arXiv Detail & Related papers (2024-07-01T03:37:35Z) - A Survey on Deep Active Learning: Recent Advances and New Frontiers [27.07154361976248]
This work aims to serve as a useful and quick guide for researchers in overcoming difficulties in deep learning-based active learning (DAL).
Active learning has gained increasing popularity due to its broad applicability, yet surveys of it, especially of DAL, remain scarce.
arXiv Detail & Related papers (2024-05-01T05:54:33Z) - Continual Learning with Pre-Trained Models: A Survey [61.97613090666247]
Continual Learning (CL) aims to overcome catastrophic forgetting of former knowledge when learning new knowledge.
This paper presents a comprehensive survey of the latest advancements in pre-trained model (PTM)-based CL.
arXiv Detail & Related papers (2024-01-29T18:27:52Z) - Interpreting Deep Knowledge Tracing Model on EdNet Dataset [67.81797777936868]
In this work, we perform similar interpretation tasks, but on a large and newly available dataset called EdNet.
Preliminary experimental results show the effectiveness of the interpretation techniques.
arXiv Detail & Related papers (2021-10-31T07:18:59Z) - Pre-Trained Models: Past, Present and Future [126.21572378910746]
Large-scale pre-trained models (PTMs) have recently achieved great success and become a milestone in the field of artificial intelligence (AI).
By storing knowledge in huge numbers of parameters and fine-tuning on specific tasks, the rich knowledge implicitly encoded in those parameters can benefit a variety of downstream tasks.
It is now the consensus of the AI community to adopt PTMs as the backbone for downstream tasks rather than learning models from scratch.
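As a deliberately generic illustration of the fine-tune-a-pretrained-backbone recipe described above, the sketch below loads a pretrained BERT checkpoint with a fresh classification head and takes one training step on a toy downstream example. The use of the Hugging Face transformers library and the bert-base-uncased checkpoint is an assumption for illustration, not something prescribed by the paper.

```python
# Generic PTM fine-tuning sketch: reuse pretrained parameters as the
# backbone and train only on the downstream task, instead of from scratch.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

batch = tokenizer(["an example downstream sentence"], return_tensors="pt")
labels = torch.tensor([1])

outputs = model(**batch, labels=labels)  # loss computed against the new head
outputs.loss.backward()
optimizer.step()
```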
arXiv Detail & Related papers (2021-06-14T02:40:32Z) - Wizard of Search Engine: Access to Information Through Conversations with Search Engines [58.53420685514819]
We make efforts to facilitate research on conversational information seeking (CIS) from three aspects.
We formulate a pipeline for CIS with six sub-tasks: intent detection (ID), keyphrase extraction (KE), action prediction (AP), query selection (QS), passage selection (PS), and response generation (RG).
We release a benchmark dataset, called wizard of search engine (WISE), which allows for comprehensive and in-depth research on all aspects of CIS.
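The six sub-tasks above form an ordered pipeline; a minimal way to express that ordering in code is sketched below, with placeholder logic standing in for the learned models (this is not the WISE system itself).

```python
# Placeholder CIS pipeline over the six sub-tasks; each stage would be a
# learned model in practice.
from enum import Enum

class CISSubTask(Enum):
    INTENT_DETECTION = "ID"
    KEYPHRASE_EXTRACTION = "KE"
    ACTION_PREDICTION = "AP"
    QUERY_SELECTION = "QS"
    PASSAGE_SELECTION = "PS"
    RESPONSE_GENERATION = "RG"

def cis_pipeline(utterance: str) -> str:
    intent = f"intent({utterance})"                  # ID
    keyphrases = utterance.split()[:2]               # KE (placeholder)
    action = "search"                                # AP
    query = " ".join(keyphrases)                     # QS
    passages = [f"passage retrieved for '{query}'"]  # PS
    return f"[{intent} / {action}] {passages[0]}"    # RG

print(cis_pipeline("find surveys on knowledge tracing"))
```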
arXiv Detail & Related papers (2021-05-18T06:35:36Z) - Improving Knowledge Tracing via Pre-training Question Embeddings [25.611547414936553]
Knowledge tracing (KT) defines the task of predicting whether students can correctly answer questions based on their historical responses.
In this paper, we demonstrate that large gains on KT can be realized by pre-training embeddings for each question on abundant side information.
To be specific, the side information includes question difficulty and three kinds of relations contained in a bipartite graph between questions and skills.
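There are several ways to fold such side information into a pre-training objective; the sketch below shows one simple possibility, fitting shared question embeddings to both a difficulty regression target and question-skill links from the bipartite graph. It is an illustrative assumption, not the paper's exact method, and all names and sizes are invented.

```python
# Illustrative pre-training sketch: shared question embeddings are fit to
# (a) observed question difficulty and (b) question-skill links from the
# bipartite graph, before being handed to a downstream KT model.
import torch
import torch.nn as nn
import torch.nn.functional as F

num_questions, num_skills, dim = 100, 10, 32
q_emb = nn.Embedding(num_questions, dim)
s_emb = nn.Embedding(num_skills, dim)
difficulty_head = nn.Linear(dim, 1)

q_ids = torch.randint(0, num_questions, (64,))
s_ids = torch.randint(0, num_skills, (64,))
link = torch.randint(0, 2, (64,)).float()   # 1 if the question involves the skill
difficulty = torch.rand(64)                 # observed difficulty in [0, 1]

q, s = q_emb(q_ids), s_emb(s_ids)
link_logit = (q * s).sum(dim=-1)            # bipartite link prediction
loss = F.binary_cross_entropy_with_logits(link_logit, link) \
     + F.mse_loss(difficulty_head(q).squeeze(-1), difficulty)
loss.backward()
```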
arXiv Detail & Related papers (2020-12-09T13:21:23Z) - qDKT: Question-centric Deep Knowledge Tracing [29.431121650577396]
We introduce qDKT, a variant of DKT that models every learner's success probability on individual questions over time.
qDKT incorporates graph Laplacian regularization to smooth predictions under each skill.
Experiments on several real-world datasets show that qDKT achieves state-of-the-art performance on predicting learner outcomes.
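Graph Laplacian regularization here amounts to penalizing differences between the predicted success probabilities of questions that share a skill; a minimal sketch of such a penalty term follows (simplified relative to qDKT itself, with an invented adjacency matrix for illustration).

```python
# Laplacian smoothness penalty: sum_ij A_ij * (p_i - p_j)^2 / 2, where
# A_ij = 1 if questions i and j are tagged with the same skill.
import torch

def laplacian_smoothness(pred: torch.Tensor, same_skill: torch.Tensor) -> torch.Tensor:
    # pred: (num_questions,) predicted success probabilities
    # same_skill: (num_questions, num_questions) 0/1 same-skill adjacency
    diff = pred.unsqueeze(0) - pred.unsqueeze(1)   # pairwise differences
    return (same_skill * diff.pow(2)).sum() / 2

# Toy example: questions 0-1 and 2-3 share skills.
pred = torch.tensor([0.9, 0.2, 0.5, 0.55])
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 0., 0.],
                    [0., 0., 0., 1.],
                    [0., 0., 1., 0.]])
penalty = laplacian_smoothness(pred, adj)  # added to the main KT loss with a weight
```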
arXiv Detail & Related papers (2020-05-25T23:43:55Z) - Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks [39.2907363775529]
Knowledge distillation (KD) has been proposed to transfer information learned from one model to another.
This paper reviews KD and student-teacher (S-T) learning, which have been actively studied in recent years.
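Knowledge distillation of this kind typically trains the student to match the teacher's softened output distribution alongside the ground-truth labels; a standard sketch of that objective follows (temperature and weighting values are illustrative assumptions).

```python
# Standard distillation objective: KL divergence to the teacher's softened
# outputs plus the usual cross-entropy on hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T: float = 2.0, alpha: float = 0.5):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy batch: 8 examples, 10 classes.
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
distillation_loss(student, teacher, labels).backward()
```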
arXiv Detail & Related papers (2020-04-13T13:45:38Z)