RKT : Relation-Aware Self-Attention for Knowledge Tracing
- URL: http://arxiv.org/abs/2008.12736v1
- Date: Fri, 28 Aug 2020 16:47:03 GMT
- Title: RKT : Relation-Aware Self-Attention for Knowledge Tracing
- Authors: Shalini Pandey, Jaideep Srivastava
- Abstract summary: We propose a novel Relation-aware self-attention model for Knowledge Tracing (RKT)
We introduce a relation-aware self-attention layer that incorporates contextual information.
Our model outperforms state-of-the-art knowledge tracing methods.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The world has transitioned into a new phase of online learning in response to
the recent COVID-19 pandemic. Now more than ever, it has become paramount to
push the limits of online learning in every manner to keep the education
system flourishing. One crucial component of online learning is Knowledge
Tracing (KT). The aim of KT is to model a student's knowledge level based on
their answers to a sequence of exercises, referred to as interactions.
Students acquire their skills while solving exercises, and each such
interaction has a distinct impact on the student's ability to solve a future
exercise. This impact is characterized by 1) the relation between the
exercises involved in the interactions and 2) student forget behavior.
Traditional studies on knowledge tracing do not explicitly model both
components jointly to estimate the impact of these interactions. In this
paper, we propose a novel Relation-aware self-attention model for Knowledge
Tracing (RKT). We introduce a relation-aware self-attention layer that
incorporates contextual information. This contextual information integrates
both exercise relation information, through their textual content as well as
student performance data, and forget behavior information, through an
exponentially decaying kernel function. Extensive experiments on three
real-world datasets, among which two new collections are released to the
public, show that our model outperforms state-of-the-art knowledge tracing
methods. Furthermore, the interpretable attention weights help visualize the
relation between interactions and temporal patterns in the human learning
process.
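To make the two ingredients of the abstract concrete, here is a minimal NumPy sketch of causal self-attention whose weights are blended with normalized exercise-relation coefficients and damped by an exponentially decaying forget kernel. The function name, the shapes, the blending weight `lam`, and the decay form exp(-Δt/τ) are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def relation_aware_attention(X, R, t, tau=1.0, lam=0.5):
    """X: (n, d) interaction embeddings; R: (n, n) exercise-relation
    coefficients; t: (n,) interaction timestamps, non-decreasing."""
    n, d = X.shape
    past = np.tril(np.ones((n, n), dtype=bool))          # causal mask
    scores = np.where(past, (X @ X.T) / np.sqrt(d), -np.inf)
    content = softmax(scores, axis=-1)                   # content attention
    rel = np.where(past, R, 0.0)
    rel = rel / (rel.sum(-1, keepdims=True) + 1e-9)      # normalized relation weights
    dt = np.maximum(t[:, None] - t[None, :], 0.0)
    decay = np.exp(-dt / tau) * past                     # forget kernel on past steps
    attn = (lam * content + (1.0 - lam) * rel) * decay   # blend, then forget
    attn = attn / (attn.sum(-1, keepdims=True) + 1e-9)   # renormalize rows
    return attn @ X                                      # relation-aware context

# Toy usage: 5 interactions with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
R = rng.random((5, 5))                 # e.g. textual similarity of exercises
t = np.array([0.0, 1.0, 1.5, 4.0, 9.0])
print(relation_aware_attention(X, R, t).shape)           # -> (5, 8)
```

In RKT the relation coefficients come from exercise textual content and student performance data; here R is a random placeholder, and the decay constant tau controls how quickly older interactions are forgotten.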
Related papers
- CSTA: Spatial-Temporal Causal Adaptive Learning for Exemplar-Free Video Class-Incremental Learning [62.69917996026769]
A class-incremental learning task requires learning and preserving both spatial appearance and temporal action involvement.
We propose a framework that equips separate adapters to learn new class patterns, accommodating the incremental information requirements unique to each class.
A causal compensation mechanism is proposed to reduce conflicts between the different types of information during increment and memorization.
arXiv Detail & Related papers (2025-01-13T11:34:55Z)
- Learning states enhanced knowledge tracing: Simulating the diversity in real-world learning process [16.472568558398482]
The Knowledge Tracing task focuses on predicting a learner's future performance based on their historical interactions.
We propose a new method named Learning State Enhanced Knowledge Tracing (LSKT)
Experimental results on four real-world datasets show that our LSKT method outperforms the current state-of-the-art methods.
arXiv Detail & Related papers (2024-12-27T09:41:25Z)
- DyGKT: Dynamic Graph Learning for Knowledge Tracing [27.886870568131254]
This work is motivated by three dynamical characteristics: 1) The scales of students' answering records are constantly growing; 2) The semantics of time intervals between the records vary; 3) The relationships between students, questions and concepts are evolving.
Along this line, we propose a Dynamic Graph-based Knowledge Tracing model, namely DyGKT.
In particular, a continuous-time dynamic question-answering graph for knowledge tracing is constructed to deal with the infinitely growing answering behaviors.
arXiv Detail & Related papers (2024-07-30T13:43:32Z)
- SINKT: A Structure-Aware Inductive Knowledge Tracing Model with Large Language Model [64.92472567841105]
Knowledge Tracing (KT) aims to determine whether students will respond correctly to the next question.
A Structure-aware Inductive Knowledge Tracing model with a large language model (dubbed SINKT) is proposed.
SINKT predicts the student's response to the target question by interacting with the student's knowledge state and the question representation.
arXiv Detail & Related papers (2024-07-01T12:44:52Z)
- Recognizing Unseen Objects via Multimodal Intensive Knowledge Graph Propagation [68.13453771001522]
We propose a multimodal intensive ZSL framework that matches regions of images with corresponding semantic embeddings.
We conduct extensive experiments and evaluate our model on large-scale real-world data.
arXiv Detail & Related papers (2023-06-14T13:07:48Z)
- Quiz-based Knowledge Tracing [61.9152637457605]
Knowledge tracing aims to assess individuals' evolving knowledge states according to their learning interactions.
QKT achieves state-of-the-art performance compared to existing methods.
arXiv Detail & Related papers (2023-04-05T12:48:42Z)
- A Message Passing Perspective on Learning Dynamics of Contrastive Learning [60.217972614379065]
We show that if we cast a contrastive objective equivalently into the feature space, then its learning dynamics admits an interpretable form.
This perspective also establishes an intriguing connection between contrastive learning and Message Passing Graph Neural Networks (MP-GNNs)
arXiv Detail & Related papers (2023-03-08T08:27:31Z)
- DGEKT: A Dual Graph Ensemble Learning Method for Knowledge Tracing [20.71423236895509]
We present a novel Dual Graph Ensemble learning method for Knowledge Tracing (DGEKT)
DGEKT establishes a dual graph structure of students' learning interactions to capture the heterogeneous exercise-concept associations.
Online knowledge distillation provides its predictions on all exercises as extra supervision for better modeling ability (a distillation sketch appears after this list).
arXiv Detail & Related papers (2022-11-23T11:37:35Z)
- Deep Graph Memory Networks for Forgetting-Robust Knowledge Tracing [5.648636668261282]
We propose a novel knowledge tracing model, namely Deep Graph Memory Network (DGMN)
In this model, we incorporate a forget gating mechanism into an attention memory structure in order to capture forgetting behaviours (a minimal gating sketch appears after this list).
This model has the capability of learning relationships between latent concepts from a dynamic latent concept graph.
arXiv Detail & Related papers (2021-08-18T12:04:10Z)
- GIKT: A Graph-based Interaction Model for Knowledge Tracing [36.07642261246016]
We propose a Graph-based Interaction model for Knowledge Tracing (GIKT) to tackle the above problems.
More specifically, GIKT utilizes graph convolutional network (GCN) to substantially incorporate question-skill correlations.
Experiments on three datasets demonstrate that GIKT achieves the new state-of-the-art performance, with at least 1% absolute AUC improvement.
arXiv Detail & Related papers (2020-09-13T12:50:32Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to free this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
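The DGEKT entry above mentions online knowledge distillation, where ensemble predictions on all exercises act as extra soft supervision. Below is a rough sketch of that idea under the assumption of per-exercise correctness probabilities; the averaging teacher and the cross-entropy form are generic illustrative choices, not DGEKT's actual objective.

```python
import numpy as np

def online_distillation_loss(p_student, p_peer, eps=1e-7):
    """p_student, p_peer: (batch, n_exercises) predicted probabilities of a
    correct response from two peer models. The averaged ensemble prediction
    serves as a soft teacher for the student via binary cross-entropy."""
    teacher = 0.5 * (p_student + p_peer)                # ensemble soft labels
    p = np.clip(p_student, eps, 1.0 - eps)              # numerical safety
    return float(-(teacher * np.log(p)
                   + (1.0 - teacher) * np.log(1.0 - p)).mean())

# Toy usage: two models scoring 3 students on 10 exercises.
rng = np.random.default_rng(1)
print(online_distillation_loss(rng.random((3, 10)), rng.random((3, 10))))
```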
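The DGMN entry mentions a forget gate inside an attention memory. As a minimal illustration of that idea, the sketch below applies a sigmoid gate per latent-concept memory slot, keeping a gated fraction of the old slot and absorbing the rest from the current attention read; the shapes, parameters, and update rule are assumptions, not DGMN's actual equations.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forget_gated_memory_update(M, read, W_f, b_f=0.0):
    """M: (c, d) latent-concept memory; read: (d,) attention read vector.
    Each slot keeps a gated fraction of its old content and absorbs the
    rest from the current read, so stale knowledge fades over time."""
    f = sigmoid(M @ W_f @ read + b_f)                   # per-slot gate, (c,)
    return f[:, None] * M + (1.0 - f[:, None]) * read[None, :]

# Toy usage: 4 concept slots with 8-dimensional memory.
rng = np.random.default_rng(2)
M = rng.normal(size=(4, 8))
print(forget_gated_memory_update(M, rng.normal(size=8),
                                 0.1 * rng.normal(size=(8, 8))).shape)  # (4, 8)
```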