Knowledge Relation Rank Enhanced Heterogeneous Learning Interaction
Modeling for Neural Graph Forgetting Knowledge Tracing
- URL: http://arxiv.org/abs/2304.03945v1
- Date: Sat, 8 Apr 2023 07:29:53 GMT
- Title: Knowledge Relation Rank Enhanced Heterogeneous Learning Interaction
Modeling for Neural Graph Forgetting Knowledge Tracing
- Authors: Linqing Li, Zhifeng Wang
- Abstract summary: Knowledge tracing models have recently been applied in educational data mining.
A new model, Knowledge Relation Rank Enhanced Heterogeneous Learning Interaction Modeling for Neural Graph Forgetting Knowledge Tracing (NGFKT), is proposed.
Experiments conducted on two public educational datasets indicate that the NGFKT model outperforms all baseline models in terms of AUC, ACC, and Performance Stability (PS).
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, knowledge tracing models such as the Self-Attention Knowledge Tracing model (SAKT), which models the relationship between exercises and knowledge concepts (KCs), have been applied in educational data mining. However, relation modeling in traditional knowledge tracing models only considers the static question-knowledge and knowledge-knowledge relationships and treats these relationships as equally important. Such relation modeling can hardly avoid the influence of subjective labeling, and it considers the exercise-KC and KC-KC relationships separately. In this work, a novel knowledge tracing model, named Knowledge Relation Rank Enhanced Heterogeneous Learning Interaction Modeling for Neural Graph Forgetting Knowledge Tracing (NGFKT), is proposed. It reduces the impact of subjective labeling by calibrating the skill relation matrix and the Q-matrix, and applies a Graph Convolutional Network (GCN) to model the heterogeneous interactions between students, exercises, and skills. Specifically, the skill relation matrix and the Q-matrix are generated by the Knowledge Relation Importance Rank Calibration method (KRIRC). The calibrated skill relation matrix, the Q-matrix, and the heterogeneous interactions are then fed into the GCN to generate exercise and skill embeddings. Next, the exercise embeddings, skill embeddings, item difficulty, and contingency table are combined to generate an exercise relation matrix, which serves as input to the Position-Relation-Forgetting attention mechanism. Finally, the Position-Relation-Forgetting attention mechanism is applied to make the predictions. Experiments are conducted on two public educational datasets, and the results indicate that the NGFKT model outperforms all baseline models in terms of AUC, ACC, and Performance Stability (PS).
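To make the data flow described in the abstract concrete, here is a minimal NumPy sketch of the overall pipeline shape: a calibrated skill relation matrix and Q-matrix define a heterogeneous graph, one GCN layer produces exercise and skill embeddings, the embeddings induce an exercise relation matrix, and a relation- and time-aware attention step yields a prediction. This is an illustration under stated assumptions, not the authors' implementation: all names and constants (`normalize_adjacency`, `gcn_layer`, the decay rate, the toy sizes) are hypothetical, the KRIRC calibration is replaced by random stand-in matrices, and item difficulty and the contingency table are omitted.

```python
# Illustrative sketch of the NGFKT data flow (hypothetical names, random stand-in inputs).
import numpy as np

rng = np.random.default_rng(0)

def normalize_adjacency(adj):
    """Symmetric GCN normalization: D^{-1/2} (A + I) D^{-1/2}."""
    adj = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(adj.sum(axis=1)))
    return d_inv_sqrt @ adj @ d_inv_sqrt

def gcn_layer(adj_norm, features, weight):
    """One GCN propagation step: ReLU(A_hat X W)."""
    return np.maximum(adj_norm @ features @ weight, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy sizes (assumed, not from the paper).
n_exercises, n_skills, dim = 6, 4, 8

# In NGFKT the skill relation matrix and Q-matrix are calibrated by KRIRC;
# here they are random placeholders.
q_matrix = (rng.random((n_exercises, n_skills)) > 0.5).astype(float)  # exercise-skill links
skill_relation = rng.random((n_skills, n_skills))                     # skill-skill relation weights

# Heterogeneous graph over exercise and skill nodes, then one GCN layer.
n_nodes = n_exercises + n_skills
adj = np.zeros((n_nodes, n_nodes))
adj[:n_exercises, n_exercises:] = q_matrix
adj[n_exercises:, :n_exercises] = q_matrix.T
adj[n_exercises:, n_exercises:] = skill_relation
embeddings = gcn_layer(normalize_adjacency(adj),
                       rng.normal(size=(n_nodes, dim)),
                       rng.normal(size=(dim, dim)) * 0.1)
exercise_emb = embeddings[:n_exercises]

# Exercise relation matrix from embedding similarity (the paper additionally uses
# item difficulty and a contingency table, both omitted in this sketch).
unit = exercise_emb / (np.linalg.norm(exercise_emb, axis=1, keepdims=True) + 1e-8)
exercise_relation = unit @ unit.T

# Simplified Position-Relation-Forgetting style attention over past attempts:
# dot-product scores biased by the exercise relation and penalized by elapsed time.
history = [0, 2, 3]                      # previously attempted exercises
timestamps = np.array([0.0, 1.0, 4.0])   # when they were attempted
target, now, decay = 5, 6.0, 0.5         # next exercise, current time, assumed decay rate

scores = exercise_emb[history] @ exercise_emb[target] / np.sqrt(dim)
scores += exercise_relation[target, history]   # relation bias
scores -= decay * (now - timestamps)           # forgetting: older attempts contribute less
attn = softmax(scores)

context = attn @ exercise_emb[history]
p_correct = 1.0 / (1.0 + np.exp(-(context @ (rng.normal(size=dim) * 0.1))))
print(f"predicted probability of a correct response: {p_correct:.3f}")
```

In the actual NGFKT model these components are trained end to end, and the exercise relation matrix also incorporates item difficulty and a contingency table; the sketch only fixes the order of operations described above.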
Related papers
- SINKT: A Structure-Aware Inductive Knowledge Tracing Model with Large Language Model (arXiv, 2024-07-01)
  Knowledge Tracing (KT) aims to determine whether students will respond correctly to the next question.
  A Structure-aware Inductive Knowledge Tracing model with a large language model (dubbed SINKT) is proposed.
  SINKT predicts the student's response to the target question by interacting with the student's knowledge state and the question representation.
- Modeling Balanced Explicit and Implicit Relations with Contrastive Learning for Knowledge Concept Recommendation in MOOCs (arXiv, 2024-02-13)
  Existing methods rely on the explicit relations between users and knowledge concepts for recommendation.
  There are numerous implicit relations generated within the users' learning activities on MOOC platforms.
  We propose a novel framework based on contrastive learning, which can represent and balance the explicit and implicit relations.
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction (arXiv, 2023-09-20)
  We show that semantic correlations between relations are inherently edge-level and entity-independent.
  We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
  To further exploit the potential of RCN, we propose a Complete Common Neighbor induced subgraph.
- Message Intercommunication for Inductive Relation Reasoning (arXiv, 2023-05-23)
  We develop a novel inductive relation reasoning model called MINES.
  We introduce a Message Intercommunication mechanism on the Neighbor-Enhanced Subgraph.
  Our experiments show that MINES outperforms existing state-of-the-art models.
- Knowledge Graph Completion with Counterfactual Augmentation (arXiv, 2023-02-25)
  We introduce a counterfactual question: "would the relation still exist if the neighborhood of entities became different from observation?"
  With a carefully designed instantiation of a causal model on the knowledge graph, we generate counterfactual relations to answer the question.
  We incorporate the created counterfactual relations with the GNN-based framework on KGs to augment their learning of entity pair representations.
- ProjB: An Improved Bilinear Biased ProjE Model for Knowledge Graph Completion (arXiv, 2022-08-15)
  This work builds on the ProjE KGE because of its low computational complexity and high potential for model improvement.
  Experimental results on benchmark knowledge graphs such as FB15K and WN18 show that the proposed approach outperforms state-of-the-art models in the entity prediction task.
- Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment (arXiv, 2022-01-25)
  Entity alignment aims at integrating heterogeneous knowledge from different knowledge graphs.
  Recent studies employ embedding-based methods by first learning representations of the knowledge graphs and then performing entity alignment.
  We propose a Graph Convolutional Network (GCN) model equipped with knowledge distillation for entity alignment.
- A Probit Tensor Factorization Model For Relational Learning (arXiv, 2021-11-06)
  We propose a binary tensor factorization model with a probit link, which inherits the computational efficiency of the classic tensor factorization model.
  Our proposed probit tensor factorization (PTF) model shows advantages in both prediction accuracy and interpretability. (A generic probit-link factorization is sketched after this list.)
- RelWalk: A Latent Variable Model Approach to Knowledge Graph Embedding (arXiv, 2021-01-25)
  This paper extends the random walk model of word embeddings (Arora et al., 2016a) to Knowledge Graph Embeddings (KGEs).
  We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
  We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
- Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training (arXiv, 2020-10-21)
  We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
  On the one hand, an Entity-Guided Attention (EGA) mechanism is introduced to guide the attention to filter out information causing confusion.
  On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning (arXiv, 2020-03-15)
  This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
  The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
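For the probit tensor factorization entry above, a generic probit-link factorization has the following form. This is a textbook CP-style illustration, not necessarily the exact parameterization used in the PTF paper:

```latex
% Generic probit-link tensor factorization over a binary relational tensor X:
% entity factors u_i, v_j and relation factors w_k, with Phi the standard normal CDF.
\[
  P(X_{ijk} = 1) \;=\; \Phi\!\Big(\sum_{r=1}^{R} u_{ir}\, v_{jr}\, w_{kr}\Big)
\]
```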