Exploring Student Representation For Neural Cognitive Diagnosis
- URL: http://arxiv.org/abs/2111.08951v1
- Date: Wed, 17 Nov 2021 07:47:44 GMT
- Title: Exploring Student Representation For Neural Cognitive Diagnosis
- Authors: Hengyao Bao, Xihua Li, Xuemin Zhao, Yunbo Cao
- Abstract summary: We propose a method of student representation that explores the hierarchical relations of knowledge concepts and student embedding.
Experiments show the effectiveness of the proposed representation method.
- Score: 2.8617826964327113
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cognitive diagnosis, the goal of which is to obtain the proficiency level of
students on specific knowledge concepts, is a fundamental task in smart
educational systems. Previous works usually represent each student as a
trainable knowledge proficiency vector, which cannot capture the relations of
concepts or the basic profile (e.g., memory or comprehension) of students. In
this paper, we propose a method of student representation that explores
the hierarchical relations of knowledge concepts and student embedding.
Specifically, since the proficiency on parent knowledge concepts reflects the
correlation between knowledge concepts, we obtain the first knowledge proficiency
with a parent-child concept projection layer. In addition, a low-dimensional
dense vector is adopted as the embedding of each student, and we obtain the second
knowledge proficiency with a fully connected layer. Then, we combine the two
proficiency vectors above to get the final representation of students.
Experiments show the effectiveness of the proposed representation method.
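The two-branch student representation described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the parent-child mapping, the use of a sigmoid, and the averaging combination are all assumptions for demonstration purposes.

```python
import numpy as np

rng = np.random.default_rng(0)

n_students, n_parent, n_child, emb_dim = 4, 3, 6, 8

# Hypothetical hierarchy: each child concept has exactly one parent concept.
child_to_parent = np.array([0, 0, 1, 1, 2, 2])

# Branch 1: proficiency on parent concepts, projected to child concepts
# through a parent-child projection matrix (binary weights here, for clarity).
parent_prof = rng.random((n_students, n_parent))
proj = np.zeros((n_parent, n_child))
proj[child_to_parent, np.arange(n_child)] = 1.0
prof_1 = parent_prof @ proj                    # shape (n_students, n_child)

# Branch 2: low-dimensional dense student embedding passed through a
# fully connected layer (sigmoid assumed, to keep values in [0, 1]).
student_emb = rng.normal(size=(n_students, emb_dim))
W = rng.normal(size=(emb_dim, n_child))
b = np.zeros(n_child)
prof_2 = 1.0 / (1.0 + np.exp(-(student_emb @ W + b)))

# Combine the two proficiency vectors into the final student representation
# (a simple average; the paper's actual combination may differ).
final_prof = 0.5 * (prof_1 + prof_2)
print(final_prof.shape)  # (4, 6)
```

In a trained model, `parent_prof`, `student_emb`, `W`, and `b` would be learned parameters; the projection matrix encodes the known concept hierarchy rather than being learned from scratch.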
Related papers
- Disentangling Heterogeneous Knowledge Concept Embedding for Cognitive Diagnosis on Untested Knowledge [24.363775475487117]
We propose a novel framework for Cognitive Diagnosis called Disentangling Heterogeneous Knowledge Cognitive Diagnosis (DisKCD).
We leverage course grades, exercise questions, and learning resources to learn the potential representations of students, exercises, and knowledge concepts.
We construct a heterogeneous relation graph network via students, exercises, tested knowledge concepts (TKCs), and untested knowledge concepts (UKCs).
arXiv Detail & Related papers (2024-05-25T01:49:54Z)
- Learning Structure and Knowledge Aware Representation with Large Language Models for Concept Recommendation [50.31872005772817]
Concept recommendation aims to suggest the next concept for learners to study based on their knowledge states and the human knowledge system.
Previous approaches have not effectively integrated the human knowledge system into the process of designing these educational models.
We propose a novel Structure and Knowledge Aware Representation learning framework for concept Recommendation (SKarREC)
arXiv Detail & Related papers (2024-05-21T01:35:36Z)
- Knowledge Condensation and Reasoning for Knowledge-based VQA [20.808840633377343]
Recent studies retrieve the knowledge passages from external knowledge bases and then use them to answer questions.
We propose two synergistic models: Knowledge Condensation model and Knowledge Reasoning model.
Our method achieves state-of-the-art performance on knowledge-based VQA datasets.
arXiv Detail & Related papers (2024-03-15T06:06:06Z)
- Vector-based Representation is the Key: A Study on Disentanglement and Compositional Generalization [77.57425909520167]
We show that it is possible to achieve both good concept recognition and novel concept composition.
We propose a method to reform the scalar-based disentanglement works to be vector-based to increase both capabilities.
arXiv Detail & Related papers (2023-05-29T13:05:15Z)
- Distinguish Before Answer: Generating Contrastive Explanation as Knowledge for Commonsense Question Answering [61.53454387743701]
We propose CPACE, a concept-centric Prompt-bAsed Contrastive Explanation Generation model.
CPACE converts obtained symbolic knowledge into a contrastive explanation for better distinguishing the differences among given candidates.
We conduct a series of experiments on three widely-used question-answering datasets: CSQA, QASC, and OBQA.
arXiv Detail & Related papers (2023-05-14T12:12:24Z)
- COPEN: Probing Conceptual Knowledge in Pre-trained Language Models [60.10147136876669]
Conceptual knowledge is fundamental to human cognition and knowledge bases.
Existing knowledge probing works only focus on factual knowledge of pre-trained language models (PLMs) and ignore conceptual knowledge.
We design three tasks to probe whether PLMs organize entities by conceptual similarities, learn conceptual properties, and conceptualize entities in contexts.
For the tasks, we collect and annotate 24k data instances covering 393 concepts, which is COPEN, a COnceptual knowledge Probing bENchmark.
arXiv Detail & Related papers (2022-11-08T08:18:06Z)
- Dual Embodied-Symbolic Concept Representations for Deep Learning [0.8722210937404288]
We advocate the use of a dual-level model for concept representations.
The embodied level consists of concept-oriented feature representations, and the symbolic level consists of concept graphs.
We discuss two important use cases: embodied-symbolic knowledge distillation for few-shot class incremental learning, and embodied-symbolic fused representation for image-text matching.
arXiv Detail & Related papers (2022-03-01T16:40:12Z)
- Graph-based Exercise- and Knowledge-Aware Learning Network for Student Performance Prediction [8.21303828329009]
We propose a Graph-based Exercise- and Knowledge-Aware Learning Network for accurate student score prediction.
We learn students' mastery of exercises and knowledge concepts respectively to model the two-fold effects of exercises and knowledge concepts.
arXiv Detail & Related papers (2021-06-01T06:53:17Z)
- HALMA: Humanlike Abstraction Learning Meets Affordance in Rapid Problem Solving [104.79156980475686]
Humans learn compositional and causal abstraction, i.e., knowledge, in response to the structure of naturalistic tasks.
We argue there shall be three levels of generalization in how an agent represents its knowledge: perceptual, conceptual, and algorithmic.
This benchmark is centered around a novel task domain, HALMA, for visual concept development and rapid problem-solving.
arXiv Detail & Related papers (2021-02-22T20:37:01Z)
- Multi-level Knowledge Distillation [13.71183256776644]
We introduce Multi-level Knowledge Distillation (MLKD) to transfer richer representational knowledge from teacher to student networks.
MLKD employs three novel teacher-student similarities: individual similarity, relational similarity, and categorical similarity.
Experiments demonstrate that MLKD outperforms other state-of-the-art methods on both similar-architecture and cross-architecture tasks.
arXiv Detail & Related papers (2020-12-01T15:27:15Z)
- A Competence-aware Curriculum for Visual Concepts Learning via Question Answering [95.35905804211698]
We propose a competence-aware curriculum for visual concept learning in a question-answering manner.
We design a neural-symbolic concept learner for learning the visual concepts and a multi-dimensional Item Response Theory (mIRT) model for guiding the learning process.
Experimental results on CLEVR show that with a competence-aware curriculum, the proposed method achieves state-of-the-art performances.
arXiv Detail & Related papers (2020-07-03T05:08:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.