Leaf: Multiple-Choice Question Generation
- URL: http://arxiv.org/abs/2201.09012v1
- Date: Sat, 22 Jan 2022 10:17:53 GMT
- Title: Leaf: Multiple-Choice Question Generation
- Authors: Kristiyan Vachev, Momchil Hardalov, Georgi Karadzhov, Georgi Georgiev,
Ivan Koychev, Preslav Nakov
- Abstract summary: We present Leaf, a system for generating multiple-choice questions from factual text.
In addition to being very well suited for the classroom, Leaf could also be used in an industrial setting.
- Score: 19.910992586616477
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Testing with quiz questions has proven to be an effective way to assess and
improve the educational process. However, manually creating quizzes is tedious
and time-consuming. To address this challenge, we present Leaf, a system for
generating multiple-choice questions from factual text. In addition to being
very well suited for the classroom, Leaf could also be used in an industrial
setting, e.g., to facilitate onboarding and knowledge sharing, or as a
component of chatbots, question answering systems, or Massive Open Online
Courses (MOOCs). The code and the demo are available on
https://github.com/KristiyanVachev/Leaf-Question-Generation.
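Leaf's actual implementation lives in the linked repository; as a purely illustrative sketch, the kind of multiple-choice item such a system produces can be modeled as a question, a correct answer, and a set of distractors (the data structure and shuffling logic below are assumptions for illustration, not Leaf's code):

```python
import random
from dataclasses import dataclass, field


@dataclass
class MCQ:
    """A single multiple-choice item: one correct answer plus distractors."""
    question: str
    answer: str
    distractors: list = field(default_factory=list)

    def options(self, seed: int = 0) -> list:
        """Return the answer and distractors in a reproducibly shuffled order."""
        opts = [self.answer] + list(self.distractors)
        random.Random(seed).shuffle(opts)
        return opts


item = MCQ(
    question="What is the capital of Bulgaria?",
    answer="Sofia",
    distractors=["Plovdiv", "Varna", "Burgas"],
)
print(item.options())
```

A generation system fills these fields automatically: the question and answer from the factual text, and the distractors from semantically related but incorrect candidates.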
Related papers
- How Teachers Can Use Large Language Models and Bloom's Taxonomy to
Create Educational Quizzes [5.487297537295827]
This paper applies a large language model-based QG approach where questions are generated with learning goals derived from Bloom's taxonomy.
The results demonstrate that teachers prefer to write quizzes with automatically generated questions, and that such quizzes have no loss in quality compared to handwritten versions.
arXiv Detail & Related papers (2024-01-11T13:47:13Z)
- Improving Question Generation with Multi-level Content Planning [70.37285816596527]
This paper addresses the problem of generating questions from a given context and an answer, specifically focusing on questions that require multi-hop reasoning across an extended context.
We propose MultiFactor, a novel QG framework based on multi-level content planning. Specifically, MultiFactor includes two components: FA-model, which simultaneously selects key phrases and generates full answers, and Q-model which takes the generated full answer as an additional input to generate questions.
arXiv Detail & Related papers (2023-10-20T13:57:01Z)
- Mask and Cloze: Automatic Open Cloze Question Generation using a Masked Language Model [0.0]
Despite its benefits, the open cloze test has seen only sporadic adoption in educational settings.
We developed CLOZER, an automatic open cloze question generator.
A comparative experiment with human-generated questions also reveals that CLOZER can generate OCQs better than the average non-native English teacher.
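CLOZER's masked-language-model scoring is beyond the scope of a short snippet, but the basic shape of an open cloze question, blanking out a target word and accepting the original word as the answer, can be sketched as follows (the function name and signature are illustrative, not CLOZER's API; a system like CLOZER would additionally use a masked language model to choose which word to blank and to judge alternative answers):

```python
def make_cloze(sentence: str, target: str, blank: str = "____") -> tuple:
    """Blank out the first occurrence of `target`; return (question, answer)."""
    words = sentence.split()
    if target not in words:
        raise ValueError(f"{target!r} not found in sentence")
    idx = words.index(target)
    question = " ".join(words[:idx] + [blank] + words[idx + 1:])
    return question, target


q, a = make_cloze("Quizzes are an effective way to assess learning", "effective")
print(q)  # Quizzes are an ____ way to assess learning
print(a)  # effective
```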
arXiv Detail & Related papers (2022-05-15T07:03:09Z) - Educational Question Generation of Children Storybooks via Question Type Distribution Learning and Event-Centric Summarization [67.1483219601714]
We propose a novel question generation method that first learns the question type distribution of an input story paragraph.
We finetune a pre-trained transformer-based sequence-to-sequence model using silver samples composed by educational question-answer pairs.
Our work indicates the necessity of decomposing question type distribution learning and event-centric summary generation for educational question generation.
arXiv Detail & Related papers (2022-03-27T02:21:19Z) - Continuous Examination by Automatic Quiz Assessment Using Spiral Codes
and Image Processing [69.35569554213679]
Paper quizzes are affordable and readily available for classroom education on campus.
However, correcting them is a considerable obstacle.
We suggest mitigating the issue with a novel image processing technique.
arXiv Detail & Related papers (2022-01-26T22:58:15Z)
- MixQG: Neural Question Generation with Mixed Answer Types [54.23205265351248]
We propose a neural question generator, MixQG, to bridge this gap.
We combine 9 question answering datasets with diverse answer types, including yes/no, multiple-choice, extractive, and abstractive answers.
Our model outperforms existing work in both seen and unseen domains.
arXiv Detail & Related papers (2021-10-15T16:03:40Z)
- Generating Answer Candidates for Quizzes and Answer-Aware Question Generators [16.44011627249311]
We propose a model that can generate a specified number of answer candidates for a given passage of text.
Our experiments show that our proposed answer candidate generation model outperforms several baselines.
arXiv Detail & Related papers (2021-08-29T19:33:51Z)
- Dive into Deep Learning [119.30375933463156]
The book is drafted in Jupyter notebooks, seamlessly integrating exposition figures, math, and interactive examples with self-contained code.
Our goal is to offer a resource that could (i) be freely available for everyone; (ii) offer sufficient technical depth to provide a starting point on the path to becoming an applied machine learning scientist; (iii) include runnable code, showing readers how to solve problems in practice; (iv) allow for rapid updates, both by us and also by the community at large.
arXiv Detail & Related papers (2021-06-21T18:19:46Z)
- Real-Time Cognitive Evaluation of Online Learners through Automatically Generated Questions [0.0]
The paper presents an approach to automatically generate questions from a given video lecture.
The generated questions aim to evaluate learners' lower-level cognitive abilities.
arXiv Detail & Related papers (2021-06-06T05:45:56Z)
- Retrieve, Program, Repeat: Complex Knowledge Base Question Answering via Alternate Meta-learning [56.771557756836906]
We present a novel method that automatically learns a retrieval model alternately with the programmer from weak supervision.
Our system leads to state-of-the-art performance on a large-scale task for complex question answering over knowledge bases.
arXiv Detail & Related papers (2020-10-29T18:28:16Z)
- Procedural Generation of STEM Quizzes [0.0]
We argue that procedural question generation greatly facilitates the task of creating varied, formative, up-to-date, adaptive question banks for STEM quizzes.
We present and evaluate a proof-of-concept Python API for script-based question generation.
A side advantage of our system is that the question bank is actually embedded in Python code, making collaboration, version control, and maintenance tasks very easy.
arXiv Detail & Related papers (2020-09-08T17:15:16Z)
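The script-based question generation the STEM-quiz entry describes can be illustrated with a small, self-contained sketch in the same spirit (this is not the paper's actual API; the function name, item format, and distractor strategy are assumptions). Each call draws random parameters, so every run of the script yields a fresh variant of the same question template:

```python
import random


def gen_quadratic_question(rng: random.Random) -> dict:
    """Procedurally generate one 'solve the quadratic' item with distractors."""
    # Draw two distinct integer roots, then build the expanded polynomial:
    # (x - r1)(x - r2) = x^2 - (r1 + r2)x + r1*r2
    r1, r2 = rng.sample(range(1, 10), 2)
    b, c = -(r1 + r2), r1 * r2
    question = f"Find the larger root of x^2 + ({b})x + ({c}) = 0."
    answer = max(r1, r2)
    # Plausible wrong answers: the smaller root and off-by-one values.
    distractors = sorted({min(r1, r2), answer + 1, answer - 1} - {answer})
    return {"question": question, "answer": answer, "distractors": distractors}


# A seeded question bank: reproducible, yet trivially regenerated or extended.
bank = [gen_quadratic_question(random.Random(seed)) for seed in range(3)]
for item in bank:
    print(item["question"], "->", item["answer"])
```

Because the bank is ordinary code, it inherits the workflow advantages the entry mentions: templates can be version-controlled, reviewed, and regenerated with new parameters at any time.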
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.