Keeping Teams in the Game: Predicting Dropouts in Online Problem-Based
Learning Competition
- URL: http://arxiv.org/abs/2312.16362v1
- Date: Wed, 27 Dec 2023 00:08:19 GMT
- Authors: Aditya Panwar, Ashwin T S, Ramkumar Rajendran, Kavi Arya
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Online learning and MOOCs have become increasingly popular in recent years,
and the trend will continue, given the technology boom. There is a dire need to
observe learners' behavior in these online courses, similar to what instructors
do in a face-to-face classroom. Learners' strategies and activities become
crucial to understanding their behavior. One major challenge in online courses
is predicting and preventing dropout behavior. While several studies have tried
to perform such analysis, there is still a shortage of studies that employ
different data streams to understand and predict dropout rates. Moreover,
studies rarely use a fully online team-based collaborative environment as their
context. Thus, the current study employs an online longitudinal problem-based
learning (PBL) collaborative robotics competition as the testbed. Through
methodological triangulation, the study aims to predict dropout behavior via
the contributions of Discourse discussion forum 'activities' of participating
teams, along with a self-reported Online Learning Strategies Questionnaire
(OSLQ). The study also uses qualitative interviews to enhance the ground truth
and results. The OSLQ data is collected from more than 4,000 participants.
Furthermore, the study seeks to establish the reliability of OSLQ to advance
research within online environments. Various Machine Learning algorithms are
applied to analyze the data. The findings demonstrate the reliability of OSLQ
with our substantial sample size and reveal promising results for predicting
the dropout rate in online competition.
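The abstract does not name the specific reliability measure or the machine learning algorithms used. As an illustrative sketch only: questionnaire reliability is commonly assessed with Cronbach's alpha, and dropout prediction from forum-activity counts can be framed as binary classification. All data below is synthetic; the feature choices (post and reply counts), the latent-trait model, and the logistic-regression baseline are assumptions for illustration, not the paper's method.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

rng = np.random.default_rng(0)

# Synthetic Likert-style responses: six items driven by one latent trait,
# so internal consistency (alpha) should come out high.
trait = rng.normal(size=(500, 1))
responses = trait + 0.5 * rng.normal(size=(500, 6))
alpha = cronbach_alpha(responses)

# Synthetic forum-activity features (posts, replies) for 200 teams:
# teams that drop out (y = 1) tend to post less than persisting teams.
X = np.vstack([rng.normal(2.0, 1.0, size=(100, 2)),   # dropout teams
               rng.normal(8.0, 1.0, size=(100, 2))])  # persisting teams
y = np.r_[np.ones(100), np.zeros(100)]

# Plain logistic regression trained by batch gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.05 * X.T @ grad / len(y)
    b -= 0.05 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = (pred == y).mean()
```

The means, noise scales, and the two-feature setup are purely illustrative; the paper reports reliability on real OSLQ responses from over 4,000 participants and evaluates several classifiers rather than a single hand-rolled model.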
Related papers
- What Makes CLIP More Robust to Long-Tailed Pre-Training Data? A Controlled Study for Transferable Insights [67.72413262980272]
Severe data imbalance naturally exists among web-scale vision-language datasets.
We find CLIP pre-trained thereupon exhibits notable robustness to the data imbalance compared to supervised learning.
The robustness and discriminability of CLIP improve with more descriptive language supervision, larger data scale, and broader open-world concepts.
arXiv Detail & Related papers (2024-05-31T17:57:24Z)
- How does online teamwork change student communication patterns in programming courses? [0.0]
Recent studies have shown that peer communication positively affects learning outcomes of online teaching.
In this study, we compare communication patterns in MOOCs where peer communication is limited with those of a blended course in which students are involved in online peer instruction.
arXiv Detail & Related papers (2022-04-08T18:34:52Z)
- Online Continual Learning with Natural Distribution Shifts: An Empirical Study with Visual Data [101.6195176510611]
"Online" continual learning enables evaluating both information retention and online learning efficacy.
In online continual learning, each incoming small batch of data is first used for testing and then added to the training set, making the problem truly online.
We introduce a new benchmark for online continual visual learning that exhibits large scale and natural distribution shifts.
arXiv Detail & Related papers (2021-08-20T06:17:20Z)
- Comparative Study of Learning Outcomes for Online Learning Platforms [47.5164159412965]
Personalization and active learning are key aspects to successful learning.
We run a comparative head-to-head study of learning outcomes for two popular online learning platforms.
arXiv Detail & Related papers (2021-04-15T20:40:24Z)
- Social Engagement versus Learning Engagement -- An Exploratory Study of FutureLearn Learners [61.58283466715385]
Massive Open Online Courses (MOOCs) continue to see increasing enrolment, but only a small percent of enrolees completes the MOOCs.
This study is particularly concerned with how learners interact with peers, along with their study progression in MOOCs.
The study was conducted on the less explored FutureLearn platform, which employs a social constructivist approach and promotes collaborative learning.
arXiv Detail & Related papers (2020-08-11T16:09:10Z)
- Peer-inspired Student Performance Prediction in Interactive Online Question Pools with Graph Neural Network [56.62345811216183]
We propose a novel approach using Graph Neural Networks (GNNs) to achieve better student performance prediction in interactive online question pools.
Specifically, we model the relationship between students and questions using student interactions to construct the student-interaction-question network.
We evaluate the effectiveness of our approach on a real-world dataset consisting of 104,113 mouse trajectories generated in the problem-solving process of over 4000 students on 1631 questions.
arXiv Detail & Related papers (2020-08-04T14:55:32Z)
- EPARS: Early Prediction of At-risk Students with Online and Offline Learning Behaviors [55.33024245762306]
Early prediction of students at risk (STAR) is an effective and significant means to provide timely intervention for dropout and suicide.
Existing works mostly rely on either online or offline learning behaviors, which are not comprehensive enough to capture the whole learning process.
We propose a novel algorithm (EPARS) that can predict STAR early in a semester by modeling online and offline learning behaviors.
arXiv Detail & Related papers (2020-06-06T12:56:26Z)
- Identifying At-Risk K-12 Students in Multimodal Online Environments: A Machine Learning Approach [23.02984017971824]
It is crucial to have a dropout warning framework to preemptively identify K-12 students who are at risk of dropping out of online courses.
We develop a machine learning framework to conduct accurate at-risk student identification specialized for K-12 multimodal online environments.
arXiv Detail & Related papers (2020-03-21T14:34:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.