Are Top School Students More Critical of Their Professors? Mining
Comments on RateMyProfessor.com
- URL: http://arxiv.org/abs/2101.12339v1
- Date: Sat, 23 Jan 2021 20:01:36 GMT
- Title: Are Top School Students More Critical of Their Professors? Mining
Comments on RateMyProfessor.com
- Authors: Ziqi Tang, Yutong Wang, Jiebo Luo
- Abstract summary: Student reviews and comments on RateMyProfessor.com reflect realistic learning experiences of students.
Our study proves that student reviews and comments contain crucial information and can serve as essential references for enrollment in courses and universities.
- Score: 83.2634062100579
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Student reviews and comments on RateMyProfessor.com reflect realistic
learning experiences of students. Such information provides a large-scale data
source to examine the teaching quality of the lecturers. In this paper, we
propose an in-depth analysis of these comments. First, we partition our data
into different comparison groups. Next, we perform exploratory data analysis to
delve into the data. Furthermore, we employ Latent Dirichlet Allocation and
sentiment analysis to extract topics and understand the sentiments associated
with the comments. We uncover interesting insights about the characteristics of
both college students and professors. Our study proves that student reviews and
comments contain crucial information and can serve as essential references for
enrollment in courses and universities.
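The pipeline described in the abstract (topic extraction with Latent Dirichlet Allocation plus sentiment analysis of the comments) can be illustrated with a minimal sketch. The library choices (scikit-learn's LatentDirichletAllocation, NLTK's VADER), the toy comments, and all parameter values below are assumptions for illustration only; the paper does not specify its exact implementation.

```python
# Minimal sketch: LDA topic extraction + per-comment sentiment scoring.
# Toy comments, topic count, and library choices are illustrative assumptions.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

comments = [
    "Great lectures, very clear explanations and fair exams.",
    "Grading felt harsh and the workload was overwhelming.",
    "Office hours were helpful, but the pace was too fast.",
]

# Bag-of-words representation of the comments.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(comments)

# Fit LDA; the number of topics (3 here) is a placeholder choice.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(X)

# Show the top words of each topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")

# Score each comment's sentiment with VADER (one plausible off-the-shelf tool).
nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()
for c in comments:
    print(round(sia.polarity_scores(c)["compound"], 3), c)
```

On a real crawl of RateMyProfessor.com comments, the same two outputs (topic keywords and compound sentiment scores) could then be compared across the comparison groups mentioned in the abstract.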
Related papers
- LLMs Assist NLP Researchers: Critique Paper (Meta-)Reviewing [106.45895712717612]
Large language models (LLMs) have shown remarkable versatility in various generative tasks.
This study focuses on the topic of LLMs assisting NLP researchers.
To our knowledge, this is the first work to provide such a comprehensive analysis.
arXiv Detail & Related papers (2024-06-24T01:30:22Z)
- A Literature Review of Literature Reviews in Pattern Analysis and Machine Intelligence [58.6354685593418]
This paper proposes several article-level, field-normalized, and large language model-empowered bibliometric indicators to evaluate reviews.
The newly emerging AI-generated literature reviews are also appraised.
This work offers insights into the current challenges of literature reviews and envisions future directions for their development.
arXiv Detail & Related papers (2024-02-20T11:28:50Z)
- Summative Student Course Review Tool Based on Machine Learning Sentiment Analysis to Enhance Life Science Feedback Efficacy [4.518390136757588]
We show a novel approach to summarizing and organizing students' opinions by analyzing their sentiment towards a course as a function of the language and vocabulary used.
This analysis is derived from their responses to a general comment section encountered at the end of post-course review surveys.
arXiv Detail & Related papers (2023-01-15T19:56:56Z)
- SETSum: Summarization and Visualization of Student Evaluations of Teaching [74.76373136325032]
Student Evaluations of Teaching (SETs) are widely used in colleges and universities.
SETSum provides organized illustrations of SET findings to instructors and other reviewers.
arXiv Detail & Related papers (2022-07-08T01:40:11Z)
- Learning Opinion Summarizers by Selecting Informative Reviews [81.47506952645564]
We collect a large dataset of summaries paired with user reviews for over 31,000 products, enabling supervised training.
The content of many reviews is not reflected in the human-written summaries; thus, a summarizer trained on random review subsets tends to hallucinate.
We formulate the task as jointly learning to select informative subsets of reviews and summarizing the opinions expressed in these subsets.
arXiv Detail & Related papers (2021-09-09T15:01:43Z)
- A literature survey on student feedback assessment tools and their usage in sentiment analysis [0.0]
We evaluate the effectiveness of various in-class feedback assessment methods such as Kahoot!, Mentimeter, Padlet, and polling.
We propose a sentiment analysis model for extracting the explicit suggestions from the students' qualitative feedback comments.
arXiv Detail & Related papers (2021-09-09T06:56:30Z)
- Hocalarim: Mining Turkish Student Reviews [0.0]
We introduce Hocalarim (MyProfessors), the largest student review dataset available for the Turkish language.
It consists of over 5000 professor reviews left online by students, with different aspects of education rated on a scale of 1 to 5 stars.
We investigate the properties of the dataset and present its statistics.
arXiv Detail & Related papers (2021-09-06T09:55:58Z)
- Polarity in the Classroom: A Case Study Leveraging Peer Sentiment Toward Scalable Assessment [4.588028371034406]
Accurately grading open-ended assignments in large or massive open online courses (MOOCs) is non-trivial.
In this work, we detail the process by which we create our domain-dependent lexicon and aspect-informed review form.
We end by analyzing validity and discussing conclusions from our corpus of over 6800 peer reviews from nine courses.
arXiv Detail & Related papers (2021-08-02T15:45:11Z)
- Weakly-Supervised Aspect-Based Sentiment Analysis via Joint Aspect-Sentiment Topic Embedding [71.2260967797055]
We propose a weakly-supervised approach for aspect-based sentiment analysis.
We learn <sentiment, aspect> joint topic embeddings in the word embedding space.
We then use neural models to generalize the word-level discriminative information.
arXiv Detail & Related papers (2020-10-13T21:33:24Z)
- Aspect-based Sentiment Analysis of Scientific Reviews [12.472629584751509]
We show that the distribution of aspect-based sentiments obtained from a review is significantly different for accepted and rejected papers.
As a second objective, we quantify the extent of disagreement among the reviewers refereeing a paper.
We also investigate the extent of disagreement between the reviewers and the chair and find that the inter-reviewer disagreement may have a link to the disagreement with the chair.
arXiv Detail & Related papers (2020-06-05T07:06:01Z)