ALEBk: Feasibility Study of Attention Level Estimation via Blink
Detection applied to e-Learning
- URL: http://arxiv.org/abs/2112.09165v1
- Date: Thu, 16 Dec 2021 19:23:56 GMT
- Title: ALEBk: Feasibility Study of Attention Level Estimation via Blink
Detection applied to e-Learning
- Authors: Roberto Daza, Daniel DeAlcala, Aythami Morales, Ruben Tolosana, Ruth
Cobos, Julian Fierrez
- Abstract summary: We experimentally evaluate the relationship between the eye blink rate and the attention level of students captured during online sessions.
Results suggest an inverse correlation between the eye blink frequency and the attention level.
Our results open a new research line to introduce this technology for attention level estimation on future e-learning platforms.
- Score: 6.325464216802613
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work presents a feasibility study of remote attention level estimation
based on eye blink frequency. We first propose an eye blink detection system
based on Convolutional Neural Networks (CNNs), which is very competitive with
related works. Using this detector, we experimentally evaluate the relationship
between the eye blink rate and the attention level of students captured during
online sessions. The experimental framework is carried out using a public
multimodal database for eye blink detection and attention level estimation
called mEBAL, which comprises data from 38 students and multiple acquisition
sensors, in particular: i) an electroencephalogram (EEG) band, which provides
time signals reflecting the student's cognitive state, and ii) RGB
and NIR cameras to capture the students' face gestures. The results achieved
suggest an inverse correlation between the eye blink frequency and the
attention level. This relation is used in our proposed method called ALEBk for
estimating the attention level as the inverse of the eye blink frequency. Our
results open a new research line to introduce this technology for attention
level estimation on future e-learning platforms, among other applications of
this kind of behavioral biometrics based on face analysis.
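The core idea of ALEBk described above, estimating attention as the inverse of the detected eye blink frequency, can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function names, the fixed observation window, and the smoothing constant `eps` are assumptions, and blink detection itself (the CNN stage) is taken as given.

```python
def blink_frequency(blink_timestamps, window_seconds):
    """Blink rate in blinks per minute over an observation window.

    blink_timestamps: times (in seconds) at which blinks were detected,
    e.g. by a CNN-based blink detector as in the paper.
    """
    if window_seconds <= 0:
        raise ValueError("window_seconds must be positive")
    return len(blink_timestamps) * 60.0 / window_seconds


def attention_score(blink_timestamps, window_seconds, eps=1e-6):
    """Attention level as the inverse of the blink frequency.

    Following the inverse correlation reported in the paper: fewer
    blinks per minute -> higher attention score. `eps` (an assumption,
    not from the paper) avoids division by zero when no blinks occur.
    """
    freq = blink_frequency(blink_timestamps, window_seconds)
    return 1.0 / (freq + eps)


# Example: 10 detected blinks over a 60-second window -> 10 blinks/min.
blinks = [1.2, 5.0, 9.8, 15.1, 20.4, 27.3, 33.0, 41.7, 50.2, 58.9]
freq = blink_frequency(blinks, 60)    # 10.0 blinks per minute
score = attention_score(blinks, 60)   # lower than with, say, 2 blinks
```

Under this rule the score is only meaningful as a relative measure: comparing windows of the same student, a window with fewer blinks is ranked as more attentive.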
Related papers
- DeepFace-Attention: Multimodal Face Biometrics for Attention Estimation with Application to e-Learning [18.36413246876648]
This work introduces an innovative method for estimating attention levels (cognitive load) using an ensemble of facial analysis techniques applied to webcam videos.
Our approach adapts state-of-the-art facial analysis technologies to quantify the users' cognitive load in the form of high or low attention.
Our method outperforms existing state-of-the-art accuracies using the public mEBAL2 benchmark.
arXiv Detail & Related papers (2024-08-10T11:39:11Z)
- mEBAL2 Database and Benchmark: Image-based Multispectral Eyeblink Detection [14.052943954940758]
This work introduces a new multispectral database and novel approaches for eyeblink detection in RGB and Near-Infrared (NIR) individual images.
mEBAL2 is the largest existing eyeblink database.
mEBAL2 includes 21,100 image sequences from 180 different students.
arXiv Detail & Related papers (2023-09-14T17:25:25Z)
- A Deep Learning Approach for the Segmentation of Electroencephalography Data in Eye Tracking Applications [56.458448869572294]
We introduce DETRtime, a novel framework for time-series segmentation of EEG data.
Our end-to-end deep learning-based framework brings advances in Computer Vision to the forefront.
Our model generalizes well in the task of EEG sleep stage segmentation.
arXiv Detail & Related papers (2022-06-17T10:17:24Z)
- Improved Speech Emotion Recognition using Transfer Learning and Spectrogram Augmentation [56.264157127549446]
Speech emotion recognition (SER) is a challenging task that plays a crucial role in natural human-computer interaction.
One of the main challenges in SER is data scarcity.
We propose a transfer learning strategy combined with spectrogram augmentation.
arXiv Detail & Related papers (2021-08-05T10:39:39Z)
- Exploring Visual Engagement Signals for Representation Learning [56.962033268934015]
We present VisE, a weakly supervised learning approach, which maps social images to pseudo labels derived by clustered engagement signals.
We then study how models trained in this way benefit subjective downstream computer vision tasks such as emotion recognition or political bias detection.
arXiv Detail & Related papers (2021-04-15T20:50:40Z)
- Dynamic Graph Modeling of Simultaneous EEG and Eye-tracking Data for Reading Task Identification [79.41619843969347]
We present a new approach, which we call AdaGTCN, for identifying human reader intent from Electroencephalogram (EEG) and eye movement (EM) data.
Our method, Adaptive Graph Temporal Convolution Network (AdaGTCN), uses an Adaptive Graph Learning Layer and Deep Neighborhood Graph Convolution Layer.
We compare our approach with several baselines to report an improvement of 6.29% on the ZuCo 2.0 dataset, along with extensive ablation experiments.
arXiv Detail & Related papers (2021-02-21T18:19:49Z)
- Emotional EEG Classification using Connectivity Features and Convolutional Neural Networks [81.74442855155843]
We introduce a new classification system that utilizes brain connectivity with a CNN and validate its effectiveness via the emotional video classification.
The level of concentration of the brain connectivity related to the emotional property of the target video is correlated with classification performance.
arXiv Detail & Related papers (2021-01-18T13:28:08Z)
- Towards Interaction Detection Using Topological Analysis on Neural Networks [55.74562391439507]
In neural networks, any interacting features must follow a strongly weighted connection to common hidden units.
We propose a new measure for quantifying interaction strength, based upon the well-received theory of persistent homology.
A Persistence Interaction detection(PID) algorithm is developed to efficiently detect interactions.
arXiv Detail & Related papers (2020-10-25T02:15:24Z)
- mEBAL: A Multimodal Database for Eye Blink Detection and Attention Level Estimation [17.279661852408335]
mEBAL is a multimodal database for eye blink detection and attention level estimation.
It comprises 6,000 samples and the corresponding attention levels from 38 different students.
arXiv Detail & Related papers (2020-06-09T15:05:08Z)
- Few-Shot Relation Learning with Attention for EEG-based Motor Imagery Classification [11.873435088539459]
Brain-Computer Interfaces (BCI) based on Electroencephalography (EEG) signals have received a lot of attention.
Motor imagery (MI) data can be used to aid rehabilitation as well as in autonomous driving scenarios.
Classification of MI signals is vital for EEG-based BCI systems.
arXiv Detail & Related papers (2020-03-03T02:34:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.