Human Reaction Intensity Estimation with Ensemble of Multi-task Networks
- URL: http://arxiv.org/abs/2303.09240v1
- Date: Thu, 16 Mar 2023 11:35:59 GMT
- Title: Human Reaction Intensity Estimation with Ensemble of Multi-task Networks
- Authors: JiYeon Oh, Daun Kim, Jae-Yeop Jeong, Yeong-Gi Hong, Jin-Woo Jeong
- Abstract summary: "Emotional Reaction Intensity" (ERI) is an important topic in the facial expression recognition task.
We propose a multi-emotional task learning-based approach and present preliminary results for the ERI challenge introduced in the 5th affective behavior analysis in-the-wild (ABAW) competition.
- Score: 2.6432771146480283
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Facial expression in-the-wild is essential for various interactive computing
domains. Especially, "Emotional Reaction Intensity" (ERI) is an important topic
in the facial expression recognition task. In this paper, we propose a
multi-emotional task learning-based approach and present preliminary results
for the ERI challenge introduced in the 5th affective behavior analysis
in-the-wild (ABAW) competition. Our method achieved a mean PCC score of
0.3254.
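In the ERI challenge, the reported score is the Pearson correlation coefficient (PCC) averaged over the seven emotion intensity dimensions annotated in the Hume-Reaction dataset (see the ABAW challenge paper below). As a rough, self-contained illustration only, and not the authors' code, the sketch below shows how such a mean PCC could be computed for an ensemble of predictors; the emotion names, array shapes, and ensemble-averaging step are assumptions made for the example.

```python
import numpy as np

# Emotion intensity dimensions commonly listed for the Hume-Reaction
# dataset; treat the names and shapes below as illustrative assumptions.
EMOTIONS = ["adoration", "amusement", "anxiety", "disgust",
            "empathic_pain", "fear", "surprise"]

def mean_pcc(preds: np.ndarray, labels: np.ndarray) -> float:
    """Pearson correlation per emotion dimension, averaged.

    preds, labels: float arrays of shape (num_samples, len(EMOTIONS)).
    """
    pccs = [np.corrcoef(preds[:, k], labels[:, k])[0, 1]
            for k in range(preds.shape[1])]
    return float(np.mean(pccs))

# Toy usage: an ensemble averages member predictions before scoring.
rng = np.random.default_rng(0)
labels = rng.uniform(0.0, 1.0, size=(100, len(EMOTIONS)))
member_preds = [np.clip(labels + rng.normal(0.0, 0.3, labels.shape), 0, 1)
                for _ in range(3)]
ensemble_pred = np.mean(member_preds, axis=0)
print(f"ensemble mean PCC: {mean_pcc(ensemble_pred, labels):.4f}")
```

Averaging per-dimension PCCs, rather than pooling all dimensions into one correlation, matches how the challenge metric is usually described; the exact evaluation protocol should be taken from the ABAW challenge paper.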
Related papers
- GPT as Psychologist? Preliminary Evaluations for GPT-4V on Visual Affective Computing [74.68232970965595]
Multimodal large language models (MLLMs) are designed to process and integrate information from multiple sources, such as text, speech, images, and videos.
This paper assesses the application of MLLMs across 5 crucial abilities for affective computing, spanning visual affective tasks and reasoning tasks.
arXiv Detail & Related papers (2024-03-09T13:56:25Z) - The 6th Affective Behavior Analysis in-the-wild (ABAW) Competition [53.718777420180395]
This paper describes the 6th Affective Behavior Analysis in-the-wild (ABAW) Competition.
The 6th ABAW Competition addresses contemporary challenges in understanding human emotions and behaviors.
arXiv Detail & Related papers (2024-02-29T16:49:38Z) - Multimodal Feature Extraction and Fusion for Emotional Reaction
Intensity Estimation and Expression Classification in Videos with
Transformers [47.16005553291036]
We present our solutions to the two sub-challenges of Affective Behavior Analysis in the wild (ABAW) 2023.
For the Expression Classification Challenge, we propose a streamlined approach that handles the challenges of classification effectively.
By studying, analyzing, and combining these features, we significantly enhance the model's accuracy for sentiment prediction in a multimodal context.
arXiv Detail & Related papers (2023-03-16T09:03:17Z) - ABAW: Valence-Arousal Estimation, Expression Recognition, Action Unit
Detection & Emotional Reaction Intensity Estimation Challenges [62.413819189049946]
5th Affective Behavior Analysis in-the-wild (ABAW) Competition is part of the respective ABAW Workshop which will be held in conjunction with IEEE Computer Vision and Pattern Recognition Conference (CVPR), 2023.
For this year's Competition, we feature two corpora: i) an extended version of the Aff-Wild2 database and ii) the Hume-Reaction dataset.
The latter dataset is an audiovisual one in which reactions of individuals to emotional stimuli have been annotated with respect to seven emotional expression intensities.
arXiv Detail & Related papers (2023-03-02T18:58:15Z) - Multi-task Cross Attention Network in Facial Behavior Analysis [7.910908058662372]
We present our solution for the Multi-Task Learning challenge of the Affective Behavior Analysis in-the-wild competition.
The challenge is a combination of three tasks: action unit detection, facial expression recognition and valence-arousal estimation.
We introduce a cross-attentive module to improve multi-task learning performance.
arXiv Detail & Related papers (2022-07-21T04:07:07Z) - Learning from Synthetic Data: Facial Expression Classification based on
Ensemble of Multi-task Networks [3.736069053271373]
"Learning from Synthetic Data" (LSD) is an important topic in the facial expression recognition task.
We propose a multi-task learning-based facial expression recognition approach.
Our method achieved a mean F1 score of 0.71.
arXiv Detail & Related papers (2022-07-20T16:41:37Z) - Prior Aided Streaming Network for Multi-task Affective Recognitionat the
2nd ABAW2 Competition [9.188777864190204]
We introduce our submission to the 2nd Affective Behavior Analysis in-the-wild (ABAW2) Competition.
In dealing with different emotion representations, we propose a multi-task streaming network.
We leverage an advanced facial expression embedding as prior knowledge.
arXiv Detail & Related papers (2021-07-08T09:35:08Z) - Computational Emotion Analysis From Images: Recent Advances and Future
Directions [79.05003998727103]
In this chapter, we aim to introduce image emotion analysis (IEA) from a computational perspective.
We begin with commonly used emotion representation models from psychology.
We then define the key computational problems that the researchers have been trying to solve.
arXiv Detail & Related papers (2021-03-19T13:33:34Z) - Continuous Emotion Recognition via Deep Convolutional Autoencoder and
Support Vector Regressor [70.2226417364135]
It is crucial that machines be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)