Leveraging Emotion-specific Features to Improve Transformer Performance
for Emotion Classification
- URL: http://arxiv.org/abs/2205.00283v1
- Date: Sat, 30 Apr 2022 14:36:04 GMT
- Authors: Shaily Desai, Atharva Kshirsagar, Aditi Sidnerlikar, Nikhil Khodake,
Manisha Marathe
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper describes the approach of team PVGs AI Club to the Emotion
Classification shared task held at WASSA 2022. This Track 2 sub-task focuses on
building models that can predict a multi-class emotion label for essays written
in response to news articles in which a person, group, or other entity is affected.
Baseline transformer models have demonstrated strong results on sequence
classification tasks, and we aim to improve on this performance through
ensembling techniques and by leveraging two variations of emotion-specific
representations. We observe better results than our baseline models, achieving
an accuracy of 0.619 and a macro F1 score of 0.520 on the emotion
classification task.
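The abstract above combines several transformer classifiers via ensembling. A minimal sketch of one common variant, soft voting over per-class probabilities, is shown below; the emotion label set, model outputs, and weighting scheme are illustrative assumptions, not the authors' exact configuration.

```python
# Soft-voting ensemble sketch: average each base model's per-class
# probabilities, then take the argmax. The label set and the two
# hypothetical model outputs below are illustrative only.
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

def soft_vote(prob_matrices, weights=None):
    """Combine per-model probabilities of shape (n_models, n_examples, n_classes)."""
    probs = np.asarray(prob_matrices, dtype=float)
    if weights is None:
        # Uniform weighting when no validation-based weights are supplied.
        weights = np.ones(probs.shape[0]) / probs.shape[0]
    avg = np.tensordot(weights, probs, axes=1)  # -> (n_examples, n_classes)
    return avg.argmax(axis=-1)

# Two hypothetical models disagree on one essay; averaging settles it.
m1 = [[0.6, 0.3, 0.1, 0.0, 0.0, 0.0, 0.0]]  # model 1 leans "anger"
m2 = [[0.2, 0.6, 0.2, 0.0, 0.0, 0.0, 0.0]]  # model 2 leans "disgust"
pred = soft_vote([m1, m2])
print(EMOTIONS[pred[0]])  # prints "disgust" (avg: anger 0.40 vs disgust 0.45)
```

Weighted voting (e.g. weights proportional to each model's validation F1) drops in via the `weights` argument without further changes.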
Related papers
- Self-Training with Pseudo-Label Scorer for Aspect Sentiment Quad Prediction [54.23208041792073]
Aspect Sentiment Quad Prediction (ASQP) aims to predict all quads (aspect term, aspect category, opinion term, sentiment polarity) for a given review.
A key challenge in the ASQP task is the scarcity of labeled data, which limits the performance of existing methods.
We propose a self-training framework with a pseudo-label scorer, wherein a scorer assesses the match between reviews and their pseudo-labels.
arXiv Detail & Related papers (2024-06-26T05:30:21Z) - Transformer based neural networks for emotion recognition in conversations [4.915541242112533]
This paper outlines the approach of the ISDS-NLP team in SemEval 2024 Task 10: Emotion Discovery and Reasoning its Flip in Conversation (EDiReF).
arXiv Detail & Related papers (2024-05-18T08:05:05Z) - PetKaz at SemEval-2024 Task 3: Advancing Emotion Classification with an LLM for Emotion-Cause Pair Extraction in Conversations [4.463184061618504]
We present our submission to SemEval-2024 Task 3, "The Competition of Multimodal Emotion Cause Analysis in Conversations".
Our approach relies on combining fine-tuned GPT-3.5 for emotion classification and a BiLSTM-based neural network to detect causes.
arXiv Detail & Related papers (2024-04-08T13:25:03Z) - Multimodal Feature Extraction and Fusion for Emotional Reaction
Intensity Estimation and Expression Classification in Videos with
Transformers [47.16005553291036]
We present our solutions to the two sub-challenges of Affective Behavior Analysis in the wild (ABAW) 2023.
For the Expression Classification Challenge, we propose a streamlined approach that handles the challenges of classification effectively.
By studying, analyzing, and combining these features, we significantly enhance the model's accuracy for sentiment prediction in a multimodal context.
arXiv Detail & Related papers (2023-03-16T09:03:17Z) - Bag of Tricks for Effective Language Model Pretraining and Downstream
Adaptation: A Case Study on GLUE [93.98660272309974]
This report briefly describes our submission Vega v1 on the General Language Understanding Evaluation leaderboard.
GLUE is a collection of nine natural language understanding tasks, including question answering, linguistic acceptability, sentiment analysis, text similarity, paraphrase detection, and natural language inference.
With our optimized pretraining and fine-tuning strategies, our 1.3-billion-parameter model sets a new state of the art on 4 of the 9 tasks, achieving the best average score of 91.3.
arXiv Detail & Related papers (2023-02-18T09:26:35Z) - Improving the Generalizability of Text-Based Emotion Detection by
Leveraging Transformers with Psycholinguistic Features [27.799032561722893]
We propose approaches for text-based emotion detection that leverage transformer models (BERT and RoBERTa) in combination with Bidirectional Long Short-Term Memory (BiLSTM) networks trained on a comprehensive set of psycholinguistic features.
We find that the proposed hybrid models improve the ability to generalize to out-of-distribution data compared to a standard transformer-based approach.
arXiv Detail & Related papers (2022-12-19T13:58:48Z) - Optimize_Prime@DravidianLangTech-ACL2022: Emotion Analysis in Tamil [1.0066310107046081]
This paper performs an emotion analysis of social media comments in Tamil.
The task was to classify social media comments into emotion categories such as Joy, Anger, Trust, and Disgust.
arXiv Detail & Related papers (2022-04-19T18:47:18Z) - Guiding Generative Language Models for Data Augmentation in Few-Shot
Text Classification [59.698811329287174]
We leverage GPT-2 for generating artificial training instances in order to improve classification performance.
Our results show that fine-tuning GPT-2 on a handful of labeled instances leads to consistent classification improvements.
arXiv Detail & Related papers (2021-11-17T12:10:03Z) - WASSA@IITK at WASSA 2021: Multi-task Learning and Transformer Finetuning
for Emotion Classification and Empathy Prediction [0.0]
This paper describes our contribution to the WASSA 2021 shared task on Empathy Prediction and Emotion Classification.
The broad goal of this task was to model an empathy score, a distress score, and the overall level of emotion of an essay written in response to a newspaper article associated with harm to someone.
We make extensive use of the ELECTRA model, along with advanced deep learning approaches such as multi-task learning.
arXiv Detail & Related papers (2021-04-20T08:24:10Z) - Modality-Transferable Emotion Embeddings for Low-Resource Multimodal
Emotion Recognition [55.44502358463217]
We propose a modality-transferable model with emotion embeddings to tackle the aforementioned issues.
Our model achieves state-of-the-art performance on most of the emotion categories.
Our model also outperforms existing baselines in the zero-shot and few-shot scenarios for unseen emotions.
arXiv Detail & Related papers (2020-09-21T06:10:39Z) - EmoGraph: Capturing Emotion Correlations using Graph Networks [71.53159402053392]
We propose EmoGraph that captures the dependencies among different emotions through graph networks.
EmoGraph outperforms strong baselines, especially for macro-F1.
An experiment illustrates that the captured emotion correlations can also benefit a single-label classification task.
arXiv Detail & Related papers (2020-08-21T08:59:29Z)
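Several entries above report macro F1 (the main paper's 0.520, EmoGraph's macro-F1 gains). As a quick reference, macro F1 is the unweighted mean of per-class F1 scores, so rare emotion classes count as much as frequent ones. A plain-Python sketch with illustrative labels follows; in practice `sklearn.metrics.f1_score(average="macro")` computes the same quantity.

```python
# Macro F1: compute F1 per class, then average with equal weight.
# The example labels below are illustrative, not from any paper above.
def macro_f1(y_true, y_pred):
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Perfect on the majority class, wrong on the rare class:
# the missed rare class drags the unweighted average down.
y_true = ["joy", "joy", "joy", "anger"]
y_pred = ["joy", "joy", "joy", "joy"]
print(round(macro_f1(y_true, y_pred), 3))  # prints 0.429
```

This equal weighting is why macro F1 is the preferred metric for imbalanced emotion datasets, where plain accuracy can look high while rare emotions are ignored.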
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.