Towards a Unified Framework for Emotion Analysis
- URL: http://arxiv.org/abs/2012.00190v1
- Date: Tue, 1 Dec 2020 00:54:13 GMT
- Title: Towards a Unified Framework for Emotion Analysis
- Authors: Sven Buechel, Luise Modersohn, and Udo Hahn
- Abstract summary: EmoCoder is a modular encoder-decoder architecture that generalizes emotion analysis over different tasks.
EmoCoder learns an interpretable language-independent representation of emotions.
- Score: 12.369106010767283
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present EmoCoder, a modular encoder-decoder architecture that generalizes
emotion analysis over different tasks (sentence-level, word-level,
label-to-label mapping), domains (natural languages and their registers), and
label formats (e.g., polarity classes, basic emotions, and affective
dimensions). Experiments on 14 datasets indicate that EmoCoder learns an
interpretable language-independent representation of emotions, allows seamless
absorption of state-of-the-art models, and maintains strong prediction quality,
even when tested on unseen combinations of domains and label formats.
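To make the modular design more concrete, here is a minimal, hypothetical PyTorch sketch of the general idea: task- and domain-specific encoders map their inputs into a shared emotion representation, and format-specific decoders map that representation into a particular label format (e.g., polarity classes or valence-arousal-dominance scores), so encoders and decoders can be recombined freely. All module names, dimensions, and heads below are illustrative assumptions and do not reproduce the authors' actual EmoCoder implementation.

```python
# Minimal, illustrative sketch of a modular encoder-decoder for emotion analysis.
# All names, dimensions, and heads are assumptions for illustration only;
# they do not reproduce the actual EmoCoder implementation.
import torch
import torch.nn as nn

EMO_DIM = 16  # size of the shared, language-independent emotion representation


class SentenceEncoder(nn.Module):
    """Maps pooled sentence features (e.g., from a pretrained LM) into the shared emotion space."""
    def __init__(self, feat_dim: int = 768):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, EMO_DIM))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.proj(feats)


class VADDecoder(nn.Module):
    """Decodes the shared representation into valence-arousal-dominance scores."""
    def __init__(self):
        super().__init__()
        self.head = nn.Linear(EMO_DIM, 3)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.head(z)


class PolarityDecoder(nn.Module):
    """Decodes the shared representation into positive/neutral/negative class logits."""
    def __init__(self):
        super().__init__()
        self.head = nn.Linear(EMO_DIM, 3)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.head(z)


class LabelEncoder(nn.Module):
    """Maps labels in one format (here: VAD scores) back into the shared space,
    enabling label-to-label mapping when chained with another decoder."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(3, EMO_DIM)

    def forward(self, vad: torch.Tensor) -> torch.Tensor:
        return self.proj(vad)


# Any encoder can be combined with any decoder through the shared space:
sent_feats = torch.randn(4, 768)            # pooled features for 4 sentences
z = SentenceEncoder()(sent_feats)           # shared emotion representation
vad_scores = VADDecoder()(z)                # sentence -> affective dimensions
polarity_logits = PolarityDecoder()(z)      # sentence -> polarity classes

# Label-to-label mapping: encode VAD labels, decode them as polarity classes.
mapped = PolarityDecoder()(LabelEncoder()(vad_scores.detach()))
```

Chaining a label encoder with a different decoder (last line of the sketch) corresponds to the label-to-label mapping task: labels given in one format are translated into another via the shared emotion space.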
Related papers
- MEMO-Bench: A Multiple Benchmark for Text-to-Image and Multimodal Large Language Models on Human Emotion Analysis [53.012111671763776]
This study introduces MEMO-Bench, a comprehensive benchmark consisting of 7,145 portraits, each depicting one of six different emotions.
Results demonstrate that existing T2I models are more effective at generating positive emotions than negative ones.
Although MLLMs show a certain degree of effectiveness in distinguishing and recognizing human emotions, they fall short of human-level accuracy.
arXiv Detail & Related papers (2024-11-18T02:09:48Z)
- Expansion Quantization Network: An Efficient Micro-emotion Annotation and Detection Framework [2.0209172586699173]
We propose an all-labels and training-set label regression method to map label values to energy intensity levels.
This led to the establishment of the Emotion Quantization Network (EQN) framework for micro-emotion detection and annotation.
The EQN framework is the first to achieve automatic micro-emotion annotation with energy-level scores.
arXiv Detail & Related papers (2024-11-09T12:09:26Z)
- Emotion Rendering for Conversational Speech Synthesis with Heterogeneous Graph-Based Context Modeling [50.99252242917458]
Conversational Speech Synthesis (CSS) aims to accurately express an utterance with the appropriate prosody and emotional inflection within a conversational setting.
To address the issue of data scarcity, we meticulously create emotional labels in terms of category and intensity.
Our model outperforms the baseline models in understanding and rendering emotions.
arXiv Detail & Related papers (2023-12-19T08:47:50Z)
- Emotion Embeddings — Learning Stable and Homogeneous Abstractions from Heterogeneous Affective Datasets [4.720033725720261]
We propose a training procedure that learns a shared latent representation for emotions.
Experiments on a wide range of heterogeneous affective datasets indicate that this approach yields the desired interoperability.
arXiv Detail & Related papers (2023-08-15T16:39:10Z)
- Leveraging Label Correlations in a Multi-label Setting: A Case Study in Emotion [0.0]
We exploit label correlations in multi-label emotion recognition models to improve emotion detection.
We demonstrate state-of-the-art performance across Spanish, English, and Arabic in SemEval 2018 Task 1 E-c using monolingual BERT-based models.
arXiv Detail & Related papers (2022-10-28T02:27:18Z)
- Chat-Capsule: A Hierarchical Capsule for Dialog-level Emotion Analysis [70.98130990040228]
We propose a Context-based Hierarchical Attention Capsule (Chat-Capsule) model, which models both utterance-level and dialog-level emotions and their interrelations.
On a dialog dataset collected from customer support of an e-commerce platform, our model is also able to predict user satisfaction and emotion curve category.
arXiv Detail & Related papers (2022-03-23T08:04:30Z)
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and achieves state-of-the-art results for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
- Emotion-Regularized Conditional Variational Autoencoder for Emotional Response Generation [39.392929591449885]
This paper presents an emotion-regularized conditional variational autoencoder (Emo-CVAE) model for generating emotional conversation responses.
Experimental results show that our Emo-CVAE model can learn a more informative and structured latent space than a conventional CVAE model.
arXiv Detail & Related papers (2021-04-18T13:53:20Z)
- EmoGraph: Capturing Emotion Correlations using Graph Networks [71.53159402053392]
We propose EmoGraph, which captures the dependencies among different emotions through graph networks.
EmoGraph outperforms strong baselines, especially on macro-F1.
An experiment illustrates that the captured emotion correlations can also benefit a single-label classification task (see the illustrative sketch after this list).
arXiv Detail & Related papers (2020-08-21T08:59:29Z)
- Facial Expression Editing with Continuous Emotion Labels [76.36392210528105]
Deep generative models have achieved impressive results in the field of automated facial expression editing.
We propose a model that can be used to manipulate facial expressions in facial images according to continuous two-dimensional emotion labels.
arXiv Detail & Related papers (2020-06-22T13:03:02Z)
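The graph-based treatment of emotion correlations in the EmoGraph entry above can be illustrated with a small, hypothetical sketch: emotion-label embeddings are propagated over a normalized label co-occurrence graph with a single graph-convolution step and then scored against a sentence encoding. The label set, co-occurrence counts, and dimensions below are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch: propagating emotion-label embeddings over a co-occurrence
# graph (one graph-convolution step). Labels, counts, and dimensions are
# hypothetical; this is not the EmoGraph architecture itself.
import torch

labels = ["joy", "anger", "fear", "sadness"]

# Hypothetical co-occurrence counts between labels in a multi-label corpus.
cooc = torch.tensor([
    [10., 1., 1., 0.],
    [ 1., 8., 3., 4.],
    [ 1., 3., 9., 5.],
    [ 0., 4., 5., 7.],
])

# Symmetric normalization D^{-1/2} A D^{-1/2}, as is common for graph convolutions.
deg = cooc.sum(dim=1)
d_inv_sqrt = torch.diag(deg.pow(-0.5))
adj_norm = d_inv_sqrt @ cooc @ d_inv_sqrt

# Random initial label embeddings and one propagation step with a learned weight.
label_emb = torch.randn(len(labels), 16)
weight = torch.nn.Linear(16, 16, bias=False)
propagated = torch.relu(weight(adj_norm @ label_emb))

# The propagated embeddings can then score each label against a sentence encoding.
sentence_vec = torch.randn(16)
logits = propagated @ sentence_vec   # one logit per emotion label
print(dict(zip(labels, logits.tolist())))
```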
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.