EmoGraph: Capturing Emotion Correlations using Graph Networks
- URL: http://arxiv.org/abs/2008.09378v1
- Date: Fri, 21 Aug 2020 08:59:29 GMT
- Title: EmoGraph: Capturing Emotion Correlations using Graph Networks
- Authors: Peng Xu, Zihan Liu, Genta Indra Winata, Zhaojiang Lin, Pascale Fung
- Abstract summary: We propose EmoGraph that captures the dependencies among different emotions through graph networks.
EmoGraph outperforms strong baselines, especially for macro-F1.
An experiment illustrates that the captured emotion correlations can also benefit a single-label classification task.
- Score: 71.53159402053392
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most emotion recognition methods tackle the emotion understanding task by
considering each emotion independently, ignoring their fuzzy
nature and the interconnections among them. In this paper, we explore how
emotion correlations can be captured and how they can help different classification tasks.
We propose EmoGraph that captures the dependencies among different emotions
through graph networks. These graphs are constructed by leveraging the
co-occurrence statistics among different emotion categories. Empirical results
on two multi-label classification datasets demonstrate that EmoGraph
outperforms strong baselines, especially for macro-F1. An additional experiment
illustrates that the captured emotion correlations can also benefit a single-label
classification task.
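The abstract describes building graphs from co-occurrence statistics among emotion categories. A minimal sketch of that idea, assuming edges weighted by the conditional frequency of one emotion given another (the exact normalization used by EmoGraph may differ, and the function name here is illustrative):

```python
from collections import Counter
from itertools import combinations

def cooccurrence_graph(label_sets, emotions, threshold=0.0):
    """Build a directed emotion graph from multi-label annotations.

    Edge weight w[(i, j)] approximates P(emotion_j | emotion_i): how often
    j appears among samples labeled with i. This is a sketch of the
    co-occurrence-statistics idea from the abstract, not the paper's
    exact construction.
    """
    counts = Counter()       # occurrences of each individual emotion
    pair_counts = Counter()  # ordered co-occurrence counts
    for labels in label_sets:
        for e in labels:
            counts[e] += 1
        for a, b in combinations(sorted(labels), 2):
            pair_counts[(a, b)] += 1
            pair_counts[(b, a)] += 1
    graph = {}
    for i in emotions:
        for j in emotions:
            if i == j or counts[i] == 0:
                continue
            w = pair_counts[(i, j)] / counts[i]  # conditional frequency
            if w > threshold:                    # prune weak edges
                graph[(i, j)] = w
    return graph
```

For example, with samples `[{"joy", "love"}, {"joy"}, {"anger"}, {"joy", "love"}]`, the edge joy→love gets weight 2/3 while love→joy gets weight 1.0, and anger stays isolated. The resulting weighted adjacency could then serve as the graph structure consumed by a graph network.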
Related papers
- Improved Text Emotion Prediction Using Combined Valence and Arousal Ordinal Classification [37.823815777259036]
We introduce a method for categorizing emotions from text that acknowledges and differentiates the varied similarities and distinctions among emotions.
Our approach not only preserves high accuracy in emotion prediction but also significantly reduces the magnitude of errors in cases of misclassification.
arXiv Detail & Related papers (2024-04-02T10:06:30Z)
- Unifying the Discrete and Continuous Emotion labels for Speech Emotion Recognition [28.881092401807894]
In paralinguistic analysis for emotion detection from speech, emotions have been identified with discrete or dimensional (continuous-valued) labels.
We propose a model to jointly predict continuous and discrete emotional attributes.
arXiv Detail & Related papers (2022-10-29T16:12:31Z)
- Seeking Subjectivity in Visual Emotion Distribution Learning [93.96205258496697]
Visual Emotion Analysis (VEA) aims to predict people's emotions towards different visual stimuli.
Existing methods often predict visual emotion distribution in a unified network, neglecting the inherent subjectivity of the crowd-voting process.
We propose a novel Subjectivity Appraise-and-Match Network (SAMNet) to investigate the subjectivity in visual emotion distribution.
arXiv Detail & Related papers (2022-07-25T02:20:03Z)
- Label Distribution Amendment with Emotional Semantic Correlations for Facial Expression Recognition [69.18918567657757]
We propose a new method that amends the label distribution of each facial image by leveraging correlations among expressions in the semantic space.
By comparing semantic and task class-relation graphs of each image, the confidence of its label distribution is evaluated.
Experimental results demonstrate that the proposed method is more effective than the compared state-of-the-art methods.
arXiv Detail & Related papers (2021-07-23T07:46:14Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and shows the state-of-the-art result for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
- Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition [55.44502358463217]
We propose a modality-transferable model with emotion embeddings to tackle the aforementioned issues.
Our model achieves state-of-the-art performance on most of the emotion categories.
Our model also outperforms existing baselines in the zero-shot and few-shot scenarios for unseen emotions.
arXiv Detail & Related papers (2020-09-21T06:10:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.