Cross Domain Emotion Recognition using Few Shot Knowledge Transfer
- URL: http://arxiv.org/abs/2110.05021v1
- Date: Mon, 11 Oct 2021 06:22:18 GMT
- Title: Cross Domain Emotion Recognition using Few Shot Knowledge Transfer
- Authors: Justin Olah, Sabyasachee Baruah, Digbalay Bose, and Shrikanth
Narayanan
- Abstract summary: Few-shot and zero-shot techniques can generalize across unseen emotions by projecting the documents and emotion labels onto a shared embedding space.
In this work, we explore the task of few-shot emotion recognition by transferring the knowledge gained from supervision on the GoEmotions Reddit dataset to the SemEval tweets corpus.
- Score: 21.750633928464026
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Emotion recognition from text is a challenging task due to diverse emotion
taxonomies, lack of reliable labeled data in different domains, and highly
subjective annotation standards. Few-shot and zero-shot techniques can
generalize across unseen emotions by projecting the documents and emotion
labels onto a shared embedding space. In this work, we explore the task of
few-shot emotion recognition by transferring the knowledge gained from
supervision on the GoEmotions Reddit dataset to the SemEval tweets corpus,
using different emotion representation methods. The results show that knowledge
transfer using external knowledge bases and fine-tuned encoders performs
comparably to supervised baselines while requiring minimal supervision from the
task dataset.
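The abstract's core idea, projecting documents and emotion labels onto a shared embedding space and matching by similarity, can be sketched as follows. This is a minimal illustration only: the hand-made three-dimensional vectors and the `classify` helper are assumptions for demonstration, not the paper's actual encoders, which rely on fine-tuned transformer encoders and external knowledge bases.

```python
import math

# Toy shared embedding space: documents and emotion labels live in the same
# vector space. A document is assigned the label whose embedding is nearest
# by cosine similarity, so unseen labels can be added without retraining.
# The vectors below are illustrative, not learned embeddings.
LABEL_EMBEDDINGS = {
    "joy":     [0.9, 0.1, 0.0],
    "anger":   [0.0, 0.9, 0.1],
    "sadness": [0.1, 0.0, 0.9],
}

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def classify(doc_embedding, label_embeddings=LABEL_EMBEDDINGS):
    # Zero/few-shot classification: pick the label embedding closest
    # to the document embedding.
    return max(label_embeddings,
               key=lambda lbl: cosine(doc_embedding, label_embeddings[lbl]))

print(classify([0.8, 0.2, 0.1]))  # nearest label embedding is "joy"
```

In the few-shot setting described in the paper, the label embeddings would instead come from an encoder supervised on GoEmotions, then matched against SemEval documents with little or no additional task supervision.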
Related papers
- Temporal Label Hierachical Network for Compound Emotion Recognition [16.258721361021443]
This article introduces our achievements in the 7th Affective Behavior Analysis in-the-wild (ABAW) competition.
Considering the continuity of emotions over time, we propose a time pyramid structure network for frame level emotion prediction.
At the same time, in order to address the lack of data in composite emotion recognition, we utilize fine-grained labels from the DFEW database.
arXiv Detail & Related papers (2024-07-17T19:38:44Z)
- Learning Emotion Representations from Verbal and Nonverbal Communication [7.747924294389427]
We present EmotionCLIP, the first pre-training paradigm to extract visual emotion representations from verbal and nonverbal communication.
We guide EmotionCLIP to attend to nonverbal emotion cues through subject-aware context encoding and verbal emotion cues using sentiment-guided contrastive learning.
EmotionCLIP will address the prevailing issue of data scarcity in emotion understanding, thereby fostering progress in related domains.
arXiv Detail & Related papers (2023-05-22T21:36:55Z)
- SOLVER: Scene-Object Interrelated Visual Emotion Reasoning Network [83.27291945217424]
We propose a novel Scene-Object interreLated Visual Emotion Reasoning network (SOLVER) to predict emotions from images.
To mine the emotional relationships between distinct objects, we first build up an Emotion Graph based on semantic concepts and visual features.
We also design a Scene-Object Fusion Module to integrate scenes and objects, which exploits scene features to guide the fusion process of object features with the proposed scene-based attention mechanism.
arXiv Detail & Related papers (2021-10-24T02:41:41Z)
- Uncovering the Limits of Text-based Emotion Detection [0.0]
We consider the two largest corpora for emotion classification: GoEmotions, with 58k messages labelled by readers, and Vent, with 33M writer-labelled messages.
We design a benchmark and evaluate several feature spaces and learning algorithms, including two simple yet novel models on top of BERT.
arXiv Detail & Related papers (2021-09-04T16:40:06Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
- Acted vs. Improvised: Domain Adaptation for Elicitation Approaches in Audio-Visual Emotion Recognition [29.916609743097215]
Key challenges in developing generalized automatic emotion recognition systems include scarcity of labeled data and lack of gold-standard references.
In this work, we regard the emotion elicitation approach as domain knowledge, and explore domain transfer learning techniques on emotional utterances.
arXiv Detail & Related papers (2021-04-05T15:59:31Z)
- Target Guided Emotion Aware Chat Machine [58.8346820846765]
The consistency of a response to a given post at semantic-level and emotional-level is essential for a dialogue system to deliver human-like interactions.
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
arXiv Detail & Related papers (2020-11-15T01:55:37Z)
- COSMIC: COmmonSense knowledge for eMotion Identification in Conversations [95.71018134363976]
We propose COSMIC, a new framework that incorporates different elements of commonsense such as mental states, events, and causal relations.
We show that COSMIC achieves new state-of-the-art results for emotion recognition on four different benchmark conversational datasets.
arXiv Detail & Related papers (2020-10-06T15:09:38Z)
- Emotion Carrier Recognition from Personal Narratives [74.24768079275222]
Personal Narratives (PNs) are recollections of facts, events, and thoughts from one's own experience.
We propose a novel task for Narrative Understanding: Emotion Carrier Recognition (ECR).
arXiv Detail & Related papers (2020-08-17T17:16:08Z)
- Meta Transfer Learning for Emotion Recognition [42.61707533351803]
We propose a PathNet-based transfer learning method that is able to transfer emotional knowledge learned from one visual/audio emotion domain to another visual/audio emotion domain.
Our proposed system improves emotion recognition performance, substantially outperforming recently proposed transfer learning methods based on fine-tuning pre-trained models.
arXiv Detail & Related papers (2020-06-23T00:25:28Z)
- Emotion Recognition From Gait Analyses: Current Research and Future Directions [48.93172413752614]
Gait conveys information about the walker's emotion.
The mapping between various emotions and gait patterns provides a new source for automated emotion recognition.
Gait is remotely observable, more difficult to imitate, and requires less cooperation from the subject.
arXiv Detail & Related papers (2020-03-13T08:22:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.