Visual Simplified Characters' Emotion Emulator Implementing OCC Model
- URL: http://arxiv.org/abs/2001.06190v1
- Date: Fri, 17 Jan 2020 08:41:46 GMT
- Title: Visual Simplified Characters' Emotion Emulator Implementing OCC Model
- Authors: Ana Lilia Laureano-Cruces, Laura Hernández-Domínguez, Martha
Mora-Torres, Juan-Manuel Torres-Moreno, Jaime Enrique Cabrera-López
- Abstract summary: This paper is based on a simplified view of the cognitive structure of emotions proposed by Ortony, Clore and Collins (OCC Model).
The goal of this paper is to provide a visual platform that allows us to observe changes in the characters' different emotions.
This tool was tested on stories with a contrasting variety of emotional and affective environments: Othello, Twilight, and Harry Potter, behaving sensibly and in keeping with the atmosphere in which the characters were immersed.
- Score: 0.9868246135032442
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a visual emulator of the emotions seen in
characters in stories. This system is based on a simplified view of the
cognitive structure of emotions proposed by Ortony, Clore and Collins (OCC
Model). The goal of this paper is to provide a visual platform that allows us
to observe changes in the characters' different emotions, and the intricate
interrelationships between: 1) each character's emotions, 2) their affective
relationships and actions, 3) the events that take place in the development of
a plot, and 4) the objects of desire that make up the emotional map of any
story. This tool was tested on stories with a contrasting variety of emotional
and affective environments: Othello, Twilight, and Harry Potter, behaving
sensibly and in keeping with the atmosphere in which the characters were
immersed.
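
The data model behind such an emulator can be illustrated with a small sketch. The following Python fragment is a minimal, hedged reconstruction of the kind of state the abstract describes: one record per character holding emotion intensities, affective relationships, and objects of desire, updated as plot events unfold. It is not the authors' implementation; the emotion subset, the [0, 1] intensity scale, the [-1, 1] relationship encoding, and the update rules are illustrative assumptions.

```python
"""
Minimal sketch of an OCC-style emotion state for story characters.
NOT the authors' emulator: the emotion subset, the [0, 1] intensity
scale, and the update rules are illustrative assumptions.
"""
from dataclasses import dataclass, field
from typing import Dict

# A simplified subset of OCC emotion pairs (reactions to events, to the
# actions of agents, and to objects); the full OCC taxonomy is richer.
OCC_EMOTIONS = ("joy", "distress", "hope", "fear",
                "admiration", "reproach", "love", "hate")


@dataclass
class Character:
    name: str
    # Current intensity of each emotion, kept in [0, 1] (assumed scale).
    emotions: Dict[str, float] = field(
        default_factory=lambda: {e: 0.0 for e in OCC_EMOTIONS})
    # Affective relationship toward other characters in [-1, 1] (assumed).
    relationships: Dict[str, float] = field(default_factory=dict)
    # Desirability of objects/goals for this character in [-1, 1] (assumed).
    desires: Dict[str, float] = field(default_factory=dict)

    def react_to_event(self, desirability: float, weight: float = 0.5) -> None:
        """Event-based reaction: desirable events raise joy, undesirable ones distress."""
        if desirability >= 0:
            self._bump("joy", weight * desirability)
        else:
            self._bump("distress", -weight * desirability)

    def react_to_agent(self, other: str, praiseworthiness: float,
                       weight: float = 0.5) -> None:
        """Agent-based reaction: praise raises admiration, blame raises reproach."""
        if praiseworthiness >= 0:
            self._bump("admiration", weight * praiseworthiness)
        else:
            self._bump("reproach", -weight * praiseworthiness)
        # Assumed side effect: blameworthy acts erode the affective relationship.
        current = self.relationships.get(other, 0.0)
        self.relationships[other] = max(-1.0, min(1.0, current + 0.5 * weight * praiseworthiness))

    def _bump(self, emotion: str, delta: float) -> None:
        self.emotions[emotion] = min(1.0, max(0.0, self.emotions[emotion] + delta))


if __name__ == "__main__":
    othello = Character("Othello",
                        relationships={"Iago": 0.8, "Desdemona": 1.0},
                        desires={"Desdemona's fidelity": 1.0})
    # Iago's (false) report: an undesirable event, plus an act Othello
    # now (wrongly) judges blameworthy on Desdemona's part.
    othello.react_to_event(desirability=-0.9)
    othello.react_to_agent("Desdemona", praiseworthiness=-0.8)
    print({k: round(v, 2) for k, v in othello.emotions.items() if v > 0})
    print(othello.relationships)
```

Re-evaluating every character's record after each plot event is what would let a visual front end plot emotion trajectories over the course of a story, in the spirit of the platform the abstract describes.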
Related papers
- Personality-affected Emotion Generation in Dialog Systems [67.40609683389947]
We propose a new task, Personality-affected Emotion Generation, to generate emotion based on the personality given to the dialog system.
We analyze the challenges in this task, i.e., (1) heterogeneously integrating personality and emotional factors and (2) extracting multi-granularity emotional information in the dialog context.
Results suggest that by adopting our method, emotion generation performance improves by 13% in macro-F1 and 5% in weighted-F1 over the BERT-base model.
arXiv Detail & Related papers (2024-04-03T08:48:50Z)
- Daisy-TTS: Simulating Wider Spectrum of Emotions via Prosody Embedding Decomposition [12.605375307094416]
We propose an emotional text-to-speech design to simulate a wider spectrum of emotions grounded on the structural model.
Our proposed design, Daisy-TTS, incorporates a prosody encoder to learn emotionally-separable prosody embedding as a proxy for emotion.
arXiv Detail & Related papers (2024-02-22T13:15:49Z)
- Language Models (Mostly) Do Not Consider Emotion Triggers When Predicting Emotion [87.18073195745914]
We investigate how well human-annotated emotion triggers correlate with features deemed salient in their prediction of emotions.
Using EmoTrigger, we evaluate the ability of large language models to identify emotion triggers.
Our analysis reveals that emotion triggers are largely not considered salient features by emotion prediction models; instead, there is an intricate interplay between various features and the task of emotion detection.
arXiv Detail & Related papers (2023-11-16T06:20:13Z)
- Where are We in Event-centric Emotion Analysis? Bridging Emotion Role Labeling and Appraisal-based Approaches [10.736626320566707]
The term emotion analysis in text subsumes various natural language processing tasks.
We argue that emotions and events are related in two ways.
We discuss how to incorporate psychological appraisal theories in NLP models to interpret events.
arXiv Detail & Related papers (2023-09-05T09:56:29Z)
- Speech Synthesis with Mixed Emotions [77.05097999561298]
We propose a novel formulation that measures the relative difference between the speech samples of different emotions.
We then incorporate our formulation into a sequence-to-sequence emotional text-to-speech framework.
At run-time, we control the model to produce the desired emotion mixture by manually defining an emotion attribute vector.
arXiv Detail & Related papers (2022-08-11T15:45:58Z)
- SOLVER: Scene-Object Interrelated Visual Emotion Reasoning Network [83.27291945217424]
We propose a novel Scene-Object interreLated Visual Emotion Reasoning network (SOLVER) to predict emotions from images.
To mine the emotional relationships between distinct objects, we first build up an Emotion Graph based on semantic concepts and visual features.
We also design a Scene-Object Fusion Module to integrate scenes and objects, which exploits scene features to guide the fusion process of object features with the proposed scene-based attention mechanism.
arXiv Detail & Related papers (2021-10-24T02:41:41Z)
- Emotion Recognition under Consideration of the Emotion Component Process Model [9.595357496779394]
We use the emotion component process model (CPM) by Scherer (2005) to explain emotion communication.
The CPM states that emotions are a coordinated process of various subcomponents that arise in reaction to an event, namely the subjective feeling, the cognitive appraisal, the expression, a physiological bodily reaction, and a motivational action tendency.
We find that emotions on Twitter are predominantly expressed by event descriptions or subjective reports of the feeling, while in literature, authors prefer to describe what characters do, and leave the interpretation to the reader.
arXiv Detail & Related papers (2021-07-27T15:53:25Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and shows the state-of-the-art result for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
- Controllable Multi-Character Psychology-Oriented Story Generation [28.054245616281023]
We present a novel model-based attention mechanism that we call SoCP (Storytelling of multi-Character Psychology).
We show that the proposed model can generate stories considering the changes in the psychological state of different characters.
Experiments show that with SoCP, the generated stories follow the psychological state for each character according to both automatic and human evaluations.
arXiv Detail & Related papers (2020-10-11T12:05:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.