Emotion in Cognitive Architecture: Emergent Properties from Interactions
with Human Emotion
- URL: http://arxiv.org/abs/2301.00003v1
- Date: Wed, 28 Dec 2022 23:50:27 GMT
- Title: Emotion in Cognitive Architecture: Emergent Properties from Interactions
with Human Emotion
- Authors: Junya Morita
- Abstract summary: This document presents endeavors to represent emotion in a computational cognitive architecture.
The advantage of the cognitive human-agent interaction approach is in representing human internal states and processes.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This document presents endeavors to represent emotion in a
computational cognitive architecture. The first part introduces research
organized along two axes of emotional affect: pleasantness and arousal.
Building on these basic emotional components, the document then discusses
emergent properties of emotion, presenting interaction studies with human
users. Drawing on these past studies by the author, the document concludes
that the advantage of the cognitive human-agent interaction approach lies in
representing human internal states and processes.
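The two-axis organization described above (pleasantness and arousal) resembles a circumplex-style affect plane. A minimal illustrative sketch of mapping such a pair of coordinates to a position on that plane follows; the function name and values are assumptions for illustration, not code from the paper:

```python
import math

def affect_to_angle(pleasantness: float, arousal: float) -> float:
    """Map a (pleasantness, arousal) point, each in [-1, 1], to an
    angle in degrees on a circumplex-style affect plane, where 0
    degrees is maximally pleasant and 90 degrees is maximally aroused."""
    return math.degrees(math.atan2(arousal, pleasantness)) % 360

# High pleasantness with high arousal lands in the upper-right quadrant.
print(round(affect_to_angle(0.7, 0.7)))  # 45
```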
Related papers
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582]
SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE)
In this paper, we introduce the task, dataset and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
arXiv Detail & Related papers (2024-05-19T09:59:00Z)
- Exploring Emotions in Multi-componential Space using Interactive VR Games [1.1510009152620668]
We operationalised a data-driven approach using interactive Virtual Reality (VR) games.
We used Machine Learning (ML) methods to identify the unique contributions of each component to emotion differentiation.
These findings also have implications for using VR environments in emotion research.
arXiv Detail & Related papers (2024-04-04T06:54:44Z)
- Emotion Recognition based on Psychological Components in Guided Narratives for Emotion Regulation [0.0]
This paper introduces a new French corpus of emotional narratives collected using a questionnaire for emotion regulation.
We study the interaction of components and their impact on emotion classification with machine learning methods and pre-trained language models.
arXiv Detail & Related papers (2023-05-15T12:06:31Z)
- SOLVER: Scene-Object Interrelated Visual Emotion Reasoning Network [83.27291945217424]
We propose a novel Scene-Object interreLated Visual Emotion Reasoning network (SOLVER) to predict emotions from images.
To mine the emotional relationships between distinct objects, we first build up an Emotion Graph based on semantic concepts and visual features.
We also design a Scene-Object Fusion Module to integrate scenes and objects, which exploits scene features to guide the fusion process of object features with the proposed scene-based attention mechanism.
arXiv Detail & Related papers (2021-10-24T02:41:41Z)
- Emotion Recognition under Consideration of the Emotion Component Process Model [9.595357496779394]
We use the emotion component process model (CPM) by Scherer (2005) to explain emotion communication.
CPM states that an emotion is a coordinated process of several subcomponents reacting to an event: the subjective feeling, the cognitive appraisal, the expression, a physiological bodily reaction, and a motivational action tendency.
We find that emotions on Twitter are predominantly expressed by event descriptions or subjective reports of the feeling, while in literature, authors prefer to describe what characters do, and leave the interpretation to the reader.
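As a rough illustration, the five CPM subcomponents named above could be modeled as fields of one record per emotion episode; the field names paraphrase the text and the example values are invented, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class CPMEmotionEvent:
    """The five coordinated subcomponents of an emotion episode,
    following the component process model as summarized above."""
    subjective_feeling: str
    cognitive_appraisal: str
    expression: str
    bodily_reaction: str
    action_tendency: str

episode = CPMEmotionEvent("joy", "goal-conducive", "smile",
                          "increased heart rate", "approach")
print(episode.expression)  # smile
```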
arXiv Detail & Related papers (2021-07-27T15:53:25Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
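A three-attribute emotion vector of the kind described above could be sketched as follows; the attribute names (polarity, direction, intensity) are assumptions about what the three attributes might be, not definitions from the paper:

```python
from dataclasses import dataclass

@dataclass
class EmotionVector:
    """Illustrative sketch of an emotion vector on an Emotion Circle,
    defined by three attributes (names are hypothetical)."""
    polarity: int      # e.g. +1 for positive affect, -1 for negative
    angle_deg: float   # direction on the circle, i.e. emotion category
    intensity: float   # vector length in [0, 1]

v = EmotionVector(polarity=1, angle_deg=30.0, intensity=0.8)
print(v.intensity)  # 0.8
```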
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
- Emotion-aware Chat Machine: Automatic Emotional Response Generation for Human-like Emotional Interaction [55.47134146639492]
This article proposes a unified end-to-end neural architecture, which is capable of simultaneously encoding the semantics and the emotions in a post.
Experiments on real-world data demonstrate that the proposed method outperforms the state-of-the-art methods in terms of both content coherence and emotion appropriateness.
arXiv Detail & Related papers (2021-06-06T06:26:15Z)
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and shows the state-of-the-art result for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscles movements.
We determine if there are time-related differences on expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- A Multi-Componential Approach to Emotion Recognition and the Effect of Personality [0.0]
This paper applies a componential framework with a data-driven approach to characterize emotional experiences evoked during movie watching.
The results suggest that differences between various emotions can be captured by a few (at least 6) latent dimensions.
Results show that a componential model with a limited number of descriptors is still able to predict the level of experienced discrete emotion.
arXiv Detail & Related papers (2020-10-22T01:27:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.