CogIntAc: Modeling the Relationships between Intention, Emotion and
Action in Interactive Process from Cognitive Perspective
- URL: http://arxiv.org/abs/2205.03540v1
- Date: Sat, 7 May 2022 03:54:51 GMT
- Title: CogIntAc: Modeling the Relationships between Intention, Emotion and
Action in Interactive Process from Cognitive Perspective
- Authors: Wei Peng, Yue Hu, Yuqiang Xie, Luxi Xing, Yajing Sun
- Abstract summary: We propose a novel cognitive framework of individual interaction.
The core of the framework is that individuals achieve interaction through external action driven by their inner intention.
- Score: 15.797390372732973
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Intention, emotion and action are key psychological factors in human
activities, and they play an important role in the interactions between
individuals. Modeling the interaction process between individuals by analyzing
the relationships among their intentions, emotions, and actions at the
cognitive level is challenging. In this paper, we propose a novel cognitive
framework of individual interaction. The core of the framework is that
individuals achieve interaction through external action driven by their inner
intention. Based on this idea, the interactions between individuals can be
constructed by establishing relationships between the intention, emotion and
action. Furthermore, we analyze the interactions between individuals and
provide a reasonable explanation for the prediction results. To verify the
effectiveness of the framework, we reconstruct a dataset and propose three
tasks as well as the corresponding baseline models, including action abduction,
emotion prediction and action generation. The novel framework shows an
interesting perspective on mimicking the mental state of human beings in
cognitive science.
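The three proposed tasks can be sketched as a minimal interface around the framework's core idea (intention drives action, action evokes emotion). This is an illustrative sketch only: the class, the function names, and the placeholder stubs are hypothetical and do not reflect the authors' baseline models.

```python
from dataclasses import dataclass

@dataclass
class Turn:
    """One interaction turn: an inner intention drives an external action,
    which in turn evokes an emotion in the other individual."""
    intention: str
    action: str
    emotion: str

def action_abduction(intention: str, emotion: str) -> str:
    """Task 1 (action abduction): infer the unobserved action that links a
    known intention to an observed emotion. Placeholder stub."""
    return f"action driven by '{intention}' that evoked '{emotion}'"

def emotion_prediction(intention: str, action: str) -> str:
    """Task 2 (emotion prediction): predict the emotion an action evokes,
    given the actor's intention. Stub returns a fixed placeholder label."""
    return "neutral"

def action_generation(intention: str, target_emotion: str) -> str:
    """Task 3 (action generation): produce an action that realizes an
    intention and aims at a target emotion. Placeholder stub."""
    return f"action expressing '{intention}' toward '{target_emotion}'"

# A hypothetical turn assembled from the three task interfaces.
turn = Turn(
    intention="offer comfort",
    action=action_generation("offer comfort", "relief"),
    emotion=emotion_prediction("offer comfort", "a reassuring reply"),
)
print(turn)
```

The point of the sketch is the input/output signatures: each task holds two of the three factors fixed and recovers the third, which is how the framework turns the intention-emotion-action relationships into supervised problems.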
Related papers
- Visual-Geometric Collaborative Guidance for Affordance Learning [63.038406948791454]
We propose a visual-geometric collaborative guided affordance learning network that incorporates visual and geometric cues.
Our method outperforms representative models on both objective metrics and visual quality.
arXiv Detail & Related papers (2024-10-15T07:35:51Z)
- AntEval: Evaluation of Social Interaction Competencies in LLM-Driven Agents [65.16893197330589]
Large Language Models (LLMs) have demonstrated their ability to replicate human behaviors across a wide range of scenarios.
However, their capability in handling complex, multi-character social interactions has yet to be fully explored.
We introduce the Multi-Agent Interaction Evaluation Framework (AntEval), encompassing a novel interaction framework and evaluation methods.
arXiv Detail & Related papers (2024-01-12T11:18:00Z)
- LEMON: Learning 3D Human-Object Interaction Relation from 2D Images [56.6123961391372]
Learning 3D human-object interaction relation is pivotal to embodied AI and interaction modeling.
Most existing methods approach the goal by learning to predict isolated interaction elements.
We present LEMON, a unified model that mines interaction intentions of the counterparts and employs curvatures to guide the extraction of geometric correlations.
arXiv Detail & Related papers (2023-12-14T14:10:57Z)
- Interactive Natural Language Processing [67.87925315773924]
Interactive Natural Language Processing (iNLP) has emerged as a novel paradigm within the field of NLP.
This paper offers a comprehensive survey of iNLP, starting by proposing a unified definition and framework of the concept.
arXiv Detail & Related papers (2023-05-22T17:18:29Z)
- Automatic Context-Driven Inference of Engagement in HMI: A Survey [6.479224589451863]
This paper presents a survey on engagement inference for human-machine interaction.
It entails interdisciplinary definition, engagement components and factors, publicly available datasets, ground truth assessment, and most commonly used features and methods.
It serves as a guide for the development of future human-machine interaction interfaces with reliable context-aware engagement inference capability.
arXiv Detail & Related papers (2022-09-30T10:46:13Z)
- COMMA: Modeling Relationship among Motivations, Emotions and Actions in Language-based Human Activities [12.206523349060179]
Motivations, emotions, and actions are inter-related essential factors in human activities.
We present the first study that investigates the viability of modeling motivations, emotions, and actions in language-based human activities.
arXiv Detail & Related papers (2022-09-14T07:54:20Z)
- Co-Located Human-Human Interaction Analysis using Nonverbal Cues: A Survey [71.43956423427397]
We aim to identify the nonverbal cues and computational methodologies resulting in effective performance.
This survey differs from its counterparts by involving the widest spectrum of social phenomena and interaction settings.
Some major observations are: the most often used nonverbal cue, computational method, and interaction environment with its sensing approach are speaking activity, support vector machines, and meetings composed of 3-4 persons equipped with microphones and cameras, respectively.
arXiv Detail & Related papers (2022-07-20T13:37:57Z)
- Modeling Intention, Emotion and External World in Dialogue Systems [14.724751780218297]
We propose a RelAtion Interaction Network (RAIN) to jointly model mutual relationships and explicitly integrate historical intention information.
The experiments on the dataset show that our model can take full advantage of the intention, emotion and action between individuals.
arXiv Detail & Related papers (2022-02-14T04:10:34Z)
- Learning Graph Representation of Person-specific Cognitive Processes from Audio-visual Behaviours for Automatic Personality Recognition [17.428626029689653]
We propose to represent the target subject's person-specific cognition in the form of a person-specific CNN architecture.
Each person-specific CNN is explored by the Neural Architecture Search (NAS) and a novel adaptive loss function.
Experimental results show that the produced graph representations are well associated with target subjects' personality traits.
arXiv Detail & Related papers (2021-10-26T11:04:23Z)
- Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue [67.89143112645556]
We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
arXiv Detail & Related papers (2021-03-05T20:55:48Z)
- Investigating Human Response, Behaviour, and Preference in Joint-Task Interaction [3.774610219328564]
We designed an experiment to examine human behaviour and response as participants interact with Explainable Planning (XAIP) agents.
We also present results from an empirical analysis examining the behaviour of the two agents with simulated users.
arXiv Detail & Related papers (2020-11-27T22:16:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.