Building a Dialogue Corpus Annotated with Expressed and Experienced
Emotions
- URL: http://arxiv.org/abs/2205.11867v1
- Date: Tue, 24 May 2022 07:40:11 GMT
- Title: Building a Dialogue Corpus Annotated with Expressed and Experienced
Emotions
- Authors: Tatsuya Ide and Daisuke Kawahara
- Abstract summary: In communication, a human would recognize the emotion of an interlocutor and respond with an appropriate emotion, such as empathy and comfort.
We propose a method to build a dialogue corpus annotated with two kinds of emotions.
We collect dialogues from Twitter and annotate each utterance with the emotion that a speaker put into the utterance (expressed emotion) and the emotion that a listener felt after listening to the utterance (experienced emotion).
- Score: 12.29324895944564
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In communication, a human would recognize the emotion of an interlocutor and
respond with an appropriate emotion, such as empathy and comfort. Toward
developing a dialogue system with such a human-like ability, we propose a
method to build a dialogue corpus annotated with two kinds of emotions. We
collect dialogues from Twitter and annotate each utterance with the emotion
that a speaker put into the utterance (expressed emotion) and the emotion that
a listener felt after listening to the utterance (experienced emotion). We
built a dialogue corpus in Japanese using this method, and its statistical
analysis revealed the differences between expressed and experienced emotions.
We conducted experiments on recognition of the two kinds of emotions. The
experimental results indicated the difficulty in recognizing experienced
emotions and the effectiveness of multi-task learning of the two kinds of
emotions. We hope that the constructed corpus will facilitate the study of
emotion recognition in dialogue and emotion-aware dialogue response
generation.
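The multi-task experiment described above can be pictured as a shared text encoder with two classification heads, one predicting the expressed emotion and one predicting the experienced emotion of each utterance. The sketch below is only an illustration of that setup, not the authors' implementation: the encoder name, the eight-class label set, and the equal loss weighting are all assumptions.

```python
# A minimal multi-task sketch (an assumption, not the paper's exact architecture):
# a shared encoder with one head for the expressed emotion and one for the
# experienced emotion of an utterance.
import torch.nn as nn
from transformers import AutoModel

NUM_EMOTIONS = 8  # placeholder label-set size; the corpus's inventory may differ

class TwoEmotionClassifier(nn.Module):
    def __init__(self, encoder_name="cl-tohoku/bert-base-japanese"):  # placeholder Japanese encoder
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.expressed_head = nn.Linear(hidden, NUM_EMOTIONS)
        self.experienced_head = nn.Linear(hidden, NUM_EMOTIONS)

    def forward(self, input_ids, attention_mask):
        # Use the [CLS] token representation as the utterance embedding.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]
        return self.expressed_head(cls), self.experienced_head(cls)

def multitask_loss(expressed_logits, experienced_logits,
                   expressed_labels, experienced_labels, alpha=0.5):
    # Equal task weighting (alpha=0.5) is an assumption, not the paper's setting.
    ce = nn.CrossEntropyLoss()
    return alpha * ce(expressed_logits, expressed_labels) + \
           (1 - alpha) * ce(experienced_logits, experienced_labels)
```

Each utterance in such a corpus carries two gold labels (its expressed and its experienced emotion), so the two heads can be trained jointly against a single pass of the shared encoder; dropping one head recovers a single-task setup.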
Related papers
- Think out Loud: Emotion Deducing Explanation in Dialogues [57.90554323226896]
We propose a new task, "Emotion Deducing Explanation in Dialogues" (EDEN).
EDEN recognizes emotions and their causes by reasoning explicitly.
It can help Large Language Models (LLMs) achieve better recognition of emotions and causes.
arXiv Detail & Related papers (2024-06-07T08:58:29Z) - Think Twice: A Human-like Two-stage Conversational Agent for Emotional Response Generation [16.659457455269127]
We propose a two-stage conversational agent for the generation of emotional dialogue.
First, a dialogue model trained without an emotion-annotated dialogue corpus generates a prototype response that matches the contextual semantics.
Second, the first-stage prototype is modified by a controllable emotion refiner guided by the empathy hypothesis.
arXiv Detail & Related papers (2023-01-12T10:03:56Z) - Empathetic Dialogue Generation via Sensitive Emotion Recognition and
Sensible Knowledge Selection [47.60224978460442]
We propose a Serial and Emotion-Knowledge interaction (SEEK) method for empathetic dialogue generation.
We use a fine-grained encoding strategy that is more sensitive to the emotion dynamics (emotion flow) in the conversation to predict the emotion-intent characteristics of the response.
In addition, we design a novel framework to model the interaction between knowledge and emotion to generate more sensible responses.
arXiv Detail & Related papers (2022-10-21T03:51:18Z) - Speech Synthesis with Mixed Emotions [77.05097999561298]
We propose a novel formulation that measures the relative difference between the speech samples of different emotions.
We then incorporate our formulation into a sequence-to-sequence emotional text-to-speech framework.
At run time, we control the model to produce the desired emotion mixture by manually defining an emotion attribute vector (a minimal mixing sketch appears after this list).
arXiv Detail & Related papers (2022-08-11T15:45:58Z) - Emotion Intensity and its Control for Emotional Voice Conversion [77.05097999561298]
Emotional voice conversion (EVC) seeks to convert the emotional state of an utterance while preserving the linguistic content and speaker identity.
In this paper, we aim to explicitly characterize and control the intensity of emotion.
We propose to disentangle the speaker style from the linguistic content and encode it into a style embedding in a continuous space, which forms the prototype of the emotion embedding.
arXiv Detail & Related papers (2022-01-10T02:11:25Z) - Perspective-taking and Pragmatics for Generating Empathetic Responses
Focused on Emotion Causes [50.569762345799354]
We argue that two issues must be tackled at the same time: (i) identifying which word in the other's utterance is the cause of his or her emotion, and (ii) reflecting those specific words in the response generation.
Taking inspiration from social cognition, we leverage a generative estimator to infer emotion cause words from utterances with no word-level label.
arXiv Detail & Related papers (2021-09-18T04:22:49Z) - Automatically Select Emotion for Response via Personality-affected
Emotion Transition [0.0]
Dialogue systems should be able to automatically select appropriate emotions for responses, as humans do.
Most existing works focus on rendering specified emotions in responses or responding empathetically to the user's emotion, yet individual differences in emotion expression are overlooked.
We equip the dialogue system with a personality and enable it to automatically select emotions in responses by simulating the emotion transition of humans in conversation.
arXiv Detail & Related papers (2021-06-30T07:00:42Z) - Knowledge Bridging for Empathetic Dialogue Generation [52.39868458154947]
A lack of external knowledge makes it difficult for empathetic dialogue systems to perceive implicit emotions and learn emotional interactions from limited dialogue history.
We propose to leverage external knowledge, including commonsense knowledge and emotional lexical knowledge, to explicitly understand and express emotions in empathetic dialogue generation.
arXiv Detail & Related papers (2020-09-21T09:21:52Z)
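For the Speech Synthesis with Mixed Emotions entry above, the run-time emotion attribute vector can be thought of as a set of weights that blends per-emotion embeddings into one conditioning vector for the synthesizer. The snippet below is a hypothetical illustration of that idea only; the emotion set, embedding size, and the way the mixture conditions a sequence-to-sequence TTS model are all assumed rather than taken from the paper.

```python
import torch

# Hypothetical per-emotion embeddings (e.g., learned style vectors); shapes and values assumed.
EMOTIONS = ["neutral", "happy", "sad", "angry"]
emotion_embeddings = {e: torch.randn(128) for e in EMOTIONS}

def mix_emotions(attribute_vector):
    """Blend per-emotion embeddings according to manually chosen attribute weights."""
    weights = torch.tensor([float(attribute_vector.get(e, 0.0)) for e in EMOTIONS])
    weights = weights / weights.sum()  # normalize so the mixture weights sum to 1
    stacked = torch.stack([emotion_embeddings[e] for e in EMOTIONS])  # shape (4, 128)
    return weights @ stacked  # weighted sum -> a single (128,) conditioning vector

# Example: a mostly happy utterance with a touch of sadness.
mixed = mix_emotions({"happy": 0.7, "sad": 0.3})
# `mixed` would then condition the acoustic decoder of a seq2seq TTS model (not shown).
```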