How emoji and word embedding helps to unveil emotional transitions
during online messaging
- URL: http://arxiv.org/abs/2104.11032v1
- Date: Tue, 23 Mar 2021 12:45:17 GMT
- Title: How emoji and word embedding helps to unveil emotional transitions
during online messaging
- Authors: Moeen Mostafavi and Michael D. Porter
- Abstract summary: We use Affect Control Theory (ACT) to predict emotional change during the interaction.
We extend the affective dictionaries used by ACT to let the customer use emojis.
Our framework can detect emotional change during messaging and show how a customer's reaction changes accordingly.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: During online chats, body language and vocal characteristics are not
part of the communication mechanism, making it challenging to facilitate an
accurate interpretation of feelings, emotions, and attitudes. The use of emojis
to express emotional feelings is an alternative approach in these types of
communication. In this project, we focus on modeling a customer's emotion in an
online messaging session with a chatbot. We use Affect Control Theory (ACT) to
predict emotional change during the interaction. To let the customer use
emojis, we also extend the affective dictionaries used by ACT. For this
purpose, we mapped the Emoji2vec embedding to the affective space. Our
framework can detect emotional change during messaging and show how a
customer's reaction changes accordingly.
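The abstract's mapping step — projecting Emoji2vec vectors into the affective (EPA: Evaluation, Potency, Activity) space used by ACT — can be sketched as a supervised regression. The dimensions, the ridge penalty, and the synthetic training pairs below are illustrative assumptions, not the authors' actual setup; in practice the training pairs would be emojis whose EPA ratings already exist in an affective dictionary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: Emoji2vec embeddings are 300-dimensional; EPA
# targets are 3-dimensional (Evaluation, Potency, Activity) ratings.
n_train, emb_dim = 50, 300
X = rng.normal(size=(n_train, emb_dim))                 # emoji embeddings
W_true = rng.normal(size=(emb_dim, 3))                  # hidden linear relation
Y = X @ W_true + 0.01 * rng.normal(size=(n_train, 3))   # EPA targets

# Ridge regression in closed form: W = (X^T X + lam I)^{-1} X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(emb_dim), X.T @ Y)

def emoji_to_epa(embedding: np.ndarray) -> np.ndarray:
    """Project one emoji embedding into the 3-d EPA affective space."""
    return embedding @ W

epa = emoji_to_epa(X[0])
print(epa.shape)  # (3,)
```

Once such a mapping is learned, any emoji with an Emoji2vec embedding — including those absent from the affective dictionary — can be assigned an EPA profile and fed into ACT's emotional-transition equations.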
Related papers
- Tracking Emotional Dynamics in Chat Conversations: A Hybrid Approach using DistilBERT and Emoji Sentiment Analysis [0.0]
This paper explores a hybrid approach to tracking emotional dynamics in chat conversations by combining text emotion detection and emoji sentiment analysis.
A Twitter dataset was analyzed using various machine learning algorithms, including SVM, Random Forest, and AdaBoost.
Our findings show that integrating text and emoji analysis is an effective way of tracking chat emotion, with possible applications in customer service, work chats, and social media interactions.
arXiv Detail & Related papers (2024-08-03T18:28:31Z) - Personality-affected Emotion Generation in Dialog Systems [67.40609683389947]
We propose a new task, Personality-affected Emotion Generation, to generate emotion based on the personality given to the dialog system.
We analyze the challenges in this task, i.e., (1) heterogeneously integrating personality and emotional factors and (2) extracting multi-granularity emotional information in the dialog context.
Results suggest that by adopting our method, emotion generation performance improves by 13% in macro-F1 and 5% in weighted-F1 over the BERT-base model.
arXiv Detail & Related papers (2024-04-03T08:48:50Z) - Attention-based Interactive Disentangling Network for Instance-level
Emotional Voice Conversion [81.1492897350032]
Emotional Voice Conversion aims to manipulate speech according to a given emotion while preserving non-emotion components.
We propose an Attention-based Interactive diseNtangling Network (AINN) that leverages instance-wise emotional knowledge for voice conversion.
arXiv Detail & Related papers (2023-12-29T08:06:45Z) - Empathetic Dialogue Generation via Sensitive Emotion Recognition and
Sensible Knowledge Selection [47.60224978460442]
We propose a Serial and Emotion-Knowledge interaction (SEEK) method for empathetic dialogue generation.
We use a fine-grained encoding strategy that is more sensitive to the emotion dynamics (emotion flow) in the conversations to predict the emotion-intent characteristic of the response. Besides, we design a novel framework to model the interaction between knowledge and emotion to generate more sensible responses.
arXiv Detail & Related papers (2022-10-21T03:51:18Z) - Emoji-based Co-attention Network for Microblog Sentiment Analysis [10.135289472491655]
We propose an emoji-based co-attention network that learns the mutual emotional semantics between text and emojis on microblogs.
Our model adopts the co-attention mechanism based on bidirectional long short-term memory incorporating the text and emojis, and integrates a squeeze-and-excitation block in a convolutional neural network to increase its sensitivity to emotional semantic features.
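The squeeze-and-excitation block mentioned above is a standard channel-attention unit; a minimal numpy sketch follows. The channel count, reduction ratio, and random weights are placeholders standing in for the paper's learned parameters.

```python
import numpy as np

def squeeze_and_excitation(feature_maps: np.ndarray, reduction: int = 4,
                           rng=np.random.default_rng(0)) -> np.ndarray:
    """Minimal squeeze-and-excitation over CNN feature maps shaped (C, H, W).

    Squeeze: global average pooling yields one descriptor per channel.
    Excitation: two small dense layers (ReLU then sigmoid) produce
    per-channel gates in (0, 1). Scale: each map is reweighted by its gate,
    letting the network emphasize emotionally salient feature channels.
    Weights here are random placeholders for learned parameters.
    """
    c, h, w = feature_maps.shape
    squeezed = feature_maps.mean(axis=(1, 2))        # (C,) channel descriptors
    w1 = rng.normal(size=(c, c // reduction))
    w2 = rng.normal(size=(c // reduction, c))
    hidden = np.maximum(squeezed @ w1, 0)            # ReLU
    gates = 1.0 / (1.0 + np.exp(-(hidden @ w2)))     # sigmoid, (C,)
    return feature_maps * gates[:, None, None]

x = np.random.default_rng(1).normal(size=(8, 5, 5))
out = squeeze_and_excitation(x)
print(out.shape)  # (8, 5, 5)
```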
arXiv Detail & Related papers (2021-10-27T07:23:18Z) - Automatically Select Emotion for Response via Personality-affected
Emotion Transition [0.0]
Dialog systems should be capable of automatically selecting appropriate emotions for responses, as humans do.
Most existing works focus on rendering specified emotions in responses or empathetically respond to the emotion of users, yet the individual difference in emotion expression is overlooked.
We equip the dialog system with personality and enable it to automatically select emotions in responses by simulating the emotion transition of humans in conversation.
arXiv Detail & Related papers (2021-06-30T07:00:42Z) - SentEmojiBot: Empathising Conversations Generation with Emojis [2.2623071655418734]
We propose SentEmojiBot to generate empathetic conversations with a combination of emojis and text.
A user study indicates that the dialogues generated by our model were understandable and adding emojis improved empathetic traits in conversations by 9.8%.
arXiv Detail & Related papers (2021-05-26T08:51:44Z) - Seen and Unseen emotional style transfer for voice conversion with a new
emotional speech dataset [84.53659233967225]
Emotional voice conversion aims to transform emotional prosody in speech while preserving the linguistic content and speaker identity.
We propose a novel framework based on a variational auto-encoding Wasserstein generative adversarial network (VAW-GAN).
We show that the proposed framework achieves remarkable performance by consistently outperforming the baseline framework.
arXiv Detail & Related papers (2020-10-28T07:16:18Z) - MIME: MIMicking Emotions for Empathetic Response Generation [82.57304533143756]
Current approaches to empathetic response generation view the set of emotions expressed in the input text as a flat structure.
We argue that empathetic responses often mimic the emotion of the user to a varying degree, depending on its positivity or negativity and content.
arXiv Detail & Related papers (2020-10-04T00:35:47Z) - Converting Anyone's Emotion: Towards Speaker-Independent Emotional Voice
Conversion [83.14445041096523]
Emotional voice conversion aims to convert the emotion of speech from one state to another while preserving the linguistic content and speaker identity.
We propose a speaker-independent emotional voice conversion framework, that can convert anyone's emotion without the need for parallel data.
Experiments show that the proposed speaker-independent framework achieves competitive results for both seen and unseen speakers.
arXiv Detail & Related papers (2020-05-13T13:36:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.