EmTract: Extracting Emotions from Social Media
- URL: http://arxiv.org/abs/2112.03868v3
- Date: Wed, 21 Jun 2023 22:48:39 GMT
- Title: EmTract: Extracting Emotions from Social Media
- Authors: Domonkos F. Vamossy and Rolf Skog
- Abstract summary: We develop an open-source tool (EmTract) that extracts emotions from social media text tailored for financial contexts.
We show that emotions and market dynamics are closely related, and we provide a tool to help study the role emotions play in financial markets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop an open-source tool (EmTract) that extracts emotions from social
media text tailored for financial contexts. To do so, we annotate ten thousand
short messages from a financial social media platform (StockTwits) and combine
them with open-source emotion data. We then take a pre-trained NLP model,
DistilBERT, augment its embedding space with 4,861 tokens (emojis and
emoticons), fit it first on the open-source emotion data, and then transfer it
to our annotated financial social media data. Our model outperforms competing
open-source state-of-the-art emotion classifiers, such as Emotion English
DistilRoBERTa-base, on both human- and ChatGPT-annotated data. Compared
to dictionary-based methods, our methodology has three main advantages for
research in finance. First, our model is tailored to financial social media
text; second, it incorporates key aspects of social media data, such as
non-standard phrases, emojis, and emoticons; and third, it operates by
sequentially learning a latent representation that includes features such as
word order, word usage, and local context. Using EmTract, we explore the
relationship between investor emotions expressed on social media and asset
prices. We show that firm-specific investor emotions are predictive of daily
price movements. Our findings show that emotions and market dynamics are
closely related, and we provide a tool to help study the role emotions play in
financial markets.
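The modeling recipe in the abstract (add emoji and emoticon tokens to DistilBERT's vocabulary, then fine-tune in two stages: generic emotion data first, annotated StockTwits messages second) can be sketched with the Hugging Face transformers API. This is a minimal sketch under stated assumptions, not the authors' implementation: the label set, the emoji token list, the toy datasets, and the hyperparameters below are all illustrative placeholders.

```python
# Minimal sketch of the two-stage transfer setup described above.
# Labels, tokens, datasets, and hyperparameters are assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed seven-class emotion taxonomy; the paper's exact labels may differ.
LABELS = ["neutral", "happy", "sad", "anger", "disgust", "surprise", "fear"]

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(LABELS)
)

# Augment the embedding space so emojis/emoticons become first-class tokens
# instead of unknown subwords; the paper adds 4,861 such tokens.
emoji_tokens = ["🚀", "📈", "📉", ":-)", ":-("]  # tiny illustrative subset
tokenizer.add_tokens(emoji_tokens)
model.resize_token_embeddings(len(tokenizer))  # new rows are randomly initialized

def finetune(model, dataset, epochs=3, lr=2e-5):
    """Run one fine-tuning stage; `dataset` yields (text, label_id) pairs."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for text, label in dataset:
            batch = tokenizer(text, truncation=True, return_tensors="pt")
            loss = model(**batch, labels=torch.tensor([label])).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

# Toy stand-ins for the open-source emotion corpus and the ten thousand
# annotated StockTwits messages.
open_source_emotion_data = [("i am so happy today", 1), ("this is awful", 2)]
stocktwits_data = [("$TSLA to the moon 🚀🚀", 1), ("bagholding again 📉", 2)]

# Stage 1: fit on generic emotion data; Stage 2: transfer to financial text.
finetune(model, open_source_emotion_data)
finetune(model, stocktwits_data)
```

Fitting on generic emotion data first lets the new token embeddings and the classification head learn a general notion of emotion before the smaller financial corpus specializes them, which is the usual rationale for this kind of staged transfer.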
Related papers
- Tracking Emotional Dynamics in Chat Conversations: A Hybrid Approach using DistilBERT and Emoji Sentiment Analysis
This paper explores a hybrid approach to tracking emotional dynamics in chat conversations by combining text emotion detection and emoji sentiment analysis.
A Twitter dataset was analyzed using various machine learning algorithms, including SVM, Random Forest, and AdaBoost.
Our findings show that integrating text and emoji analysis is an effective way of tracking chat emotion, with possible applications in customer service, work chats, and social media interactions.
arXiv Detail & Related papers (2024-08-03T18:28:31Z)
- ConText at WASSA 2024 Empathy and Personality Shared Task: History-Dependent Embedding Utterance Representations for Empathy and Emotion Prediction in Conversations
The WASSA shared task on empathy and emotion prediction in interactions presents an opportunity to benchmark approaches to these tasks.
We model empathy, emotion polarity and emotion intensity of each utterance in a conversation by feeding the utterance to be classified together with its conversational context.
We also model perceived counterparty empathy of each interlocutor by feeding all utterances from the conversation and a token identifying the interlocutor for which we are predicting the empathy.
arXiv Detail & Related papers (2024-07-04T10:44:59Z)
- Emotion and Intent Joint Understanding in Multimodal Conversation: A Benchmarking Dataset
Emotion and Intent Joint Understanding in Multimodal Conversation (MC-EIU) aims to decode the semantic information manifested in a multimodal conversational history.
MC-EIU is an enabling technology for many human-computer interfaces.
We propose an MC-EIU dataset, which features 7 emotion categories, 9 intent categories, 3 modalities (textual, acoustic, and visual content), and two languages (English and Mandarin).
arXiv Detail & Related papers (2024-07-03T01:56:00Z)
- Targeted aspect-based emotion analysis to detect opportunities and precaution in financial Twitter messages
We propose a novel Targeted Aspect-Based Emotion Analysis (TABEA) system that can individually discern the financial emotions (positive and negative forecasts) on the different stock market assets in the same tweet.
It is based on Natural Language Processing (NLP) techniques and Machine Learning streaming algorithms.
It achieves over 90% precision for the target emotions (financial opportunity and precaution) on Twitter.
arXiv Detail & Related papers (2024-03-30T16:46:25Z)
- Emotion Rendering for Conversational Speech Synthesis with Heterogeneous Graph-Based Context Modeling
Conversational Speech Synthesis (CSS) aims to accurately express an utterance with the appropriate prosody and emotional inflection within a conversational setting.
To address the issue of data scarcity, we meticulously create emotional labels in terms of category and intensity.
Our model outperforms the baseline models in understanding and rendering emotions.
arXiv Detail & Related papers (2023-12-19T08:47:50Z)
- Language Models (Mostly) Do Not Consider Emotion Triggers When Predicting Emotion
We investigate how well human-annotated emotion triggers correlate with features deemed salient in their prediction of emotions.
Using EmoTrigger, we evaluate the ability of large language models to identify emotion triggers.
Our analysis reveals that emotion triggers are largely not treated as salient features by emotion prediction models; instead, there is an intricate interplay between various features and the task of emotion detection.
arXiv Detail & Related papers (2023-11-16T06:20:13Z)
- StockEmotions: Discover Investor Emotions for Financial Sentiment Analysis and Multivariate Time Series
This paper introduces StockEmotions, a new dataset for detecting emotions in the stock market.
It consists of 10,000 English comments collected from StockTwits, a financial social media platform.
Unlike existing financial sentiment datasets, StockEmotions presents granular features such as investor sentiment classes, fine-grained emotions, emojis, and time series data.
arXiv Detail & Related papers (2023-01-23T05:32:42Z)
- MAFW: A Large-scale, Multi-modal, Compound Affective Database for Dynamic Facial Expression Recognition in the Wild
We propose MAFW, a large-scale compound affective database with 10,045 video-audio clips in the wild.
Each clip is annotated with a compound emotional category and a couple of sentences that describe the subjects' affective behaviors in the clip.
For the compound emotion annotation, each clip is categorized into one or more of the 11 widely-used emotions, i.e., anger, disgust, fear, happiness, neutral, sadness, surprise, contempt, anxiety, helplessness, and disappointment.
arXiv Detail & Related papers (2022-08-01T13:34:33Z)
- Emotion Recognition from Multiple Modalities: Fundamentals and Methodologies
We discuss several key aspects of multi-modal emotion recognition (MER).
We begin with a brief introduction on widely used emotion representation models and affective modalities.
We then summarize existing emotion annotation strategies and corresponding computational tasks.
Finally, we outline several real-world applications and discuss some future directions.
arXiv Detail & Related papers (2021-08-18T21:55:20Z)
- Infusing Multi-Source Knowledge with Heterogeneous Graph Neural Network for Emotional Conversation Generation
In a real-world conversation, we instinctively perceive emotions from multi-source information.
We propose a heterogeneous graph-based model for emotional conversation generation.
Experimental results show that our model can effectively perceive emotions from multi-source knowledge.
arXiv Detail & Related papers (2020-12-09T06:09:31Z)
- Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition
We propose a modality-transferable model with emotion embeddings to tackle the aforementioned issues.
Our model achieves state-of-the-art performance on most of the emotion categories.
Our model also outperforms existing baselines in the zero-shot and few-shot scenarios for unseen emotions.
arXiv Detail & Related papers (2020-09-21T06:10:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.