Exploring the correlation between the type of music and the emotions evoked: A study using subjective questionnaires and EEG
- URL: http://arxiv.org/abs/2510.26304v1
- Date: Thu, 30 Oct 2025 09:43:56 GMT
- Title: Exploring the correlation between the type of music and the emotions evoked: A study using subjective questionnaires and EEG
- Authors: Jelizaveta Jankowska, Bożena Kostek, Fernando Alonso-Fernandez, Prayag Tiwari
- Abstract summary: The aim is to demonstrate the impact of different music genres on emotions. This had the effect of capturing a wide range of emotional responses to music. The analysis revealed connections between emotions and observed brain activity.
- Score: 51.04304650881465
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The subject of this work is to examine how different types of music affect human emotions. While participants listened to music, a subjective survey was administered and brain activity was measured using an EEG helmet. The aim is to demonstrate the impact of different music genres on emotions. The research involved a diverse group of participants of different genders and musical preferences, which made it possible to capture a wide range of emotional responses to music. After the experiment, the respondents' questionnaires were analyzed for relationships with the EEG signals. The analysis revealed connections between emotions and observed brain activity.
Related papers
- Music Interpretation and Emotion Perception: A Computational and Neurophysiological Investigation [1.2000613456354128]
This study investigates emotional expression and perception in music performance using computational and neurophysiological methods. The influence of different performance settings, such as repertoire, diatonic modal etudes, and improvisation, as well as levels of expressiveness, on performers' emotional communication and listeners' reactions is explored.
arXiv Detail & Related papers (2025-05-16T15:30:38Z)
- Think out Loud: Emotion Deducing Explanation in Dialogues [57.90554323226896]
We propose a new task, "Emotion Deducing Explanation in Dialogues" (EDEN).
EDEN recognizes emotions and their causes through explicit reasoning.
It can help Large Language Models (LLMs) achieve better recognition of emotions and causes.
arXiv Detail & Related papers (2024-06-07T08:58:29Z)
- SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations [53.60993109543582]
SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, aims at extracting all pairs of emotions and their corresponding causes from conversations.
Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE).
In this paper, we introduce the task, dataset and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
arXiv Detail & Related papers (2024-05-19T09:59:00Z)
- Exploring and Applying Audio-Based Sentiment Analysis in Music [0.0]
The ability of a computational model to interpret musical emotions is largely unexplored.
This study seeks to (1) predict the emotion of a musical clip over time and (2) determine the next emotion value after the music in a time series to ensure seamless transitions.
arXiv Detail & Related papers (2024-02-22T22:34:06Z)
- Exploring the Emotional Landscape of Music: An Analysis of Valence Trends and Genre Variations in Spotify Music Data [0.0]
This paper conducts an intricate analysis of musical emotions and trends using Spotify music data.
Employing regression modeling, temporal analysis, mood transitions, and genre investigation, the study uncovers patterns within music-emotion relationships.
arXiv Detail & Related papers (2023-10-29T15:57:31Z)
- Affective Idiosyncratic Responses to Music [63.969810774018775]
We develop methods to measure affective responses to music from over 403M listener comments on a Chinese social music platform.
We test for musical, lyrical, contextual, demographic, and mental health effects that drive listener affective responses.
arXiv Detail & Related papers (2022-10-17T19:57:46Z)
- An EEG-Based Multi-Modal Emotion Database with Both Posed and Authentic Facial Actions for Emotion Analysis [13.448832812713501]
We propose to develop a new database by collecting facial expressions, action units, and EEGs simultaneously.
The database will be released to the research community to advance the state of the art for automatic emotion recognition.
arXiv Detail & Related papers (2022-03-29T18:02:12Z)
- SOLVER: Scene-Object Interrelated Visual Emotion Reasoning Network [83.27291945217424]
We propose a novel Scene-Object interreLated Visual Emotion Reasoning network (SOLVER) to predict emotions from images.
To mine the emotional relationships between distinct objects, we first build up an Emotion Graph based on semantic concepts and visual features.
We also design a Scene-Object Fusion Module to integrate scenes and objects, which exploits scene features to guide the fusion process of object features with the proposed scene-based attention mechanism.
arXiv Detail & Related papers (2021-10-24T02:41:41Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.