Friday August 9, 2024 3:00pm - 5:00pm IST
Authors - Avni Uplabdhee, Vaishali Singh, Palak Jain, K.R Seeja
Abstract - Emotions are inherently complex, and music has the remarkable ability to evoke multiple emotions in listeners simultaneously. Traditional emotion detection methods, which recognize only a single emotion at a time, may not adequately capture this richness and complexity in music. To address this challenge, a comparative analysis of several multi-label emotion detection classifiers has been conducted. These classifiers include RAkEL, multi-label backpropagation, Binary Relevance (BR), Two-Label Relevance (2BR), Label Powerset, ranking by pairwise comparison, and calibrated label ranking. The primary objective of this study is to evaluate and compare the effectiveness of these classifiers in detecting multiple emotions concurrently in music. By assessing their performance across different datasets and scenarios, we aim to identify the strengths and weaknesses of each classifier in handling the complexity of emotional expression in music. Examining how each classifier handles musical elements such as tempo, key, instrumentation, and lyrical content offers deeper insight into the nuances of emotional expression in music. This comparison reveals which classifiers are better suited to capturing the nuanced interplay of emotions present in music, knowledge that can inform the development of more sophisticated and accurate emotion detection systems and ultimately enhance our understanding and appreciation of the emotional impact of music.
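
The sketch below is illustrative only and is not taken from the paper: it shows how two of the problem-transformation strategies named in the abstract, Binary Relevance and Label Powerset, might be compared on multi-label data using scikit-learn. The synthetic dataset, the random-forest base classifier, and the micro-F1/Hamming-loss metrics are assumptions chosen for demonstration, not the authors' experimental setup.

import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier

# Synthetic stand-in for audio features (tempo, key, timbre, ...) and emotion tags.
X, Y = make_multilabel_classification(n_samples=600, n_features=20,
                                      n_classes=6, n_labels=2, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# Binary Relevance: one independent binary classifier per emotion label.
br = MultiOutputClassifier(RandomForestClassifier(random_state=0)).fit(X_tr, Y_tr)
Y_br = br.predict(X_te)

# Label Powerset: treat each distinct label combination seen in training
# as a single class and solve one multiclass problem.
combos = {tuple(row): i for i, row in enumerate(np.unique(Y_tr, axis=0))}
inverse = {i: np.array(c) for c, i in combos.items()}
y_lp = np.array([combos[tuple(row)] for row in Y_tr])
lp = RandomForestClassifier(random_state=0).fit(X_tr, y_lp)
Y_lp = np.vstack([inverse[c] for c in lp.predict(X_te)])

# Compare the two strategies with example-based multi-label metrics.
for name, pred in [("Binary Relevance", Y_br), ("Label Powerset", Y_lp)]:
    print(f"{name}: micro-F1={f1_score(Y_te, pred, average='micro'):.3f}, "
          f"Hamming loss={hamming_loss(Y_te, pred):.3f}")

In this framing, Binary Relevance ignores correlations between emotion labels, while Label Powerset models label combinations jointly at the cost of a larger class space; the methods listed in the abstract trade off these concerns in different ways.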
Paper Presenter
Virtual Room A, Goa, India
