
Music Mood Annotator Design and Integration

Cyril Laurier, Owen Meyers, Joan Serra, Martin Blech, Perfecto Herrera
2009 2009 Seventh International Workshop on Content-Based Multimedia Indexing  
A robust and efficient technique for automatic music mood annotation is presented.  ...  In addition, the integration of a fast and scalable version of this technique with the European Project PHAROS is discussed.  ...  We are very grateful to all the human annotators that helped to create our ground truth.  ... 
doi:10.1109/cbmi.2009.45 dblp:conf/cbmi/LaurierMSBH09 fatcat:clyaidcgivaq7nqzkxnhjumene

A Multimedia Search And Navigation Prototype, Including Music And Video-Clips

Geoffroy Peeters, Frédéric Cornu, Christophe Charbuillet, Damien Tardieu, Juan José Burred, Marie Vian, Valérie Botherel, Jean-Bernard Rault, Jean-Philippe Cabanal
2012 Zenodo  
OVERALL DESIGN PROCESS Figure 1 represents the various elements of work (and the interactions/dependencies between them) needed to integrate the music technologies in the prototype.  ...  For the other tags (mood, instrumentation), 4000 tracks have been manually annotated by two individual professional annotators.  ... 
doi:10.5281/zenodo.1417760 fatcat:rntgn53s2re2baw5k3uclcft7a

Designing emotion awareness interface for group recommender systems

Yu Chen, Pearl Pu
2014 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces - AVI '14  
We then show that it allows users to annotate and visualize group members' emotions in GroupFun, a group music recommender.  ...  We first describe the design process behind an emotion annotation tool, which we call CoFeel.  ...  We then integrate it into GroupFun, a group music recommender system, which allows users to annotate individual emotions and view group emotions.  ... 
doi:10.1145/2598153.2600034 dblp:conf/avi/ChenP14 fatcat:slugdsmy2bdlbejvkis6t76qpy

An Accurate Algorithm for Generating a Music Playlist based on Facial Expressions

Anukriti Dureha
2014 International Journal of Computer Applications  
Manual segregation of a playlist and annotation of songs, in accordance with the current emotional state of a user, is labor intensive and time consuming.  ...  The algorithm proposed in this paper aims to reduce the overall computational time and cost of the designed system, and to increase its accuracy.  ...  K. McKay et al. [17] designed XPod, a human activity and emotion aware music player.  ... 
doi:10.5120/17557-8163 fatcat:vczoqsszdncnxg2syqotf4ttdq

Knowledge based Semantic Annotation Generation of Music

Sunitha Abburu
2012 International Journal of Computer Applications  
This raises the need for an ontology based annotation generation tool for film songs. The current research designs and implements a tool, M-SAGT (Music Semantic Annotation Generation Tool).  ...  The storage capacity and low cost of storage devices give rise to voluminous music collections that must be managed.  ...  This paper discusses the design and implementation of the music semantic annotation generation tool M-SAGT.  ... 
doi:10.5120/7206-9990 fatcat:l52l55iywrbwtnpvdnaoevcbxu

Gathering a Dataset of Multi-Modal Mood-Dependent Perceptual Responses to Music

Matevz Pesek, Primoz Godec, Mojca Poredos, Gregor Strle, Joze Guna, Emilija Stojmenova, Matevz Pogacnik, Matija Marolt
2014 User Modeling, Adaptation, and Personalization  
The paper presents a new dataset that captures the effect of mood on visual and auditory perception of music.  ...  With an online survey, we have collected a dataset of over 6600 responses capturing users' mood, emotions evoked and expressed by music and the perception of color with regard to emotions and music.  ...  To our knowledge, no currently available music-mood dataset has such a high ratio of user annotations per music excerpt.  ... 
dblp:conf/um/PesekGPSGSPM14 fatcat:cbto5agfm5fmvdopcfcjgztnyy

M-GWAP: An Online and Multimodal Game With A Purpose in WordPress for Mental States Annotation [article]

Fabio Paolizzo
2019 arXiv   pre-print
M-GWAP is a multimodal game with a purpose that leverages the wisdom-of-crowds phenomenon for the annotation of multimedia data in terms of mental states.  ...  The current version of the game was deployed after alpha and beta testing helped refine it.  ...  This allows increasing the number of annotations for the snippets in the Musical-Moods database that will be annotated using M-GWAP.  ... 
arXiv:1905.12884v1 fatcat:fcguzi67ybagvo65ygqnnbuwl4

Mood-Ex-Machina: Towards Automation Of Moody Tunes

Sten Govaerts, Nik Corthaut, Erik Duval
2007 Zenodo  
We gratefully acknowledge the financial support of the IWT Vlaanderen through the project Music Metadata Generation (IWT-060237).  ...  mode and chords, and last we derive association rules out of the annotated music.  ...  Current work includes a conceptual design stage of human algorithms for mood classification in the form of a collaborative, competitive game [4].  ... 
doi:10.5281/zenodo.1415918 fatcat:ctgjq2hs3jfermwbtsxi55um2m


Rajiv Ratn Shah, Yi Yu, Roger Zimmermann
2014 Proceedings of the ACM International Conference on Multimedia - MM '14  
First, we predict scene moods from a real-world video dataset that was collected from users' daily outdoor activities.  ...  Capturing videos anytime and anywhere, and then instantly sharing them online, has become a very popular activity.  ...  Suhua Tang and the anonymous reviewers for their insightful and constructive suggestions to improve the quality of this work.  ... 
doi:10.1145/2647868.2654919 dblp:conf/mm/ShahYZ14 fatcat:w6miwqxozbfidj5slk2zvdeb74

Slides for the PhD defence of MIRages: an account of music audio extractors, semantic description and context-awareness, in the three ages of MIR

Perfecto Herrera
2018 Zenodo  
Indexing Music by Mood: Design and Integration of an Automatic Content-based Annotator. Multimedia Tools and Applications, 48(1), 161-184.  ...  and Vamp plugins for easy extension/integration/prototyping  ...  Koelsch, S., Skouras, S., Fritz, T., Herrera, P., Bonhage, C., Küssner, M. B., et al. (2013).  ... 
doi:10.5281/zenodo.2527496 fatcat:ext24euysvbdrefz2xhyrnkgqi

Facial Emotion Based Music Recommendation System using computer vision and machine learning techniques

ShanthaShalini. K, et al.
2021 Turkish Journal of Computer and Mathematics Education  
By integrating feature extraction and machine learning techniques, emotions are detected from the real face, and once the mood is derived from the input image, respective songs for the specific mood  ...  For experimental results, we use OpenCV for emotion detection and music recommendation.  ...  This research paper suggested MoodPlay, a music recommendation framework that takes into account both the user's mood and the music they are listening to.  ... 
doi:10.17762/turcomat.v12i2.1101 fatcat:4gageuy475ag7gbgqn4em74fiy

Introducing A Dataset Of Emotional And Color Responses To Music

Matevz Pesek, Primoz Godec, Mojca Poredos, Gregor Strle, Joze Guna, Emilija Stojmenova, Matevz Pogacnik, Matija Marolt
2014 Zenodo  
Serra, "Indexing music by mood: design and integration of an automatic content-based annotator," Multimedia Tools Appl., vol. 48, pp. 161-184, 2010.  ...  To our knowledge, no currently available mood-music dataset has such a high ratio of user annotations per music excerpt.  ... 
doi:10.5281/zenodo.1415925 fatcat:wjnxrc4uqnd6fokztp7ghflb5m

From Sensors to Songs: A Learning-Free Novel Music Recommendation System using Contextual Sensor Data

Abhishek Sen, Martha A. Larson
2015 ACM Conference on Recommender Systems  
This paper motivates and describes the design for a mobile application along with a description of tests that will be carried out for validation.  ...  Even with all the music content available on the web and commercial music streaming services, discovering new music remains a time consuming and taxing activity for the average user.  ...  SoundCloud has a music database of over 100 million songs, which are richly annotated with tags.  ... 
dblp:conf/recsys/SenL15 fatcat:zgqyhwhfdjf5jplzcl3424cpiq

Mood Glove: A haptic wearable prototype system to enhance mood music in film

Antonella Mazzoni, Nick Bryan-Kinns
2016 Entertainment Computing  
We present the design and implementation of a haptic wearable prototype system which aims to amplify mood music in film through haptic sensations (vibrotactile feedback).  ...  This is an exploratory work aimed at enhancing mood music in film entertainment.  ...  Acknowledgment This work is supported by the EPSRC Doctoral Training Centre in Digital Music and Media for the Creative Economy (EP/ G03723X/1).  ... 
doi:10.1016/j.entcom.2016.06.002 fatcat:g3j4ykivrzbkzkwwqnbset6qmu

Music emotion classification for Turkish songs using lyrics

Ahmet Onur Durahim, Abide Coşkun Setirek, Birgül Başarır Özel, Hanife Kebapçı
2018 Pamukkale University Journal of Engineering Sciences  
For this purpose, first 300 songs are selected and annotated by human taggers with respect to their perceived emotions.  ...  Consequently, the form of music retrieval has changed from catalogue-based searches to searches based on emotion tags, enabling easy and effective music information access.  ...  of up to 0.70 and 0.50 with arousal and valence annotations for music mood classification, respectively.  ... 
doi:10.5505/pajes.2017.15493 fatcat:mwxmp6jz3zaxzj6fd2b3vl5bm4