
Empirical Analysis Of Track Selection And Ordering In Electronic Dance Music Using Audio Feature Extraction

Thor Kell, George Tzanetakis
2013 Zenodo  
DISCUSSION AND FUTURE WORK We have proposed and demonstrated the use of automatic audio feature extraction to examine the selection and order of tracks in DJ mixes.  ...  Rather than relying on user surveys, we focus on empirical analysis based on audio feature extraction.  ... 
doi:10.5281/zenodo.1415654
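As a rough illustration of this kind of automatic audio feature extraction applied to track ordering, the following is a minimal sketch on synthetic signals. The brightness descriptor and the greedy nearest-neighbour ordering heuristic are our own assumptions for illustration, not the method of the paper above:

```python
import numpy as np

def spectral_centroid(y, sr, n_fft=2048, hop=512):
    """Mean spectral centroid in Hz over short frames: a crude brightness descriptor."""
    freqs = np.fft.rfftfreq(n_fft, 1.0 / sr)
    window = np.hanning(n_fft)
    cents = []
    for i in range(0, len(y) - n_fft, hop):
        mag = np.abs(np.fft.rfft(y[i:i + n_fft] * window))
        cents.append(np.sum(freqs * mag) / (np.sum(mag) + 1e-9))
    return float(np.mean(cents))

def greedy_order(feats):
    """Order tracks so that consecutive tracks are nearest neighbours in feature space."""
    order, rest = [0], set(range(1, len(feats)))
    while rest:
        nxt = min(rest, key=lambda i: abs(feats[i] - feats[order[-1]]))
        order.append(nxt)
        rest.remove(nxt)
    return order

sr = 22050
rng = np.random.default_rng(0)
# Three synthetic "tracks": tones of different brightness plus a little noise.
tracks = [np.sin(2 * np.pi * f * np.arange(sr * 2) / sr)
          + 0.001 * rng.standard_normal(sr * 2)
          for f in (220.0, 880.0, 440.0)]
feats = [spectral_centroid(t, sr) for t in tracks]
order = greedy_order(feats)
```

On this toy data the greedy ordering produces a smooth "mix" from the darkest to the brightest track; a real analysis would of course use many more descriptors (tempo, MFCCs, loudness) over full-length tracks.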

Dance beat tracking from visual information alone

Fabrizio Pedersoli, Masataka Goto
2020 Zenodo  
Dance beat tracking aims at detecting musical beats from a dance video by using its visual information without using its audio information (i.e., dance music).  ...  The visual analysis of dances is important to achieve general machine understanding of dances, not limited to dance music.  ...  Audio-visual features were also used in the work of Shiratori et al. [47] .  ... 
doi:10.5281/zenodo.4245456

Detecting Drops In Electronic Dance Music: Content Based Approaches To A Socially Significant Music Event

Karthik Yadati, Martha Larson, Cynthia C. S. Liem, Alan Hanjalic
2014 Zenodo  
INTRODUCTION Electronic dance music (EDM) is a popular genre of dance music which, as the name suggests, is created using electronic equipment and played in dance environments.  ...  Kell et al. [6] also apply audio content analysis to EDM in order to investigate track ordering and selection, which is usually carried out by human experts, i.e., disc jockeys (DJs).  ... 
doi:10.5281/zenodo.1417081

Musical Signal Type Discrimination based on Large Open Feature Sets

Björn Schuller, Frank Wallhoff, Dejan Arsić, Gerhard Rigoll
2006 2006 IEEE International Conference on Multimedia and Expo  
Automatic discrimination of musical signal types such as speech, singing, music, genres, or drumbeats within audio streams is of great importance, e.g. for radio broadcast stream segmentation.  ...  Next, features are added by alteration and combination within a genetic search. For classification we use Support Vector Machines, proven reliable for this task.  ...  "100 Meisterwerke der klassischen Musik", 6 CDs, 100 tracks), Electronic Dance Music (collections "Future Trance vol. 32", vol. 33, and vol. 34, 6 CDs, 126 tracks), Jazz (collection "Blue Note Jazz History  ... 
doi:10.1109/icme.2006.262724 dblp:conf/icmcs/SchullerWAR06
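For context, signal-type discrimination with a Support Vector Machine of the kind mentioned above can be sketched on synthetic descriptors. This toy example uses scikit-learn; the two features, class means, and class labels are invented for illustration, and the genetic feature search from the paper is omitted:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 200
# Hypothetical 2-D low-level descriptors per clip: (energy, modulation rate).
speech = rng.normal(loc=[0.2, 4.0], scale=0.1, size=(n, 2))  # class 0: speech-like
dance = rng.normal(loc=[0.8, 8.0], scale=0.1, size=(n, 2))   # class 1: dance-like
X = np.vstack([speech, dance])
y = np.array([0] * n + [1] * n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)  # near-perfect on this cleanly separated toy data
```

With real audio streams the feature sets are far larger and noisier, which is precisely why the paper searches over large open feature sets rather than hand-picking two descriptors.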

Segmentation And Timbre Similarity In Electronic Dance Music

Bruno Rocha, Niels Bogaards, Aline Honingh
2013 Proceedings of the SMC Conferences  
This research has been funded by the Center for Digital Humanities and the Netherlands Organisation for Scientific Research (NWO), grant no. 639.021.126.  ...  ELECTRONIC DANCE MUSIC (EDM) Electronic Dance Music (EDM) is a label that defines a metagenre encompassing a heterogeneous group of musics made with computers and electronic instruments [27].  ...  For our purposes, we have empirically selected a small number of features to describe timbre in EDM.  ... 
doi:10.5281/zenodo.850376

There's More to Groove than Bass in Electronic Dance Music: Why Some People Won't Dance to Techno

Brian C. Wesolowski, Alex Hofmann, Andreas B. Eder
2016 PLoS ONE  
The purpose of this study was to explore the relationship between audio descriptors for groove-based electronic dance music (EDM) and raters' perceived cognitive, affective, and psychomotor responses.  ...  From 198 musical excerpts (length: 15 sec.) representing 11 subgenres of EDM, 19 low-level audio feature descriptors were extracted.  ...  The Austrian Science Fund (FWF: P24546) and the University of Music and Performing Arts Vienna (mdwCall2014).  ... 
doi:10.1371/journal.pone.0163938 pmid:27798645 pmcid:PMC5087899

Audio Resynthesis on the Dancefloor: A Music Structural Approach [article]

Jan-Philipp Tauscher, Stephan Wenger, Marcus Magnor
2013 International Symposium on Vision, Modeling, and Visualization  
Introducing the alignment of rhythmic and harmonic structures during transition point detection, we employ beat tracking as the core analysis component and take human sound perception into account.  ...  We propose a method for synthesizing a novel soundtrack from an existing musical piece while preserving its structure and continuity from a music-theoretical point of view.  ...  Since one important use case of our method is the rearrangement of dance music for choreographies, our song selection features a large number of dance styles included in the official training programme  ... 
doi:10.2312/pe.vmv.vmv13.041-048 dblp:conf/vmv/TauscherWM13

Multimodal Deep Learning for Music Genre Classification

Sergio Oramas, Francesco Barbieri, Oriol Nieto, Xavier Serra
2018 Transactions of the International Society for Music Information Retrieval  
Music genre labels are useful to organize songs, albums, and artists into broader groups that share similar musical characteristics.  ...  Intermediate representations of deep neural networks are learned from audio tracks, text reviews, and cover art images, and further combined for classification.  ...  Acknowledgements This work was partially funded by the Spanish Ministry of Economy and Competitiveness under the Maria de Maeztu Units of Excellence Programme (MDM-2015-0502).  ... 
doi:10.5334/tismir.10

Multi-Modal Music Information Retrieval: Augmenting Audio-Analysis with Visual Computing for Improved Music Video Analysis [article]

Alexander Schindler
2020 arXiv   pre-print
A series of comprehensive experiments and evaluations is conducted, focused on the extraction of visual information and its application in different MIR tasks.  ...  Additionally, new visual features are introduced capturing rhythmic visual patterns. In all of these experiments the audio-based results serve as a benchmark for the visual and audio-visual approaches.  ...  It is a standard electronic dance music (EDM) track. The video is situated in a dance club. The main plot of the video is to show women dancing to the music.  ... 
arXiv:2002.00251v1

Content Based Record Label Classification for Electronic Music

Georges Naimeh, Perfecto Herrera, Minz Won
2020 Zenodo  
We also propose various ways of segmenting Electronic Music, in order to capture the most relevant features.  ...  The research presented in this dissertation aims at exploring the use of Music Information Retrieval tools and techniques for the analysis of Electronic Music record labels based on audio  ...  Nonetheless, non-audio-based electronic music similarity would entail the taxonomy used to describe those tracks without relying on the content and the analysis of the audio file itself.  ... 
doi:10.5281/zenodo.4091348

Evolving structures for electronic dance music

Arne Eigenfeldt, Philippe Pasquier
2013 Proceeding of the fifteenth annual conference on Genetic and evolutionary computation conference - GECCO '13  
We present GESMI (Generative Electronica Statistical Modeling Instrument), a software system that generates Electronic Dance Music (EDM) using evolutionary methods.  ...  Lastly, we describe how the user can use contextually-relevant target features to query the generated database of strong individual patterns.  ...  ACKNOWLEDGEMENTS This research was funded by a grant from the Canada Council for the Arts, and the Natural Sciences and Engineering Research Council of Canada.  ... 
doi:10.1145/2463372.2463415 dblp:conf/gecco/EigenfeldtP13

Neural And Music Correlates Of Music-Evoked Emotions

Konstantinos Patlatzoglou, Rafael Ramirez
2016 Zenodo  
Using fMRI data obtained from 17 individuals during a music listening session of 24 tracks (belonging to three classes: joy, fear, and neutral stimuli), along with the extraction of audio descriptors from  ...  By training multiple linear regressions, a predictive relationship is established between the extracted musical features and the BOLD activation of the fMRI images that correspond to each stimulus track.  ...  For the audio analysis and automatic extraction of the descriptors, MIRtoolbox 1.6.1 [38] was used in Matlab.  ... 
doi:10.5281/zenodo.1161286
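The regression step described above, predicting voxel activation from audio descriptors, can be sketched with simulated data. The descriptor count, the weights, and the noise level below are assumptions for illustration only, not values from the study:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# 24 stimulus tracks x 5 hypothetical audio descriptors (standardized values).
X = rng.standard_normal((24, 5))
true_w = np.array([0.5, -1.0, 0.0, 2.0, 0.3])
# Simulated BOLD response for one voxel: linear in the descriptors plus noise.
bold = X @ true_w + 0.05 * rng.standard_normal(24)
model = LinearRegression().fit(X, bold)
r2 = model.score(X, bold)   # in-sample fit quality
w_hat = model.coef_         # recovered descriptor weights
```

In the actual study one such regression would be fitted per voxel (or region), with the learned weights indicating which musical features co-vary with activation.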

Exploring structural sections in EDM DJ-created radio mixes with the help of automatic music descriptors

Vincent Zurita, Perfecto Herrera, Giuseppe Bandiera
2016 Zenodo  
In this work, structural sections based on "levels of emotional experience" in electronic dance music (EDM) are investigated by analyzing DJ-created radio mixes using automatic music descriptors in order to get audio content information.  ...  Automatic music analysis techniques: state of the art In order to extract information from big catalogues of music, tools exist that can automatically extract relevant data from the audio tracks.  ... 
doi:10.5281/zenodo.3755600

Deep Learning-Based Automatic Downbeat Tracking: A Brief Review [article]

Bijue Jia, Jiancheng Lv, Dayiheng Liu
2019 arXiv   pre-print
Previous studies either focus on feature engineering (extracting certain features by signal processing, which are semi-automatic solutions) or have some limitations: they can only model music audio  ...  Among these, downbeat tracking has been a fundamental and ongoing problem in the Music Information Retrieval (MIR) area.  ...  We do not analyze the feature extraction methods here, since they are common methods in audio signal processing for music applications.  ... 
arXiv:1906.03870v1

Content-Based Music Information Retrieval: Current Directions and Future Challenges

M.A. Casey, R. Veltkamp, M. Goto, M. Leman, C. Rhodes, M. Slaney
2008 Proceedings of the IEEE  
This paper outlines the problems of content-based music information retrieval and explores the state-of-the-art methods using audio cues (e.g., query by humming, audio fingerprinting, content-based music  ...  Some of the music collections available are approaching the scale of ten million tracks, and this has posed a major challenge for searching, retrieving, and organizing music content.  ...  Because music notation is more symbolic than audio, the feature extraction and analysis are different. However, the matching of derived features can be very similar in both domains.  ... 
doi:10.1109/jproc.2008.916370