A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
The file type is application/pdf.
Saliency-driven unstructured acoustic scene classification using latent perceptual indexing
2009
2009 IEEE International Workshop on Multimedia Signal Processing
Automatic acoustic scene classification of real-life, complex, and unstructured acoustic scenes is a challenging task, as the number of acoustic sources present in the audio stream is unknown and the sources overlap in time. In this work, we present a novel approach to classifying such unstructured acoustic scenes. Motivated by the bottom-up attention model of the human auditory system, salient events of an audio clip are extracted in an unsupervised manner and presented to the classification system.
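To make the idea of unsupervised salient-event extraction concrete, the following is a minimal sketch, not the authors' auditory attention model: it uses a simple spectral-novelty proxy for saliency and picks out time segments whose novelty exceeds a relative threshold. The function name, window sizes, and threshold are assumptions chosen for illustration only.

    # Illustrative sketch: unsupervised extraction of "salient" segments from an
    # audio clip via a spectral-novelty proxy (an assumption, not the paper's model).
    import numpy as np
    from scipy.signal import stft

    def extract_salient_segments(x, fs, win_s=0.025, hop_s=0.010, rel_thresh=0.5):
        """Return (start, end) times of segments whose spectral novelty exceeds
        rel_thresh times the clip's maximum novelty."""
        nperseg = int(win_s * fs)
        noverlap = nperseg - int(hop_s * fs)
        _, t, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=noverlap)
        mag = np.abs(Z)                              # magnitude spectrogram (freq x frames)
        # Spectral novelty: half-wave rectified frame-to-frame increase in energy.
        diff = np.diff(mag, axis=1)
        novelty = np.maximum(diff, 0.0).sum(axis=0)
        novelty = np.concatenate([[0.0], novelty])   # align with frame times
        salient = novelty > rel_thresh * novelty.max()
        # Merge consecutive salient frames into (start, end) time segments.
        segments, start = [], None
        for i, flag in enumerate(salient):
            if flag and start is None:
                start = t[i]
            elif not flag and start is not None:
                segments.append((start, t[i]))
                start = None
        if start is not None:
            segments.append((start, t[-1]))
        return segments

    # Usage example: a 1 s noise clip with a louder tonal burst in the middle.
    fs = 16000
    x = 0.01 * np.random.randn(fs)
    x[6000:8000] += np.sin(2 * np.pi * 1000 * np.arange(2000) / fs)
    print(extract_salient_segments(x, fs))

In the paper's pipeline, the extracted salient events, rather than the full audio stream, are what get indexed and presented to the classifier; the sketch above only illustrates the segment-selection step under the stated assumptions.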
doi:10.1109/mmsp.2009.5293267
dblp:conf/mmsp/KalinliSN09
fatcat:5y2m36c57bgs5azzj4lnvg3yre