A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2016; you can also visit the original URL.
How to measure and model the similarity between different music items is one of the most fundamental yet challenging research problems in music information retrieval. This paper demonstrates a novel multimodal and adaptive music similarity measure (CompositeMap) with its application in a personalized multimodal music search system. CompositeMap can effectively combine music properties from different aspects into compact signatures via supervised learning, which lays the foundation for …

doi:10.1145/1631272.1631474
dblp:conf/mm/ZhangXWS09
fatcat:m6sl5qh2yrhh3eeaw6u5l47mje