A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2018; you can also visit the original URL.
We present an adaptive musical collaboration framework for interaction between a human and a robot. The aim of our work is to develop a system that receives feedback from the user in real time and learns the music progression style of the user over time. To tackle this problem, we represent a song as a hierarchically structured sequence of music primitives. By exploiting the sequential constraints of these primitives inferred from the structural information combined with user feedback, we show

doi:10.1109/roman.2015.7333649 dblp:conf/ro-man/SarabiaLD15 fatcat:qdjqiwtaybg43fwabzfhfudj3i
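To make the idea of a hierarchically structured sequence of music primitives concrete, here is a minimal sketch in Python. The paper's actual representation and learning method are not specified in this record; every class, field, and function name below is a hypothetical illustration, showing only how a song could be flattened into an ordered primitive sequence and how simple sequential constraints (which primitive may follow which) could be inferred from it.

```python
# Illustrative sketch only; all names are hypothetical, not the paper's API.
from dataclasses import dataclass
from typing import Dict, List, Set


@dataclass
class Primitive:
    name: str  # e.g. a chord label or short melodic motif


@dataclass
class Phrase:
    primitives: List[Primitive]


@dataclass
class Song:
    # A song as a hierarchy: a sequence of phrases, each a sequence of primitives.
    phrases: List[Phrase]

    def primitive_sequence(self) -> List[str]:
        # Flatten the hierarchy into the ordered sequence of primitive names.
        return [p.name for ph in self.phrases for p in ph.primitives]


def sequential_constraints(seq: List[str]) -> Dict[str, Set[str]]:
    # Infer, from observed order, which primitives may follow each primitive.
    follows: Dict[str, Set[str]] = {}
    for a, b in zip(seq, seq[1:]):
        follows.setdefault(a, set()).add(b)
    return follows


song = Song([Phrase([Primitive("C"), Primitive("F")]),
             Phrase([Primitive("G"), Primitive("C")])])
seq = song.primitive_sequence()           # ["C", "F", "G", "C"]
constraints = sequential_constraints(seq)  # {"C": {"F"}, "F": {"G"}, "G": {"C"}}
```

A real system would of course learn richer constraints (weighted transitions, user-specific preferences updated from real-time feedback) rather than bare follow-sets, but the flatten-then-constrain structure is the same.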