Gesture-based control of physical modeling sound synthesis
2013
Proceedings of the 21st ACM International Conference on Multimedia (MM '13)
We address the issue of mapping between gesture and sound for gesture-based control of physical modeling sound synthesis. We propose an approach called mapping by demonstration, allowing users to design the mapping by performing gestures while listening to sound examples. The system is based on a multimodal model able to learn the relationships between gestures and sounds.
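To illustrate the idea of learning a gesture-to-sound mapping from paired demonstration data, here is a minimal sketch using a joint-Gaussian regression: fit a Gaussian over concatenated (gesture feature, sound parameter) pairs, then map a new gesture via the conditional mean. This is a hypothetical stand-in for the paper's multimodal model, not the authors' implementation; all names and the synthetic demonstration data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "demonstration": one gesture feature g performed while listening
# to a sound whose parameter s follows s = 2*g + 1 (plus small noise).
# (Hypothetical data; the real system records actual gesture/sound streams.)
g = rng.uniform(-1.0, 1.0, size=(200, 1))
s = 2.0 * g + 1.0 + 0.01 * rng.normal(size=(200, 1))
data = np.hstack([g, s])

# Fit a joint Gaussian over [gesture, sound] pairs.
mu = data.mean(axis=0)
cov = np.cov(data, rowvar=False)

def map_gesture(g_new):
    """Conditional mean E[s | g] = mu_s + Sigma_sg * Sigma_gg^-1 * (g - mu_g)."""
    return mu[1] + cov[1, 0] / cov[0, 0] * (g_new - mu[0])

# A new gesture value is mapped to a sound-synthesis parameter.
print(map_gesture(0.5))
```

The conditional-Gaussian step is the simplest instance of regressing sound parameters from gesture input; the paper's multimodal model plays the analogous role for full gesture and sound sequences.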
doi:10.1145/2502081.2502262
dblp:conf/mm/FrancoiseSB13
fatcat:itwbftotrzdp3dwbw2gof3vxfy