Enabling Multimodal Mobile Interfaces For Musical Performance

Charles Roberts, Angus Forbes, Tobias Höllerer
2013 Zenodo  
We present research that extends the scope of the mobile application Control, a prototyping environment for defining multimodal interfaces that control real-time artistic and musical performances. Control allows users to rapidly create interfaces employing a variety of modalities, including speech recognition, computer vision, musical feature extraction, touchscreen widgets, and inertial sensor data. Information from these modalities can be transmitted wirelessly to remote applications. Interfaces are declared using JSON and can be extended with JavaScript to add complex behaviors, including the concurrent fusion of multimodal signals. By simplifying the creation of interfaces via these simple markup files, Control allows musicians and artists to make novel applications that use and combine both discrete and continuous data from the wide range of sensors available on commodity mobile devices.
doi:10.5281/zenodo.1178646
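
As a rough illustration of the approach the abstract describes, a Control-style interface could be declared as a JSON structure of widget objects and extended with a small JavaScript callback. The widget types, property names, OSC addresses, and the onvaluechange hook below are assumptions for illustration, not the exact schema documented in the paper.

// Hypothetical sketch of a declarative Control-style interface.
// Property names and callback strings are illustrative assumptions.
loadedInterfaceName = "demoInterface";
interfaceOrientation = "portrait";

pages = [[
  { "name": "volumeSlider", "type": "Slider",
    "x": 0.0, "y": 0.0, "width": 1.0, "height": 0.5,
    "address": "/synth/volume" },            // value sent wirelessly to a remote app
  { "name": "triggerButton", "type": "Button",
    "x": 0.0, "y": 0.5, "width": 1.0, "height": 0.5,
    "address": "/synth/trigger",
    // JavaScript extends the declarative JSON with custom behavior,
    // e.g. resetting the slider whenever the button is pressed
    "onvaluechange": "if (triggerButton.value == 1) { volumeSlider.setValue(0.75); }" }
]];

The intent of such a sketch is to show how a mostly declarative markup file can be augmented with imperative snippets, which is the division of labor the abstract attributes to JSON and JavaScript in Control.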