We present research that extends the scope of the mobile application Control, a prototyping environment for defining multimodal interfaces that control real-time artistic and musical performances. Control allows users to rapidly create interfaces employing a variety of modalities, including speech recognition, computer vision, musical feature extraction, touchscreen widgets, and inertial sensor data. Information from these modalities can be transmitted wirelessly to remote applications. Interfaces …

doi:10.5281/zenodo.1178646
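The abstract notes that data from these modalities can be transmitted wirelessly to remote applications; tools in this space commonly use OSC (Open Sound Control) messages over UDP for that purpose. As an illustration only — a minimal sketch, not Control's actual implementation — the following self-contained Python snippet encodes and decodes a simple OSC message carrying three float arguments, such as an accelerometer reading:

```python
import struct

def build_osc_message(address: str, floats) -> bytes:
    """Encode an OSC message with only float ('f') arguments.

    OSC strings are null-terminated and padded to a multiple of 4 bytes;
    floats are 32-bit big-endian.
    """
    def pad(s: bytes) -> bytes:
        return s + b'\0' * (4 - len(s) % 4)

    tags = ',' + 'f' * len(floats)
    return (pad(address.encode('ascii'))
            + pad(tags.encode('ascii'))
            + b''.join(struct.pack('>f', f) for f in floats))

def parse_osc_message(data: bytes):
    """Decode an OSC message, supporting only float ('f') arguments."""
    def read_string(buf: bytes, offset: int):
        end = buf.index(b'\0', offset)
        s = buf[offset:end].decode('ascii')
        # skip padding up to the next 4-byte boundary
        return s, (end + 4) & ~3

    address, offset = read_string(data, 0)
    type_tags, offset = read_string(data, offset)
    args = []
    for tag in type_tags.lstrip(','):
        if tag != 'f':
            raise ValueError(f"unsupported type tag: {tag}")
        (value,) = struct.unpack_from('>f', data, offset)
        offset += 4
        args.append(value)
    return address, args

# The address '/accelerometer' is a hypothetical example, not a
# documented Control namespace.
msg = build_osc_message('/accelerometer', [0.1, -0.5, 9.8])
addr, values = parse_osc_message(msg)
print(addr, [round(v, 1) for v in values])
```

In practice such a payload would arrive over a UDP socket from the phone; only the message framing is sketched here.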