Hand And Finger Motion-Controlled Audio Mixing Interface

by Jarrod Ratcliffe

Published by Zenodo.

2014  

Abstract

This paper presents a control surface interface for music mixing using real-time computer vision. Two input sensors are considered: the Leap Motion and the Microsoft Kinect. The author presents significant design considerations, including improving the user's sense of depth and panorama, maintaining broad accessibility by integrating the system with Digital Audio Workstation (DAW) software, and implementing a system that is portable and affordable. To give the user a heightened sense of sound spatialization compared with the traditional channel strip, the concept of depth is addressed directly using the stage metaphor. Sound sources are represented as colored spheres in a graphical user interface to provide visual feedback. Moving sources backward and forward controls volume, while moving them left and right controls panning. For broader accessibility, the interface is configured to control mixing within the Ableton Live DAW. The author also discusses future plans to expand functionality and evaluate the system.
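The stage-metaphor mapping described above can be sketched in a few lines. The following is a hypothetical illustration, not the paper's implementation: the function name, coordinate ranges, and linear gain law are all assumptions made here for clarity.

```python
# Hypothetical sketch of the stage metaphor described in the abstract:
# a source's position on a virtual stage maps to mixing parameters.
# Depth (front/back) controls volume; the horizontal axis controls panning.
# Ranges and the linear gain law are illustrative assumptions.

def stage_to_mix(x, depth, stage_width=1.0, stage_depth=1.0):
    """Map a source position on the stage to (gain, pan).

    x     -- horizontal position, 0 (left edge) to stage_width (right edge)
    depth -- distance from the listener, 0 (front) to stage_depth (back)

    Returns gain in [0, 1] (front of stage = loudest) and pan in
    [-1, 1] (-1 = hard left, +1 = hard right).
    """
    # Clamp inputs to the stage, then map linearly.
    gain = 1.0 - max(0.0, min(depth, stage_depth)) / stage_depth
    pan = 2.0 * max(0.0, min(x, stage_width)) / stage_width - 1.0
    return gain, pan
```

For example, a source at center stage front (`x=0.5, depth=0.0`) yields full gain and center pan; pushing it to the back-left corner fades it out and pans it hard left. A real system would likely use a perceptual (e.g., logarithmic) gain curve rather than this linear one.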

Archived Files and Locations

application/pdf   1.0 MB
file_wcxtntgomfarpgyzy3gegvjp2i
zenodo.org (repository)
web.archive.org (webarchive)
Type  article-journal
Stage   published
Date   2014-06-01
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: 1d0d3a4a-f597-4e09-ba25-3618a33d0a28