Robotic Musicianship - Musical Interactions Between Humans and Machines [chapter]

Gil Weinberg
2007 Human Robot Interaction  
Introduction

The Robotic Musicianship project aims to facilitate meaningful musical interactions between humans and machines, leading to novel musical experiences and outcomes. The project combines computational modelling of music perception, interaction, and improvisation with the capacity to produce acoustic responses in physical and visual manners. The motivation for this work is the hypothesis that real-time collaboration between human and robotic players can capitalize on the combination of their unique strengths to produce new and compelling music. Our goal is to combine human qualities such as musical expression and emotion with robotic traits such as powerful processing, the ability to perform sophisticated mathematical transformations, robust long-term memory, and the capacity to play accurately without practice.

A similar musical interaction can be achieved with software applications that do not involve mechanical operations. However, software-based interactive music systems are hampered by their inanimate nature, which does not provide players and audiences with the physical and visual cues that are essential for creating expressive musical interactions. For example, motion size often corresponds to loudness, and gesture location often relates to pitch. These cues provide visual feedback, help performers anticipate and coordinate their playing, and create an engaging musical experience by providing a visual connection to the generated sound. Software-based interactive music systems are also limited by the electronic reproduction and amplification of sound through speakers, which cannot fully capture the richness of acoustic sound.

Unlike these systems, the anthropomorphic musical robot we developed, named Haile, is designed to create acoustically rich interactions with humans. This acoustic richness arises from the complexities of real physical systems, as opposed to digital audio nuances that require intricate design and are limited by the fidelity and orientation of speakers. In order to create intuitive as well as inspiring social collaboration with humans, Haile is designed to analyze music based on computational models of human perception and to generate algorithmic responses that are unlikely to be played by humans ("listen like a human, improvise like a machine"). It is designed to serve as a test-bed for novel forms of musical human-machine interaction, bringing perceptual aspects of computer music into the physical world both visually and acoustically. We believe that this approach can lead to new musical experiences, and to new music, which cannot be conceived by traditional means.
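The "listen like a human, improvise like a machine" architecture can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not Haile's actual implementation: it reduces incoming (onset time, velocity) note events to simple perceptual features (inter-onset intervals and accents), then generates a rhythmic response through a stochastic transformation that a human player would be unlikely to produce. The function names, event representation, and the particular transformation are all assumptions made for illustration.

```python
# Hypothetical sketch of a perceptual-listening / algorithmic-response loop,
# in the spirit of "listen like a human, improvise like a machine".
# NOT Haile's actual code: event format and transformation are assumed.
import random

def perceive(events):
    """'Listen like a human': reduce raw (onset_time, velocity) note events
    to simple perceptual features -- inter-onset intervals and accents."""
    onsets = [t for t, _ in events]
    iois = [b - a for a, b in zip(onsets, onsets[1:])]
    accents = [v for _, v in events]
    return iois, accents

def improvise(iois, accents, density=1.5):
    """'Improvise like a machine': stochastically reshuffle and thicken the
    perceived rhythm (a stand-in for an actual response algorithm)."""
    # Build an interval pool scaled by the requested note density.
    pool = iois * int(density) + iois[: int(len(iois) * (density % 1))]
    random.shuffle(pool)
    t, response = 0.0, []
    for ioi in pool:
        t += ioi
        response.append((round(t, 3), random.choice(accents)))
    return response

if __name__ == "__main__":
    # A short human-played phrase: (onset time in seconds, MIDI-like velocity).
    human_phrase = [(0.0, 80), (0.5, 100), (1.0, 60), (1.25, 110), (2.0, 90)]
    print(improvise(*perceive(human_phrase)))
```

In a physical system like Haile, the returned event list would drive robotic striking arms rather than a synthesizer, so the response is heard acoustically and seen as motion.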
doi:10.5772/5206