Facing the music

Michael J. Lyons, Nobuji Tetsutani
CHI '01 Extended Abstracts on Human Factors in Computing Systems, 2001
We describe a novel musical controller that acquires live video of the user's face, extracts facial feature parameters with a computer vision algorithm, and converts them to expressive musical effects. The controller allows the user to modify synthesized or audio-filtered musical sound in real time by moving their face.
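The abstract describes a pipeline of the form video frame → facial feature parameters → musical control values. As a hedged sketch of the final mapping stage (not the authors' implementation), one plausible choice is to normalize a feature such as mouth openness and quantize it to a 7-bit MIDI control-change value; the face-tracking step is stubbed out here, since a real controller would extract the ratio from live video with a vision library:

```python
def mouth_to_control(mouth_open_ratio: float,
                     lo: float = 0.0, hi: float = 0.5) -> int:
    """Map a normalized mouth-opening ratio onto a 7-bit MIDI CC value (0-127).

    `lo` and `hi` are hypothetical calibration bounds for the feature;
    values outside the range are clamped before scaling.
    """
    clamped = max(lo, min(hi, mouth_open_ratio))
    return round((clamped - lo) / (hi - lo) * 127)

if __name__ == "__main__":
    # Per-frame loop sketch: feature value -> controller value for the synth.
    for ratio in (0.0, 0.25, 0.5):
        print(mouth_to_control(ratio))  # 0, 64, 127
```

A mapping like this would run once per video frame, with the resulting value sent as a continuous controller message to the synthesizer or audio filter.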
doi:10.1145/634067.634250 dblp:conf/chi/LyonsT01