Facing the music
2001
CHI '01 Extended Abstracts on Human Factors in Computing Systems
We describe a novel musical controller which acquires live video input from the user's face, extracts facial feature parameters using a computer vision algorithm, and converts these to expressive musical effects. The controller allows the user to modify synthesized or audio-filtered musical sound in real time by moving the face.
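The pipeline the abstract describes (live video → facial feature parameters → expressive musical effects) could be sketched roughly as below. This is a minimal illustration of the mapping stage only; the vision front end is stubbed out, and every parameter name and range here is an assumption, not the paper's actual design.

```python
# Hypothetical sketch of the effect-mapping stage of a face-driven musical
# controller. In the real system a computer vision algorithm would supply
# normalized facial feature parameters from live video; here we only show
# converting such parameters into musical control values.
# All names and ranges below are illustrative assumptions.

def clamp(x, lo=0.0, hi=1.0):
    """Restrict x to the interval [lo, hi]."""
    return max(lo, min(hi, x))

def face_to_effects(mouth_open, eyebrow_raise, head_tilt):
    """Map facial parameters (mouth/eyebrows in 0..1, tilt in -1..1)
    to a dictionary of synthesizer/filter control values."""
    return {
        # A wider mouth could brighten the sound: raise the filter cutoff.
        "cutoff_hz": 200.0 + 4800.0 * clamp(mouth_open),
        # Raised eyebrows could deepen vibrato (depth in semitones).
        "vibrato_depth": 0.5 * clamp(eyebrow_raise),
        # Head tilt could control stereo pan, -1 (left) to +1 (right).
        "pan": clamp(head_tilt, -1.0, 1.0),
    }

# Example: mouth half open, eyebrows neutral, slight tilt right.
effects = face_to_effects(0.5, 0.0, 0.3)
```

In a running controller this mapping would be evaluated once per video frame, with the resulting values sent to a synthesizer or audio filter in real time.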
doi:10.1145/634067.634250
dblp:conf/chi/LyonsT01
fatcat:6z5eljph7bfp3o7kniothw6rai