"Local Crowdsourcing for Audio Annotation: the Elevator Annotator Platform"

Themistoklis Karavellas, Anggarda Prameswari, Oana Inel, Victor de Boer
Human Computation, 2019
Crowdsourcing is useful for collecting large numbers of annotations for various datasets. Local crowdsourcing is a variant in which annotations are made at specific physical locations. This paper describes a local crowdsourcing concept, platform, and experiment for gathering annotations for an audio archive. For the experiment, we developed a hardware platform and its supporting software, designed to be deployed in building elevators. To evaluate the effectiveness of the platform and to test the impact of location and interaction interface on the annotation results, we set up an experiment in two locations, using two different user interaction modalities in each location. Our results show that our simple local crowdsourcing setup achieves significant accuracy levels and generates up to 4 annotations per hour, and they also reveal a correlation between location and accuracy.
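The paper does not publish its platform code, but the described setup (a kiosk that plays an archive clip and records a rider's yes/no judgment) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the clip names, the speech/no-speech question, the button-press (keyboard) modality, and the CSV log format are hypothetical, not taken from the Elevator Annotator implementation.

```python
import csv
import time
from datetime import datetime, timezone

# Hypothetical elevator-annotation kiosk loop (illustrative only).
CLIPS = ["clip_001.wav", "clip_002.wav", "clip_003.wav"]  # assumed archive clips
LOG_PATH = "annotations.csv"

def play_clip(path: str) -> None:
    # Placeholder for audio playback on the kiosk hardware; real playback
    # depends on the deployed platform.
    print(f"[playing] {path}")
    time.sleep(1)  # stand-in for the clip's duration

def ask_rider() -> str:
    # Button-press modality: one yes/no question per clip (assumed question).
    answer = ""
    while answer not in ("y", "n"):
        answer = input("Did you hear speech in this clip? [y/n] ").strip().lower()
    return "speech" if answer == "y" else "no_speech"

def main() -> None:
    # Append one timestamped annotation row per clip to the log.
    with open(LOG_PATH, "a", newline="") as f:
        writer = csv.writer(f)
        for clip in CLIPS:
            play_clip(clip)
            label = ask_rider()
            writer.writerow([datetime.now(timezone.utc).isoformat(), clip, label])
            print(f"[logged] {clip} -> {label}")

if __name__ == "__main__":
    main()
```

Timestamping each annotation, as above, is what would let an experimenter compute throughput figures such as the paper's reported rate of up to 4 annotations per hour.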
doi:10.15346/hc.v6i1.1