Identifying and Accounting for Task-Dependent Bias in Crowdsourcing

Ece Kamar, Ashish Kapoor, Eric Horvitz
2015 AAAI Conference on Human Computation & Crowdsourcing  
Models for aggregating contributions by crowd workers have been shown to be challenged by the rise of task-specific biases and errors. Task-dependent errors in assessment can shift the majority opinion of even large numbers of workers to an incorrect answer. We introduce and evaluate probabilistic models that can detect and correct task-dependent bias automatically. First, we show how to build and use probabilistic graphical models that jointly model task features, workers' biases, worker contributions, and ground-truth answers of tasks so that task-dependent bias can be corrected. Second, we show how the approach can perform a type of transfer learning among workers to address the issue of annotation sparsity. We evaluate models of varying complexity on a large data set collected from a citizen science project and show that they are effective at correcting task-dependent worker bias. Finally, we investigate the use of active learning to guide the acquisition of expert assessments, enabling automatic detection and correction of worker bias.
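As a rough illustration of the core problem the abstract describes (not the paper's actual graphical model), the sketch below shows how a shared task-dependent bias can push a naive majority vote to the wrong answer, and how an aggregator that knows the per-task flip rate can correct it. The `flip_rate` parameter is a hypothetical stand-in for the bias a model would infer from task features and expert labels.

```python
import math
from collections import Counter

def majority_vote(labels):
    """Naive aggregation: the most common label wins."""
    return Counter(labels).most_common(1)[0][0]

def bias_corrected_vote(labels, flip_rate):
    """Bias-aware aggregation for binary labels when all workers share
    a known task-dependent probability `flip_rate` of flipping the true
    label. Each observed label contributes evidence weighted by the
    log-likelihood ratio log((1 - p) / p); when p > 0.5 the weight is
    negative, so a biased majority is effectively inverted."""
    w = math.log((1 - flip_rate) / flip_rate)
    score = sum(w if lab == 1 else -w for lab in labels)
    return 1 if score > 0 else 0

# On a task type where workers flip the true label 70% of the time,
# most workers report 0 even though the true answer is 1.
labels = [0, 0, 0, 1, 0]
print(majority_vote(labels))            # majority is the biased answer: 0
print(bias_corrected_vote(labels, 0.7)) # correction recovers: 1
```

With an unbiased-but-noisy crowd (`flip_rate` below 0.5), the corrected vote agrees with the majority; the inversion only kicks in when the estimated bias makes the majority itself unreliable, which is the regime the paper targets.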
dblp:conf/hcomp/KamarKH15