Detecting low-quality crowdtesting workers

Ricky K. P. Mok, Weichao Li, Rocky K. C. Chang
2015 IEEE 23rd International Symposium on Quality of Service (IWQoS)
QoE crowdtesting is increasingly popular among researchers for conducting subjective assessments of different services. Experimenters can easily access a huge pool of human subjects through crowdsourcing platforms. A fundamental problem threatening the integrity of crowdtesting is detecting cheating by workers, who work without any supervision. One approach to classifying the quality of workers is analyzing their behavior during the experiments. A major challenge is to accurately analyze the mouse cursor trajectory. However, existing works usually analyze the trajectory coarsely, which cannot fully extract the information embedded in the trajectory. In this paper, we propose finer-grained cursor trajectory analysis, including submovement analysis, to identify low-quality workers. Our approach defines a set of ten worker behavior metrics to quantify different types of worker behavior. A jQuery-based library was implemented to collect the worker behavior. Moreover, four different 5-point Likert scale rating methods were employed. A number of methods, including question design, instructions, and human inspection, are used to label workers into three categories. We then apply a multiclass Naïve Bayes classifier to construct different models using all or some of the metrics and the workers' categories. Our results show that the error rate of the model trained from four metrics is at most 30% for each of the four rating methods. By combining the predictions from the four rating methods, the success rate in detecting low-quality workers is around 80%.
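To illustrate the kind of pipeline the abstract describes, the sketch below (not the authors' code) shows two of its ingredients: counting cursor "submovements" by segmenting the speed profile at local minima, a common heuristic in cursor-kinematics analysis, and a minimal Gaussian Naïve Bayes classifier over behavior metrics. The metric values, class labels, and the toy trajectory are illustrative assumptions; the paper uses ten metrics and three worker categories, and a library such as scikit-learn would normally supply the classifier.

```python
import math
from collections import defaultdict

def speeds(points):
    """Per-sample speed from (t, x, y) cursor samples."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:]):
        out.append(math.hypot(x1 - x0, y1 - y0) / (t1 - t0))
    return out

def count_submovements(points, floor=1e-3):
    """Treat each local minimum of the speed profile (a dip between two
    faster phases) as a submovement boundary; n boundaries => n+1 submovements.
    This boundary rule is one common heuristic, not the paper's exact method."""
    v = speeds(points)
    boundaries = sum(1 for i in range(1, len(v) - 1)
                     if floor < v[i] < v[i - 1] and v[i] < v[i + 1])
    return boundaries + 1

class GaussianNB:
    """Tiny multiclass Gaussian Naive Bayes (log-space, per-feature variance)."""
    def fit(self, X, y):
        groups = defaultdict(list)
        for xi, yi in zip(X, y):
            groups[yi].append(xi)
        self.priors, self.stats = {}, {}
        for c, rows in groups.items():
            n = len(rows)
            self.priors[c] = n / len(X)
            cols = list(zip(*rows))
            means = [sum(col) / n for col in cols]
            vars_ = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                     for col, m in zip(cols, means)]
            self.stats[c] = (means, vars_)
        return self

    def predict(self, x):
        best, best_lp = None, -math.inf
        for c, (means, vars_) in self.stats.items():
            lp = math.log(self.priors[c])
            for v, m, s2 in zip(x, means, vars_):
                lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
            if lp > best_lp:
                best, best_lp = c, lp
        return best

# Synthetic trajectory with one speed dip: two submovements.
traj = [(t, x, 0.0) for t, x in enumerate([0, 2, 7, 8, 13, 15])]
print(count_submovements(traj))  # → 2

# Toy behavior metrics (e.g., submovement count, dwell time) with made-up labels.
X = [[1.0, 10.0], [1.2, 9.0], [5.0, 1.0], [5.5, 0.8]]
y = ["attentive", "attentive", "low_quality", "low_quality"]
clf = GaussianNB().fit(X, y)
print(clf.predict([1.1, 9.5]))  # → attentive
```

The same `fit`/`predict` shape extends directly to three categories and ten metrics; only the training rows change.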
doi:10.1109/iwqos.2015.7404734 dblp:conf/iwqos/MokLC15 fatcat:6odnxyti6re7hntbmrwpendz7y