Improving Worker Engagement Through Conversational Microtask Crowdsourcing

Sihang Qiu, Ujwal Gadiraju, Alessandro Bozzon
Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI 2020)
The rise in popularity of conversational agents has enabled humans to interact with machines more naturally. Recent work has shown that crowd workers in microtask marketplaces can complete a variety of human intelligence tasks (HITs) through conversational interfaces with output quality similar to that of traditional Web interfaces. In this paper, we investigate the effectiveness of using conversational interfaces to improve worker engagement in microtask crowdsourcing. We designed a text-based conversational agent that assists workers in task execution, and tested the performance of workers when interacting with agents having different conversational styles. We conducted a rigorous experimental study on Amazon Mechanical Turk with 800 unique workers to explore whether the output quality, worker engagement, and the perceived cognitive load of workers can be affected by the conversational agent and its conversational styles. Our results show that conversational interfaces can be effective in engaging workers, and that a suitable conversational style has the potential to further improve worker engagement. Our findings have important implications for workflow and task design with regard to better engaging workers in microtask crowdsourcing marketplaces.
doi:10.1145/3313831.3376403 dblp:conf/chi/QiuGB20