Crowdsourcing graphical perception

Jeffrey Heer, Michael Bostock
Proceedings of the 28th International Conference on Human Factors in Computing Systems (CHI '10), 2010
Understanding perception is critical to effective visualization design. With its low cost and scalability, crowdsourcing presents an attractive option for evaluating the large design space of visualizations; however, it first requires validation. In this paper, we assess the viability of Amazon's Mechanical Turk as a platform for graphical perception experiments. We replicate previous studies of spatial encoding and luminance contrast and compare our results. We also conduct new experiments on rectangular area perception (as in treemaps or cartograms) and on chart size and gridline spacing. Our results demonstrate that crowdsourced perception experiments are viable and contribute new insights for visualization design. Lastly, we report cost and performance data from our experiments and distill recommendations for the design of crowdsourced studies.
doi:10.1145/1753326.1753357 dblp:conf/chi/HeerB10