Crowdsourcing and Human-Centred Experiments (Dagstuhl Seminar 15481)

Daniel Archambault, Tobias Hoßfeld, Helen C. Purchase, Marc Herbstritt
Dagstuhl Reports, 2016
This report documents the program and the outcomes of Dagstuhl Seminar 15481 "Evaluation in the Crowd: Crowdsourcing and Human-Centred Experiments". Human-centred empirical evaluations play important roles in the fields of human-computer interaction, visualization, graphics, multimedia, and psychology. The advent of crowdsourcing platforms, such as Amazon Mechanical Turk or Microworkers, has provided a revolutionary methodology for conducting human-centred experiments. Through such platforms, experiments can now collect data from hundreds, even thousands, of participants drawn from a diverse user community over a matter of weeks, greatly increasing the ease with which we can collect data as well as the power and generalizability of experimental results. However, such an experimental platform does not come without its problems: ensuring participant investment in the task, defining experimental controls, and understanding the ethics of deploying such experiments en masse. The major interests of the seminar participants were addressed in different working groups: (W1) …
doi:10.4230/dagrep.5.11.103
dblp:journals/dagstuhl-reports/ArchambaultHP15