Crowdsourcing and Human-Centred Experiments (Dagstuhl Seminar 15481)
2016
Dagstuhl Reports
This report documents the program and the outcomes of Dagstuhl Seminar 15481 "Evaluation in the Crowd: Crowdsourcing and Human-Centred Experiments". Human-centred empirical evaluations play important roles in the fields of human-computer interaction, visualization, graphics, multimedia, and psychology. The advent of crowdsourcing platforms, such as Amazon Mechanical Turk or Microworkers, has provided a revolutionary methodology for conducting human-centred experiments. Through such platforms, […]
doi:10.4230/dagrep.5.11.103
dblp:journals/dagstuhl-reports/ArchambaultHP15
fatcat:czy7tiam2bg3vm2m4yf5vdkmgi