Crowd vs. experts
2014
Proceedings of the 23rd International Conference on World Wide Web - WWW '14 Companion
The results of our exploratory study provide new insights into crowdsourcing knowledge-intensive tasks. We designed and performed an annotation task on a print collection of the Rijksmuseum Amsterdam, involving experts and crowd workers in the domain-specific description of depicted flowers. We created a testbed to collect annotations from flower experts and crowd workers and analyzed these with regard to user agreement. The findings show promising results, demonstrating how, for given categories, …
doi:10.1145/2567948.2576960
dblp:conf/www/OostermanBHNDALT14
fatcat:a4zb5ofqxbdnpd45afzohdwgfy