We consider crowdsourcing problems in which users are asked to provide evaluations for items; the user evaluations are then used directly, or aggregated into a consensus value. Lacking an incentive scheme, users have no motive to put effort into completing the evaluations, and may instead provide inaccurate answers. We propose incentive schemes that are truthful and cheap: truthful because the optimal user behavior consists in providing accurate evaluations, and cheap because the truthfulness is

arXiv:1608.07886v2
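The abstract mentions that user evaluations may be "aggregated into a consensus value." As a minimal sketch of that aggregation step only (the paper's actual incentive scheme is not described in this excerpt), one common outlier-robust choice is the median of the submitted evaluations; the function name and rating scale below are illustrative assumptions:

```python
from statistics import median

def consensus(evaluations):
    """Aggregate user evaluations for one item into a consensus value.

    The median is a simple, outlier-robust aggregator; it is used here
    purely to illustrate the aggregation step, not as the paper's scheme.
    """
    if not evaluations:
        raise ValueError("need at least one evaluation")
    return median(evaluations)

# Example: five users rate the same item on a hypothetical 1-5 scale.
ratings = [4, 5, 3, 4, 1]
print(consensus(ratings))  # -> 4
```

A median resists a single careless or adversarial rating (the 1 above), which is one reason consensus-based crowdsourcing pipelines often prefer it over a plain mean.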