An evaluation methodology for crowdsourced design

Hao Wu, Jonathan Corney, Michael Grant
2015 Advanced Engineering Informatics  
Wu, Hao, Corney, Jonathan and Grant, Michael (2015) An evaluation methodology for crowdsourced design. Advanced Engineering Informatics, 29 (4), pp. 775-786. ISSN 1474-0346.

Abstract: In recent years the "power of the crowd" has been repeatedly demonstrated, and various Internet platforms have been used to support applications of collaborative intelligence in tasks ranging from open innovation to image analysis. However, crowdsourcing applications in the fields of design research and creative innovation have been much slower to emerge. So, although there have been reports of systems and researchers using Internet crowdsourcing to carry out generative design, there are still many gaps in knowledge about the capabilities and limitations of the technology. Indeed, the process models developed to support traditional commercial design (e.g. Pugh's Total Design, Agile, Double Diamond) have yet to be established for Crowdsourced Design (cDesign). As a contribution to the development of such a general model, this paper proposes a cDesign framework to support the creation of crowdsourced design activities. Within the cDesign framework, the effective evaluation of design quality is identified as a key component that not only enables the leveraging of a large, virtual workforce's creative activities but is also fundamental to almost all iterative optimisation processes. This paper reports an experimental investigation into two different crowdsourced design evaluation approaches: free evaluation and 'Crowdsourced Design Evaluation Criteria' (cDEC). The results are benchmarked against a 'manual' evaluation carried out by a panel of experienced designers. The results suggest that the cDEC approach produces design rankings that correlate strongly with the judgements of an "expert panel". The paper concludes that the cDEC assessment methodology demonstrates how crowdsourcing can be effectively used to evaluate, as well as generate, new design solutions.

Keywords: crowdsourcing; crowdsourced design methodology; design evaluation; crowdsourced design evaluation criteria; collaborative design; human-based genetic algorithm

... assessment of a crowdsourced design task's sensitivity to payment and evaluation methods (Section 3); the results of these experiments are then presented (Section 4). In Section 5 the results of the experimental prototype are benchmarked against an 'expert panel's' evaluation of the results, and the paper ends with conclusions and recommendations for future work in Section 6.
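The abstract's central benchmark is a comparison of two rankings of the same designs: one produced by the crowd (via cDEC) and one by the expert panel. A standard way to quantify how strongly two rankings agree is a rank-correlation statistic such as Spearman's rho. The sketch below is illustrative only and is not taken from the paper; the design scores and names are hypothetical.

```python
# Illustrative sketch: Spearman rank correlation between a crowd-derived
# ranking of designs and an expert panel's ranking. All data is hypothetical.

def ranks(scores):
    """Assign each score its rank (1 = highest), averaging ranks over ties."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    r = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied scores.
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(a, b):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    ra, rb = ranks(a), ranks(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

# Hypothetical quality scores for six candidate designs.
crowd_cdec = [4.2, 3.8, 4.9, 2.1, 3.0, 4.5]
expert_panel = [4.0, 3.5, 4.8, 2.9, 2.5, 4.6]
print(round(spearman(crowd_cdec, expert_panel), 3))  # → 0.943
```

A rho near 1 indicates that the crowd ordering closely matches the expert ordering, which is the kind of agreement the paper reports for its cDEC approach.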
doi:10.1016/j.aei.2015.09.005