Crowdsourcing

Beyond citizen science, the Crowdsourcing project investigates the general mechanisms of task design, incentives, and user motivation that make online crowdsourcing effective. In particular, work within this project has examined how gamification can make paid crowdsourcing both more effective and more enjoyable for the workers carrying out tasks.
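
To make the incentive mechanisms concrete, the sketch below shows one way an adaptive furtherance incentive could be wired up: after each microtask, a worker's accuracy on gold-standard questions determines whether to surface a paid bonus offer or a lighter gamified nudge. This is a minimal illustration under stated assumptions; the names and thresholds are placeholders, not values taken from the publications listed below.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkerSession:
    """Running state for one crowd worker (illustrative, not from the papers)."""
    tasks_done: int = 0
    gold_correct: int = 0
    gold_seen: int = 0

    @property
    def accuracy(self) -> float:
        # Accuracy on embedded gold-standard questions; 0.0 before any signal.
        return self.gold_correct / self.gold_seen if self.gold_seen else 0.0

def furtherance_offer(session: WorkerSession,
                      min_tasks: int = 10,
                      accuracy_threshold: float = 0.8) -> Optional[str]:
    """Pick a furtherance incentive to show after a completed task.

    Productive, accurate workers are offered a paid bonus to keep going;
    accurate but less prolific workers get a social/gamified nudge.
    Thresholds are arbitrary placeholders.
    """
    if session.gold_seen == 0:
        return None  # no quality signal yet
    if session.accuracy >= accuracy_threshold and session.tasks_done >= min_tasks:
        return "bonus: finish 5 more tasks for extra pay"
    if session.accuracy >= accuracy_threshold:
        return "badge: you are among the most accurate contributors"
    return None
```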

This work has shown how incentives based on performance feedback, task rewards, and social comparison can be varied to influence worker retention. A second line of work has applied crowdsourcing to data management and curation, demonstrating that changes to crowd workers' workflows improve the quality of crowd-generated data classifications, as in our case study of DBpedia.
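
As an illustration of the data-curation line of work, the following minimal sketch aggregates redundant crowd judgments on candidate facts (for instance, DBpedia triples labelled 'correct' or 'incorrect') by majority vote, leaving under-voted or contested items undecided. The function, thresholds, and sample identifiers are hypothetical; the papers below describe the actual workflows and quality measures used.

```python
from collections import Counter
from typing import Dict, Hashable, Iterable, Optional, Tuple

def aggregate_majority(judgments: Iterable[Tuple[Hashable, str]],
                       min_votes: int = 3) -> Dict[Hashable, Optional[str]]:
    """Aggregate redundant crowd judgments per item by majority vote.

    `judgments` is an iterable of (item_id, label) pairs collected from
    different workers. Items with fewer than `min_votes` judgments, or
    without a strict majority, are marked undecided (None).
    """
    votes: Dict[Hashable, Counter] = {}
    for item_id, label in judgments:
        votes.setdefault(item_id, Counter())[label] += 1

    decisions: Dict[Hashable, Optional[str]] = {}
    for item_id, counter in votes.items():
        total = sum(counter.values())
        label, count = counter.most_common(1)[0]
        # demand enough redundancy and a strict majority before deciding
        decisions[item_id] = label if total >= min_votes and count > total / 2 else None
    return decisions

# Example: three workers judge one triple; a second triple is under-voted
sample = [
    ("dbpedia:Berlin#populationTotal", "correct"),
    ("dbpedia:Berlin#populationTotal", "correct"),
    ("dbpedia:Berlin#populationTotal", "incorrect"),
    ("dbpedia:Paris#areaKm2", "incorrect"),
]
print(aggregate_majority(sample))
# {'dbpedia:Berlin#populationTotal': 'correct', 'dbpedia:Paris#areaKm2': None}
```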

Publications
Pagliari, C., & Vijaykumar S. (2016).  Digital Participatory Surveillance and the Zika Crisis: Opportunities and Caveats.
Feyisetan, O., & Simperl E. (2016).  Please Stay vs Let's Play: Social Pressure Incentives in Paid Collaborative Crowdsourcing. Lecture Notes in Computer Science. 405–412.
Feyisetan, O., Simperl E., Van Kleek M., & Shadbolt N. (2015).  Improving Paid Microtasks Through Gamification and Adaptive Furtherance Incentives. Proceedings of the 24th International Conference on World Wide Web. 333–343.
Luczak-Roesch, M., Tinati R., & Shadbolt N. (2015).  When Resources Collide: Towards a Theory of Coincidence in Information Spaces. Proceedings of the 24th International Conference on World Wide Web. 1137–1142.
Luczak-Roesch, M., Tinati R., Van Kleek M., & Shadbolt N. (2015).  From Coincidence to Purposeful Flow? Properties of Transcendental Information Cascades. Proceedings of the 2015 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining 2015. 633–638.
Dragan, L., Luczak-Roesch M., Simperl E., Packer H. S., & Moreau L. (2015).  A-posteriori Provenance-enabled Linking of Publications and Datasets Via Crowdsourcing. D-Lib Magazine. 21.
Pagliari, C. (2015).  Digital Support for Nepal: The Human Face of Big Data.
Dragan, L., Luczak-Roesch M., Simperl E., Berendt B., & Moreau L. (2014).  Crowdsourcing data citation graphs using provenance. Provenance Analytics (ProvAnalytics2014).
Evans, M. Byrne, O'Hara K., Tiropanis T., & Webber C. (2013).  Crime Applications and Social Machines: Crowdsourcing Sensitive Data. Proceedings of the 22nd International Conference on World Wide Web. 891–896.
Hare, J., Lewis P. H., Acosta M., Weston A., Dupplaw D., Simperl E., et al. (2013).  An investigation of techniques that aim to improve the quality of labels provided by the crowd.
Acosta, M., Zaveri A., Simperl E., Kontokostas D., Auer S., & Lehmann J. (2013).  Crowdsourcing Linked Data Quality Assessment. Proceedings of the 12th International Semantic Web Conference - Part II. 260–276.
Acosta, M., Aroyo L., Bernstein A., Lehmann J., Noy N. F., & Simperl E. (2013).  CrowdSem 2013: Crowdsourcing the Semantic Web.
Sarasua, C., Simperl E., & Noy N. F. (2012).  CrowdMap: Crowdsourcing Ontology Alignment with Microtasks. (Cudré-Mauroux, P., Heflin J., Sirin E., Tudorache T., Euzenat J., Hauswirth M., et al., Ed.).The Semantic Web – ISWC 2012: 11th International Semantic Web Conference, Boston, MA, USA, November 11-15, 2012, Proceedings, Part I. 525–541.
SOCIAM. (2010).  Crowdsourcing provenance of distributed scholarly contributions - An experiment in the context of USEWOD.