Creating Experts From the Crowd: Techniques for Finding Workers for Difficult Tasks
Title | Creating Experts From the Crowd: Techniques for Finding Workers for Difficult Tasks
Publication Type | Journal Article |
Year of Publication | 2014 |
Authors | Gottlieb, L., Friedland, G., Choi, J., Kelm, P., & Sikora, T.
Published in | IEEE Transactions on Multimedia |
Volume | 16 |
Issue | 7 |
Page(s) | 2075-2079 |
Other Numbers | 3745 |
Abstract | Crowdsourcing is currently used for a range of applications, either by exploiting unsolicited user-generated content, such as spontaneously annotated images, or by using explicit crowdsourcing platforms such as Amazon Mechanical Turk to mass-outsource artificial-intelligence-type jobs. However, crowdsourcing is most often seen as the best option for tasks that demand no more of workers than basic human intuition. This article describes our methods for identifying workers for crowdsourced tasks that are difficult for both machines and humans. It discusses the challenges we encountered in qualifying annotators and the steps we took to select the individuals most likely to do well at these tasks.
Acknowledgment | This work was partially supported by funding provided through National Science Foundation EAGER grant IIS-1128599, as well as a KFAS Doctoral Study Abroad Fellowship. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors or originators and do not necessarily reflect the views of the National Science Foundation.
Bibliographic Notes | IEEE Transactions on Multimedia, Vol. 16, Issue 7, pp. 2075-2079 |
Abbreviated Authors | L. Gottlieb, G. Friedland, J. Choi, P. Kelm, and T. Sikora |
ICSI Research Group | Audio and Multimedia |
ICSI Publication Type | Article in journal or magazine |