Local Crowdsourcing for Annotating Audio: the Elevator Annotator platform
DOI: https://doi.org/10.15346/hc.v6i1.1

Keywords: Crowdsourcing, local crowdsourcing, audio annotation

Abstract
Crowdsourcing and other human computation techniques have proven useful for collecting large numbers of annotations for various datasets. In the majority of cases, online platforms are used to run crowdsourcing campaigns. Local crowdsourcing is a variant in which annotation is done at specific physical locations. This paper describes a local crowdsourcing concept, platform and experiment. The case setting concerns eliciting annotations for an audio archive. For the experiment, we developed a hardware platform designed to be deployed in building elevators. To evaluate the effectiveness of the platform and to test the influence of location on the annotation results, an experiment was set up in two different locations. In each location, two different user interaction modalities were used. The results show that our simple local crowdsourcing setup is able to achieve acceptable accuracy levels with up to 4 annotations per hour, and that the location has a significant effect on accuracy.
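The abstract does not specify how the collected annotations were scored. As a minimal sketch only, assuming votes are aggregated per audio fragment by majority and compared against gold-standard labels, the evaluation could look like the following; the function names, data layout, and labels are hypothetical illustrations, not the authors' implementation.

```python
from collections import Counter

# Hypothetical sketch: aggregate per-fragment crowd votes by majority vote
# and score them against gold-standard labels. The data and identifiers
# below are invented for illustration; they are not taken from the paper.

def majority_vote(votes):
    """Return the most frequent label among the votes for one fragment."""
    return Counter(votes).most_common(1)[0][0]

def accuracy(crowd_votes, gold_labels):
    """Fraction of fragments whose majority label matches the gold label."""
    correct = sum(
        majority_vote(votes) == gold_labels[fragment_id]
        for fragment_id, votes in crowd_votes.items()
    )
    return correct / len(crowd_votes)

# Example: votes collected at one elevator location (hypothetical data).
crowd_votes = {
    "fragment_01": ["speech", "speech", "music"],
    "fragment_02": ["music", "music", "music"],
}
gold_labels = {"fragment_01": "speech", "fragment_02": "music"}

print(accuracy(crowd_votes, gold_labels))  # -> 1.0
```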