The Life Course: An interdisciplinary framework for broadening the scope of research on crowdwork
Keywords: crowdwork, methodology, scoping review, life course perspective, online freelancing, microwork
Abstract: This paper reports the outcomes of a systematic scoping review of methodological approaches and analytical lenses used in empirical research on crowdwork. Over the past decade, a growing corpus of publications spanning the social sciences and computer science/HCI has empirically examined the nature of work practices and tasks within crowdwork; surfaced key individual and environmental factors underpinning workers’ decisions to engage in this form of work; developed and implemented tools to improve and extend various aspects of crowdwork, such as the design and allocation of tasks and incentives or workflows within the platforms; and contributed new techniques and know-how on data collection within crowdwork, for example, how to conduct large-scale surveys and experiments in behavioural psychology, economics or education drawing on crowdworker samples. Our initial reading of the crowdwork literature suggested that research had relied on a limited set of relatively narrow methodological approaches, mostly online experiments, surveys and interviews. Importantly, crowdwork research has tended to examine workers’ experiences as snapshots in time rather than studying them longitudinally or contextualising them historically, environmentally and developmentally. This piecemeal approach has given the research community initial descriptions and interpretations of crowdwork practices and provided an important starting point in a nascent field of study. However, the depth of research in the various areas, and the missing pieces, have yet to be systematically scoped out. Therefore, this paper systematically reviews the analytical-methodological approaches used in crowdwork research, identifying gaps in these approaches. We argue that to take crowdwork research to the next level it is essential to examine crowdwork practices within the context of both the individual and the historical-environmental factors shaping them.
To this end, methodological approaches that bridge sociological, psychological, individual, collective, online, offline, and temporal processes and practices of crowdwork are needed. The paper proposes the Life Course perspective as an interdisciplinary framework that can help address these gaps and advance research on crowdwork. The paper concludes by proposing a set of Life Course-inspired research questions to guide future studies of crowdwork.
Copyright (c) 2021 Human Computation
This work is licensed under a Creative Commons Attribution 4.0 International License.