Disentangling the Effects of Social Signals

Tad Hogg, Kristina Lerman

Abstract


Peer recommendation is a crowdsourcing task that leverages the opinions of many to identify interesting content online, such as news, images, or videos. Peer recommendation applications often use social signals, e.g., the number of prior recommendations, to guide people to the more interesting content. How people react to social signals, in combination with content quality and its presentation order, determines the outcomes of peer recommendation, i.e., item popularity. Using Amazon Mechanical Turk, we experimentally measure the effects of social signals in peer recommendation. Specifically, after controlling for variation due to item content and its position, we find that social signals affect item popularity about half as much as position and content do. These effects are somewhat correlated, so social signals exacerbate the "rich get richer" phenomenon, which results in a wider variance of popularity. Further, social signals change individual preferences, creating a "herding" effect that biases people's judgments about the content. Despite this, we find that social signals improve the efficiency of peer recommendation by reducing the effort devoted to evaluating content while maintaining recommendation quality.


Keywords


social influence, crowdsourcing, peer recommendation



