Designing Internet Research (Spring 2025)/Additional topics
Digital Trace and Sensor Data
Required Readings:
- Müller, Jörg, Sergi Fàbregues, Elisabeth Anna Guenther, and María José Romano. 2019. “Using Sensors in Organizational Research—Clarifying Rationales and Validation Challenges for Mixed Methods.” Frontiers in Psychology 10. https://www.frontiersin.org/article/10.3389/fpsyg.2019.01188. [Available free online]
- Eagle, Nathan. 2011. “Mobile Phones as Sensors for Social Research.” In The Handbook of Emergent Technologies in Social Research, 492–521. Oxford, UK: Oxford University Press. [Available in Canvas]
- Jiang, Jie, Riccardo Pozza, Nigel Gilbert, and Klaus Moessner. 2020. “MakeSense: An IoT Testbed for Social Research of Indoor Activities.” ACM Transactions on Internet of Things 1 (3): 17:1-17:25. https://doi.org/10.1145/3381914. [Available free online]
- Note: I'm including this mostly as a useful example of a home/IoT-based approach to sensors. There's a bunch of technical detail on the system here, but please skip/skim that detail.
- Greshake Tzovaras, Bastian, Misha Angrist, Kevin Arvai, Mairi Dulaney, Vero Estrada-Galiñanes, Beau Gunderson, Tim Head, et al. 2019. “Open Humans: A Platform for Participant-Centered Research and Personal Data Exploration.” GigaScience 8 (6): giz076. https://doi.org/10.1093/gigascience/giz076. [Available free online]
Optional:
- Blumenstock, Joshua, Gabriel Cadamuro, and Robert On. 2015. “Predicting Poverty and Wealth from Mobile Phone Metadata.” Science 350 (6264): 1073–76. https://doi.org/10.1126/science.aac4420. [Available from UW libraries]
- Menchen-Trevino, Ericka. 2018. “Digital Trace Data and Social Research: A Proactive Research Ethics.” Edited by Brooke Foucault Welles and Sandra González-Bailón. In The Oxford Handbook of Networked Communication. Oxford, UK: Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190460518.013.25. [Available from UW libraries]
- Rom, Adina, Isabel Günther, and Yael Borofsky. 2020. “Using Sensors to Measure Technology Adoption in the Social Sciences.” Development Engineering 5 (January): 100056. https://doi.org/10.1016/j.deveng.2020.100056. [Available free online]
- Steele, Jessica E., Pål Roe Sundsøy, Carla Pezzulo, Victor A. Alegana, Tomas J. Bird, Joshua Blumenstock, Johannes Bjelland, et al. 2017. “Mapping Poverty Using Mobile Phone and Satellite Data.” Journal of The Royal Society Interface 14 (127): 20160690. https://doi.org/10.1098/rsif.2016.0690. [Available free online]
- Struminskaya, Bella, Peter Lugtig, Florian Keusch, and Jan Karem Höhne. 2020. “Augmenting Surveys with Data from Sensors and Apps: Opportunities and Challenges.” Social Science Computer Review, December, 0894439320979951. https://doi.org/10.1177/0894439320979951. [Available free online]
- Wiebe, Douglas J., and Chistopher N. Morrison. 2018. “Digital Mapping of Urban Mobility Patterns.” Edited by Brooke Foucault Welles and Sandra González-Bailón. In The Oxford Handbook of Networked Communication. Oxford, UK: Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190460518.013.25. [Available from UW libraries]
Crowdsourcing, Digital Labor Markets, and Human Computation [Tentative]
- Note: I've marked items as [Required] below because I thought it made more sense to keep the topical groups of articles below intact.
MTurk documentation and guidelines:
- [Required] Amazon Mechanical Turk Requester UI Guide — Skim, but make sure you're ready to submit tasks.
- [Required] Amazon Mechanical Turk Best Practices Guide — Skim, but make sure you're ready to submit tasks.
- [Required] Shaw, Aaron. 2015. “Hired Hands and Dubious Guesses: Adventures in Crowdsourced Data Collection.” In Digital Research Confidential: The Secrets of Studying Behavior Online, edited by Eszter Hargittai and Christian Sandvig. The MIT Press. [Forthcoming]
- [Required] Tutorials posted on the MTurk blog — Skim and browse, paying attention to tutorials similar to what you'd like to do in the class session.
- [Required] Guidelines for Academic Requesters and Basics of How to be a Good Requester from We Are Dynamo — These sets of guidelines were created by Turkers as part of an effort to organize collective action among Turkers, led by Niloufar Salehi and described in the paper below.
- Mason, Winter, and Siddharth Suri. 2011. “Conducting Behavioral Research on Amazon’s Mechanical Turk.” Behavior Research Methods 44 (1): 1–23. https://doi.org/10.3758/s13428-011-0124-6. [Available from UW libraries] — Dated but still somewhat useful.
Overviews of MTurk and issues of data quality:
- Horton, John J., David G. Rand, and Richard J. Zeckhauser. 2011. “The Online Laboratory: Conducting Experiments in a Real Labor Market.” Experimental Economics 14 (3): 399–425. https://doi.org/10.1007/s10683-011-9273-9. [Available from UW libraries]
- Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. 2011. “Amazon’s Mechanical Turk: A New Source of Inexpensive, yet High-Quality, Data?” Perspectives on Psychological Science, February. https://doi.org/10.1177/1745691610393980. [Available from UW libraries]
- Casler, Krista, Lydia Bickel, and Elizabeth Hackett. 2013. “Separate but Equal? A Comparison of Participants and Data Gathered via Amazon’s MTurk, Social Media, and Face-to-Face Behavioral Testing.” Computers in Human Behavior 29 (6): 2156–60. https://doi.org/10.1016/j.chb.2013.05.009. [Available from UW libraries]
- [Required] Weinberg, Jill, Jeremy Freese, and David McElhattan. 2014. “Comparing Data Characteristics and Results of an Online Factorial Survey between a Population-Based and a Crowdsource-Recruited Sample.” Sociological Science 1: 292–310. https://doi.org/10.15195/v1.a19. [Available free online]
- Kees, Jeremy, Christopher Berry, Scot Burton, and Kim Sheehan. 2017. “An Analysis of Data Quality: Professional Panels, Student Subject Pools, and Amazon’s Mechanical Turk.” Journal of Advertising 46 (1): 141–55. https://doi.org/10.1080/00913367.2016.1269304. [Available from UW libraries]
- [Required] Kennedy, Ryan, Scott Clifford, Tyler Burleigh, Ryan Jewell, and Philip Waggoner. 2018. “The Shape of and Solutions to the MTurk Quality Crisis.” SSRN Scholarly Paper ID 3272468. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3272468. [Available free online]
Culture and work conditions for Turkers:
- Irani, Lilly. 2015. “The Cultural Work of Microwork.” New Media & Society 17 (5): 720–39. https://doi.org/10.1177/1461444813511926. [Available from UW libraries]
- Kittur, Aniket, Jeffrey V. Nickerson, Michael Bernstein, Elizabeth Gerber, Aaron Shaw, John Zimmerman, Matt Lease, and John Horton. 2013. “The Future of Crowd Work.” In Proceedings of the 2013 Conference on Computer Supported Cooperative Work, 1301–1318. CSCW ’13. San Antonio, Texas, USA: Association for Computing Machinery. https://doi.org/10.1145/2441776.2441923. [Available from UW libraries] [Available free online]
- Gray, Mary L., Siddharth Suri, Syed Shoaib Ali, and Deepti Kulkarni. 2016. “The Crowd Is a Collaborative Network.” In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 134–147. CSCW ’16. San Francisco, California, USA: Association for Computing Machinery. https://doi.org/10.1145/2818048.2819942. [Available from UW libraries]
- [Required] Semuels, Alana. 2018. “The Internet Is Enabling a New Kind of Poorly Paid Hell.” The Atlantic. January 23, 2018. https://www.theatlantic.com/business/archive/2018/01/amazon-mechanical-turk/551192/. [Available free online]
Systems to improve Turker experiences:
- Salehi, Niloufar, Lilly C. Irani, Michael S. Bernstein, Ali Alkhatib, Eva Ogbe, Kristy Milland, and Clickhappier. 2015. “We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers.” In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 1621–1630. CHI ’15. Seoul, Republic of Korea: Association for Computing Machinery. https://doi.org/10.1145/2702123.2702508. [Available from UW libraries]
- Irani, Lilly C., and M. Six Silberman. 2013. “Turkopticon: Interrupting Worker Invisibility in Amazon Mechanical Turk.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 611–620. CHI ’13. Paris, France: Association for Computing Machinery. https://doi.org/10.1145/2470654.2470742. [Available from UW libraries]
Assignments to complete before class:
The first task is to complete some tasks as a crowd worker:
- If you are a US citizen: Sign up for a worker account on MTurk, then find and complete at least 5 "HITs" as a worker on Amazon Mechanical Turk.
- If you are not a US citizen, or if you cannot sign up on MTurk for some other reason: Complete at least 3–4 classification tasks in at least 2 different Zooniverse projects of your choice. Also, complete at least one "study" in LabintheWild.
- In either case: Record (write down) details and notes about your tasks: What did you do? Who was the requester? What was the purpose of the task (as best you could tell)? What was the experience like? What research applications can you (not) imagine for this kind of system?
The second task is to get ready to launch a task as a requester. We will design and launch tasks in class, but I want you to do the following ahead of time:
- Create a "requester" account on Amazon Mechanical Turk. Approval may take up to 48 hours, so please do this immediately so your account is ready to go in class.
- Put money onto your requester account to pay workers. A $5 budget should be sufficient for our class. Amazon accepts standard payment methods.
- Think of at least one small classification or coding task (e.g., of Tweets, images, etc.) and one human subjects data collection task (e.g., a survey, a survey experiment, etc.) that you would like to run. You will have a budget of $5 to run the task!
- If running this task will involve some data (e.g., a set of images or URLs, a set of Tweets, etc), collect that material in a spreadsheet before class. If it will involve a survey, create your survey in a Google Form and/or a Survey Monkey or Qualtrics survey before class.
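To give you a sense of what launching a task looks like programmatically, here is a minimal sketch of preparing a HIT for MTurk's API. The survey URL, title, and reward below are hypothetical placeholders; actually submitting requires the boto3 library and AWS credentials tied to your requester account, which we'll set up in class.

```python
# Hedged sketch: assembling the parameters for an MTurk HIT that embeds an
# external survey. All values here are hypothetical examples.
from xml.sax.saxutils import escape

SURVEY_URL = "https://example.com/my-survey"  # hypothetical: your Google Form / Qualtrics link

def external_question_xml(url, frame_height=600):
    """Build the ExternalQuestion XML MTurk uses to show an external page to workers."""
    return (
        '<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">'
        f"<ExternalURL>{escape(url)}</ExternalURL>"
        f"<FrameHeight>{frame_height}</FrameHeight>"
        "</ExternalQuestion>"
    )

hit_params = {
    "Title": "Short classification task (class exercise)",
    "Description": "Answer a brief survey for a research methods class.",
    "Reward": "0.25",                        # USD per assignment; $5 budget = 20 assignments
    "MaxAssignments": 20,
    "LifetimeInSeconds": 60 * 60 * 24,       # HIT stays visible for one day
    "AssignmentDurationInSeconds": 60 * 10,  # workers get 10 minutes per assignment
    "Question": external_question_xml(SURVEY_URL),
}

# To actually launch (requires AWS credentials; use the sandbox endpoint to test):
#   import boto3
#   mturk = boto3.client("mturk",
#       endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com")
#   response = mturk.create_hit(**hit_params)
```

Note the arithmetic: the reward times MaxAssignments is what you will spend, so check it against your $5 budget before launching.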
Hyperlink Networks [Tentative]
- Park, Han Woo. 2003. “Hyperlink Network Analysis: A New Method for the Study of Social Structure on the Web.” Connections 25 (1): 49–61. [Available through Canvas]
- González-Bailón, Sandra. 2009. “Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks 31 (4): 271–80. https://doi.org/10.1016/j.socnet.2009.07.003. [Available from UW libraries]
- [Example] Elgin, Dallas J. 2015. “Utilizing Hyperlink Network Analysis to Examine Climate Change Supporters and Opponents.” Review of Policy Research 32 (2): 226–45. https://doi.org/10.1111/ropr.12118. [Available from UW libraries]
Optional readings:
- Jackson, Michele H. 1997. “Assessing the Structure of Communication on the World Wide Web.” Journal of Computer-Mediated Communication 3 (1). https://doi.org/10.1111/j.1083-6101.1997.tb00063.x. [Available free online]
- Ackland, Robert. 2016. “WWW Hyperlink Networks.” Lecture Slides presented at the SOCR8006 Online Research Methods, Canberra, Australia. http://vosonlab.net/papers/ACSPRISummer2017/Lecture_Hyperlink_Networks.pdf. [Available free online]
- Lusher, Dean, and Robert Ackland. 2011. “A Relational Hyperlink Analysis of an Online Social Movement.” Journal of Social Software 12 (5). https://www.cmu.edu/joss/content/articles/volume12/Lusher/. [Available free online]
- [Example] Shumate, Michelle, and Lori Dewitt. 2008. “The North/South Divide in NGO Hyperlink Networks.” Journal of Computer-Mediated Communication 13 (2): 405–28. https://doi.org/10.1111/j.1083-6101.2008.00402.x. [Available free online]
Tools for collecting hyperlink network data:
- Issue Crawler — network mapping software by the Govcom.org Foundation (Amsterdam), developed in a group run by Richard Rogers
- Virtual Observatory for the Study of Online Networks (VOSON) — "web-based software incorporating web mining, data visualisation, and traditional empirical social science methods (e.g. social network analysis, SNA). Text analysis, dataset manipulation and visualisation, and social network analysis (SNA) are available within an integrated environment."
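Beyond dedicated tools like these, the core data-collection step is simple enough to sketch yourself: fetch pages, extract their outbound links, and record (source, target) edges. The sketch below uses only the Python standard library and hardcodes two hypothetical pages for illustration; in practice you would fetch real HTML (e.g., with urllib or requests) and respect robots.txt.

```python
# Hedged sketch: building a hyperlink network edge list from HTML pages.
# Page contents and URLs below are toy examples, not real sites.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags as the parser walks the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def edges_from_pages(pages):
    """pages: dict mapping a source URL to its HTML text.
    Returns a list of (source, target) directed edges."""
    edges = []
    for source, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        edges.extend((source, target) for target in parser.links)
    return edges

# Toy two-page "web":
pages = {
    "https://a.example": '<a href="https://b.example">B</a> <a href="https://c.example">C</a>',
    "https://b.example": '<a href="https://a.example">A</a>',
}
edges = edges_from_pages(pages)
```

The resulting edge list can be loaded directly into standard network analysis tools (e.g., igraph, networkx, or Gephi) for the kinds of hyperlink network analyses discussed in the readings above.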