Designing Internet Research (Spring 2025)/Additional topics

Digital Trace and Sensor Data

Required Readings:

  • Müller, Jörg, Sergi Fàbregues, Elisabeth Anna Guenther, and María José Romano. 2019. “Using Sensors in Organizational Research—Clarifying Rationales and Validation Challenges for Mixed Methods.” Frontiers in Psychology 10. https://www.frontiersin.org/article/10.3389/fpsyg.2019.01188. [Available free online]
  • Eagle, Nathan. 2011. “Mobile Phones as Sensors for Social Research.” In The Handbook of Emergent Technologies in Social Research, 492–521. Oxford, UK: Oxford University Press. [Available in Canvas]
  • Jiang, Jie, Riccardo Pozza, Nigel Gilbert, and Klaus Moessner. 2020. “MakeSense: An IoT Testbed for Social Research of Indoor Activities.” ACM Transactions on Internet of Things 1 (3): 17:1-17:25. https://doi.org/10.1145/3381914. [Available free online]
Note: I'm mostly including this as a useful example of a home/IoT-based approach to sensors. There's a lot of technical detail on the system itself, so please skip or skim that detail.
  • Greshake Tzovaras, Bastian, Misha Angrist, Kevin Arvai, Mairi Dulaney, Vero Estrada-Galiñanes, Beau Gunderson, Tim Head, et al. 2019. “Open Humans: A Platform for Participant-Centered Research and Personal Data Exploration.” GigaScience 8 (6): giz076. https://doi.org/10.1093/gigascience/giz076. [Available free online]

Optional:

Crowdsourcing, Digital Labor Markets, and Human Computation [Tentative]

Note: I've marked individual items below as [Required] where they are required, because I thought it made more sense to keep the topical groups of articles intact.

MTurk documentation and guidelines:

Overviews of MTurk and issues of data quality:

Culture and work conditions for Turkers:

Systems to improve Turker experiences:

  • Salehi, Niloufar, Lilly C. Irani, Michael S. Bernstein, Ali Alkhatib, Eva Ogbe, Kristy Milland, and Clickhappier. 2015. “We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers.” In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 1621–1630. CHI ’15. Seoul, Republic of Korea: Association for Computing Machinery. https://doi.org/10.1145/2702123.2702508. [Available from UW libraries]
  • Irani, Lilly C., and M. Six Silberman. 2013. “Turkopticon: Interrupting Worker Invisibility in Amazon Mechanical Turk.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 611–620. CHI ’13. Paris, France: Association for Computing Machinery. https://doi.org/10.1145/2470654.2470742. [Available from UW libraries]

Assignments to complete before class:

The first task is to complete work as a crowd worker:

  • If you are a US citizen: Create a worker account on Amazon Mechanical Turk (MTurk), then find and complete at least 5 "HITs" as a worker.
  • If you are not a US citizen, or if you cannot sign up on MTurk for some other reason: Complete at least 3-4 classification tasks in at least 2 different Zooniverse projects of your choice. Also complete at least one "study" on Lab in the Wild.
  • In either case: Record (write down) details and notes about your tasks: What did you do? Who was the requester? What was the purpose of the task (as best you could tell)? What was the experience like? What research applications can you (not) imagine for this kind of system?

The second task is to get ready to launch a task as a requester. We will design and launch tasks in class, but I want you to do the following ahead of time (for the curious, a sketch of creating a HIT programmatically follows this list):

  • Create a "requester" account on Amazon Mechanical Turk. Approval may take up to 48 hours, so please do this immediately so that your account is ready to go in class.
  • Put money into your requester account to pay workers. A $5 budget should be sufficient for our class. Any payment method that Amazon accepts should work.
  • Think of at least one small classification or coding task (e.g., of Tweets, images, etc.) and one human subjects data collection task (e.g., a survey, a survey experiment, etc.) that you would like to run. You will have a budget of $5 to run the task!
  • If running this task will involve some data (e.g., a set of images or URLs, a set of Tweets, etc.), collect that material in a spreadsheet before class. If it will involve a survey, create your survey in Google Forms, SurveyMonkey, or Qualtrics before class.
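
For those who are curious, the sketch below shows roughly how the same requester steps can be done through the MTurk API rather than the web interface. This is a minimal sketch, not part of the assignment: it assumes the boto3 package and AWS credentials linked to your requester account, it points at the requester sandbox so no real money is spent, and the survey URL and HIT parameters are placeholders.

  import boto3

  # Point at the requester *sandbox* so that no real payments are made.
  SANDBOX = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"
  mturk = boto3.client("mturk", region_name="us-east-1", endpoint_url=SANDBOX)

  print("Sandbox balance:", mturk.get_account_balance()["AvailableBalance"])

  # An ExternalQuestion sends workers to a survey you host elsewhere
  # (e.g., Qualtrics). The URL below is a placeholder.
  question = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
    <ExternalURL>https://example.com/my-survey</ExternalURL>
    <FrameHeight>600</FrameHeight>
  </ExternalQuestion>"""

  hit = mturk.create_hit(
      Title="Short classification task (class exercise)",
      Description="Answer a few questions about an image.",
      Keywords="survey, classification",
      Reward="0.25",                    # dollars per assignment, passed as a string
      MaxAssignments=5,
      LifetimeInSeconds=60 * 60 * 24,   # visible to workers for one day
      AssignmentDurationInSeconds=600,  # ten minutes to finish each assignment
      Question=question,
  )
  print("Created HIT:", hit["HIT"]["HITId"])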

Hyperlink Networks [Tentative]

Optional readings:

Tools for collecting hyperlink network data (a minimal do-it-yourself sketch follows this list):

  • Issue Crawler — network mapping software from the Govcom.org Foundation, Amsterdam, a group run by Richard Rogers
  • Virtual Observatory for the Study of Online Networks (VOSON) — "web-based software incorporating web mining, data visualisation, and traditional empirical social science methods (e.g. social network analysis, SNA). Text analysis, dataset manipulation and visualisation, and social network analysis (SNA) are available within an integrated environment."
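
For a rough sense of what these tools automate, here is a minimal do-it-yourself sketch of collecting hyperlink network data: fetch a few seed pages, extract their outgoing links, and store the result as a site-to-site network. It assumes the requests, beautifulsoup4, and networkx packages; the seed URLs are placeholders and real projects would crawl further and handle robots.txt, duplicates, and link types more carefully.

  import urllib.parse

  import networkx as nx
  import requests
  from bs4 import BeautifulSoup

  seeds = ["https://example.org", "https://example.com"]  # placeholder seed pages

  graph = nx.DiGraph()
  for seed in seeds:
      try:
          html = requests.get(seed, timeout=10).text
      except requests.RequestException:
          continue  # skip seeds that fail to load
      src_host = urllib.parse.urlparse(seed).netloc
      for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
          target = urllib.parse.urljoin(seed, anchor["href"])
          dst_host = urllib.parse.urlparse(target).netloc
          # Collapse pages to hostnames so ties run between sites, not pages.
          if dst_host and dst_host != src_host:
              graph.add_edge(src_host, dst_host)

  print(graph.number_of_nodes(), "sites;", graph.number_of_edges(), "hyperlink ties")
  nx.write_graphml(graph, "hyperlink_network.graphml")  # readable in Gephi and most SNA tools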