Designing Internet Research (Winter 2020)


 * Designing Internet Research
 * COM528 - Department of Communication, University of Washington
 * Instructor: Benjamin Mako Hill (University of Washington)
 * Course Websites:
 * We will use Canvas for announcements, turning in assignments, and discussion.
 * Everything else will be linked on this page.
 * Course Catalog Description:


 * Focuses on designing Internet research, assessing the adaptation of proven methods to Internet tools and environments, and developing new methods in view of particular capacities and characteristics of Internet applications. Legal and ethical aspects of Internet research receive ongoing consideration.

Overview and Learning Objectives
What new lines of inquiry and approaches to social research are made possible and necessary by the Internet? In what ways have established research methods been affected by the Internet? How does the Internet challenge established methods of social research? How are researchers responding to these challenges?

These are some of the key questions we will explore in this course. The course will focus on assessing the incorporation of Internet tools in established and emergent methods of social research, the adaptation of social research methods to study online phenomena, and the development of new methods and tools that correspond with the particular capacities and characteristics of the Internet. The readings will include both descriptions of Internet-related research methods, with an eye to introducing skills, and examples of studies that use those methods. The legal and ethical aspects of Internet research will receive ongoing consideration throughout the course. The purpose of this course is to help prepare students to design high quality research projects that use the Internet to study online communicative, social, cultural, and political phenomena.

I will consider the course a complete success if every student is able to do all of these things at the end of the quarter:


 * Discuss and compare distinct types of Internet research including: web archiving; textual analysis; ethnography; interviews; network analyses of social and hyperlink networks; analysis of digital trace data; traditional, natural, and field experiments; design research; survey research; and narrative and visual analyses.
 * Describe particular challenges and threats to research validity associated with each method.
 * For at least one method, be able to provide a detailed description of a research project and feel comfortable embarking on a formative study using this methodology.
 * Given a manuscript (e.g., in the context of a request for peer review), be able to evaluate an Internet-based study in terms of its methodological choices.
 * Use a modern programming language (like Python) to collect a dataset from a web API like those published by Twitter, Reddit, or Wikipedia.
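To give a taste of what the last objective involves, here is a minimal sketch of constructing a query URL for the Wikipedia web API using only Python's standard library. The parameter choices are illustrative only; consult the API's documentation when designing a real data collection.

```python
from urllib.parse import urlencode

# Endpoint of the Wikipedia web API; other APIs follow the same
# pattern of an endpoint plus query-string parameters.
endpoint = "https://en.wikipedia.org/w/api.php"

# Illustrative parameters: ask for revision data about one article.
params = {
    "action": "query",
    "titles": "Internet_research",
    "prop": "revisions",
    "format": "json",
}

# Build the full request URL by encoding the parameters.
url = endpoint + "?" + urlencode(params)
print(url)
```

Fetching a URL like this (e.g., with urllib.request.urlopen) returns JSON that can then be parsed and analyzed.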

Note About This Syllabus
You should expect this syllabus to be a dynamic document. Although the core expectations for this class are fixed, the details of readings and assignments will shift based on how the class goes, guest speakers that I arrange, my own readings in this area, etc. As a result, there are three important things to keep in mind:


 * Although details on this syllabus will change, I will never change readings fewer than six days before they are due. I will send an announcement no later than each Wednesday evening that fixes the schedule for the next week. This means that if I don't fill in a reading marked "" six days before it's due, it is dropped. If we don't change something marked "" before the deadline, then it is assigned. It also means that if you plan to read more than six days ahead, you should contact the teaching team first.
 * Because this syllabus is a wiki, you will be able to track every change by clicking the history button on this page. I will summarize these changes in the weekly announcement on Canvas, which will be emailed to everybody in the class. Closely monitor your email or the announcements section of the course website on Canvas to make sure you don't miss these!
 * I will ask the class for voluntary anonymous feedback frequently — especially toward the beginning of the quarter. Please let me know what is working and what can be improved. In the past, I have made many adjustments to courses that I teach while the quarter progressed based on this feedback.

Books
This class has no textbook and I am not requiring you to buy any books for this class. That said, several required readings and many suggested readings will come from a handful of excellent books that you might want to consider adding to your library:


 * Burgess, Jean, Alice Marwick, and Thomas Poell, eds. 2018. The SAGE Handbook of Social Media. London, UK: SAGE. [Available through UW libraries]
 * Foucault Welles, Brooke, and Sandra González-Bailón, eds. 2018. The Oxford Handbook of Networked Communication. London, UK: Oxford University Press. [Available through UW libraries]
 * Hargittai, Eszter, and Christian Sandvig, eds. 2015. Digital Research Confidential: The Secrets of Studying Behavior Online. Cambridge, MA: MIT Press. [Available through UW libraries]
 * Hesse-Biber, Sharlene Nagy, ed. 2011. The Handbook of Emergent Technologies in Social Research. Oxford, UK: Oxford University Press.
 * Hewson, Claire, Carl Vogel, and Dianna Laurent. 2016. Internet Research Methods. London, UK: SAGE. https://doi.org/10.4135/9781473920804. [Available through UW libraries]
 * Snee, Helene, Christine Hine, Yvette Morey, Steven Roberts, and Hayley Watson, eds. 2016. Digital Methods for Social Science: An Interdisciplinary Guide to Research Innovation. New York, NY: Palgrave-Macmillan. [Available through UW libraries]

Technical Skills
Nearly all of our structured in-person meetings and all of our readings will focus on teaching conceptual skills related to Internet research: the "softer" skills of understanding, designing, and critiquing research plans. These are harder to teach, evaluate, and learn, but they are ultimately what make a research project interesting, useful, or valid. When the course has been taught in the past by other faculty, it has been entirely focused on these types of conceptual skills.

That said, I also believe that any skilled Internet researcher must be comfortable writing code to collect a dataset from the web or, at the very least, should have enough experience doing so to know what is involved and what is possible and impossible. This is essential even if your only goal is to manage somebody else who is writing code and gathering data or to work productively with a collaborator who is doing so. As a result, being successful in this class will also require technical skills.

Because students are going to come to the class with different technical skillsets, we will be devoting only a relatively small chunk of class time to developing technical skills. Instead, I'm requiring that students who do not already have these skills build them outside of our meetings together.

In particular, I want every student to have the following three things:


 * 1) Basic skills in a general purpose high-level programming language used for Internet-based data collection and analysis. I strongly recommend the Python programming language, although other programming languages like Ruby and Perl are also good choices. Generally speaking, statistical programming languages like R, Stata, and Matlab are not well suited for this.
 * 2) Familiarity with the technologies of web APIs. In particular, students should understand what APIs are, how they work, and should be able to read, interpret, and process data in JSON.
 * 3) Knowledge of how to process and move data from a website or API into a format that they will be able to use for analysis. The final format will depend on the nature of the project, but this might be a statistical programming environment like R, Stata, SAS, or SPSS, or a qualitative data analysis tool like ATLAS.ti, NVivo, or Dedoose.
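To make these three skills concrete, here is a minimal sketch in Python of the kind of pipeline they add up to: parsing a JSON string of the sort an API might return and writing the records out as a CSV file ready for import into R, SPSS, or a qualitative coding tool. The payload and field names below are invented for illustration; a real API's response structure will differ.

```python
import csv
import io
import json

# A made-up JSON payload in the shape many web APIs use:
# a top-level object wrapping a list of records.
raw = """
{"items": [
    {"user": "alice", "posts": 12, "joined": "2019-04-02"},
    {"user": "bob", "posts": 7, "joined": "2018-11-19"}
]}
"""

# Skill 2: parse JSON into native Python data structures.
data = json.loads(raw)
records = data["items"]

# Skill 3: flatten the records into a CSV that a statistical or
# qualitative analysis tool can import. (io.StringIO stands in
# for a real file on disk.)
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["user", "posts", "joined"])
writer.writeheader()
writer.writerows(records)

print(out.getvalue())
```

In a real project, the raw string would come from an HTTP request to an API (skill 1) rather than a literal in the script.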

If you are already comfortable doing these things, great.

If you are not yet comfortable, I am going to be organizing a free series of workshops called the Community Data Science Workshops in January and February, and I very strongly recommend that you attend them. The workshops will teach exactly the skills I'm expecting you to have, and attending the full series of workshops will be enough to fulfill this requirement.

The workshops will meet four times so please block these out on your calendar now:


 * 1) Friday January 17 6-9pm
 * 2) Saturday January 18 9:45am-4pm
 * 3) Saturday February 1 9:45am-4pm
 * 4) Saturday February 15 9:45am-4pm

I've offered these workshops four times previously and they have always been oversubscribed. As a result, you should register for these workshops immediately. You can find the registration link on this page. Please mention that you are in this class when you register so that we make sure to accept your application.

I have taught these workshops five times before in 2014, 2015, and 2016. If you have taken them in the past, you do not need to take them again. If you took them before but are feeling unsure about your skills, you will be welcome to come back to review and brush up on the material.

If you do not have the technical skills described above and you will not attend the workshops, you're going to be responsible for learning this material on your own. Although this is totally fine, I suspect it will present a major challenge to success in this class. If you will be in this situation, contact me before the quarter starts.

Assignments
The assignments in this class are designed to give you an opportunity to try your hand at using the conceptual material taught in the class. There will be no exams or quizzes. Unless otherwise noted, all assignments are due at the end of the day (i.e., 11:59pm on the day they are due).

Weekly Reflections

 * Deliverables: (1) Post a message in the appropriate course discussion board; (2) Respond to at least one of your classmates before class.
 * Due Date: (1) Every Monday (on a week with reading); (2) Every Tuesday at 1:30 (on a week with reading)
 * Maximum length: 1,000 words

For every week that we have readings (i.e., every week except for the consulting weeks and the final presentation weeks), I'm asking everybody to reflect on the readings by the day before class and to share their reflections with everybody else. Because we're skipping the first week, that works out to a total of six reflections.

Reflections should be no more than 1,000 words (about one single-spaced page). So that everyone will have a chance to read the reflections before class, they should be posted to our course website the day before (i.e., before midnight each Monday) so that we can all read them and construct responses. Please also pose one or two open-ended discussion questions that may serve as jumping-off points for our in-class conversation. Don't bother with summarizing (we've all done the reading, after all); focus on engaging with ideas.

In terms of content, reflections offer you an opportunity to engage the readings by identifying common or conflicting premises, thinking through potential implications, offering political or cultural examples, posing well-supported objections, or outlining critical extensions. In my experience, the most thought-provoking reflections go beyond pointing out things that one wonders about or finds interesting and explain why you find them interesting. Turn in your reflection by posting a new message under the appropriate day on the course discussion board in Canvas.

I'd also like everybody to read over everybody else's reflections and respond to at least one person. Evening things out so that not everybody responds to the same person would be nice, but use your judgment.

Research Project
As a demonstration of your learning in this course, you will design a plan for an Internet research project and will, if possible, also collect (at least) an initial sample of the dataset that you will use to complete the project.

The paper you produce can take one of the following three forms:


 * 1) A draft of a manuscript for submission to a conference or journal.
 * 2) A proposal for funding (e.g., for submission to the NSF for a graduate student fellowship).
 * 3) A draft of the methods chapter of your dissertation.

Whichever of the three paths you choose, I expect you to take this opportunity to produce a document that will further your academic career outside of the class.

Project Identification

 * Due Date: January 24
 * Maximum paper length: 800 words (~3 pages)
 * Deliverables: Turn in in the appropriate Canvas dropbox

Early on, I want you to identify your final project. Your proposal should be short and can be either paragraphs or bullets. It should include the following things:


 * The genre of the project and a short description of how it fits into your career trajectory.
 * A one paragraph abstract of the proposed study and research question, theory, community, and/or groups you plan to study.
 * A short description of the type of data you plan to collect as part of your final project.

Final Project

 * Outline Due Date: February 23
 * Maximum outline length: 2 pages
 * Presentation Date: March 10
 * Paper Due Date: March 20
 * Maximum paper length: 8,000 words (~27 pages)
 * All Deliverables: Turn in in the appropriate Canvas dropboxes

Because the emphasis in this class is on methods, and because I'm not an expert in each of your areas or fields, I'm happy to assume that your paper, proposal, or thesis chapter has already established the relevance and significance of your study and has a comprehensive literature review, a well-grounded conceptual approach, and a compelling reason why this research is important. Instead of providing all of these details, feel free to start with a brief summary of the purpose and importance of the research and an introduction of your research questions or hypotheses. If you provide more detail, that's fine, but I won't give you detailed feedback on those parts.

The final paper should include:


 * a statement of the purpose, central focus, relevance and significance of this research;
 * a description of the specific Internet application(s) and/or environment(s) and/or objects to be studied and employed in the research;
 * key research questions or hypotheses;
 * operationalization of key concepts;
 * a description and rationale of the specific method(s), (if more than one method will be used, explain how the methods will produce complementary findings);
 * a description of the step-by-step plan for data collection;
 * description and rationale of the level(s), unit(s) and process of analysis (if more than one kind of data are generated, explain how each kind will be analyzed individually and/or comparatively);
 * an explanation of how these analyses will enable you to answer the RQs;
 * a sample instrument (as appropriate);
 * a sample dataset and description of a formative analysis you have completed;
 * a description of actual or anticipated results and any potential problems with their interpretation;
 * a plan for publishing/disseminating the findings from this research
 * a summary of technical, ethical, human subjects and legal issues that may be encountered in this research, and how you will address them;
 * a schedule (using specific dates) and proposed budget.

I also expect each student to begin data collection for their project (i.e., using the technical skills you learn in the class) and to describe your progress in this regard in your paper. If collecting data for a proposed project is impractical (e.g., because of IRB applications, funding, etc.), I would love for you to engage in the collection of a public dataset as part of a pilot or formative study. If this is not feasible or useful, we can discuss other options.

I have a strong preference for you to write this paper individually but I'm open to the idea that you may want to work with others in the class.

Participation
The course relies heavily on participation and discussion. It is important to realize that we will not summarize the readings in class and I will not cover them in lecture. I expect you all to have done the reading, and we will jump in and start discussing it. The "Participation Rubric" section of my detailed page on assessment gives the rubric I will use in evaluating participation.

Assessment
I have put together a very detailed page that describes the way I approach assessment and grading—both in general and in this course. Please read it carefully. I will assign grades for each of the following items on the UW 4.0 grade scale according to the weights below:


 * Participation: 30%
 * Weekly Reflection: 15%
 * Project identification: 5%
 * Final paper outline: 5%
 * Final Presentation: 10%
 * Final Paper: 35%

Part I: Introduction and Framing
Required Readings:


 * Agre, Phil. 2004. “Internet Research: For and Against.” In Internet Research Annual: Selected Papers from the Association of Internet Researchers Conferences 2000-2002, edited by Mia Consalvo, Nancy Baym, Jeremy Hunsinger, Klaus Bruhn Jensen, John Logie, Monica Muerero, and Leslie Regan Shade. Vol. 1. New York: Peter Lang. http://polaris.gseis.ucla.edu/pagre/research.html. [Available free online]
 * Lazer, David, Alex Pentland, Lada Adamic, Sinan Aral, Albert-Laszlo Barabasi, Devon Brewer, Nicholas Christakis, et al. 2009. “Computational Social Science.” Science 323 (5915): 721–23. https://doi.org/10.1126/science.1167742. [Available through UW libraries]

Optional Reading:


 * December, John. 1996. “Units of Analysis for Internet Communication.” Journal of Computer-Mediated Communication 1 (4): 0–0. https://doi.org/10.1111/j.1083-6101.1996.tb00173.x. [Available free online]
 * Sandvig, Christian, and Eszter Hargittai. 2015. “How to Think about Digital Research.” In Digital Research Confidential: The Secrets of Studying Behavior Online, edited by Eszter Hargittai and Christian Sandvig, 1–28. Cambridge, MA: MIT Press. [Available in Canvas]

Part II: Ethics
Required Readings:


 * franzke, aline shakti, Anja Bechmann, Michael Zimmer, and Charles M. Ess. 2020. “Internet Research: Ethical Guidelines 3.0.” Association of Internet Researchers. https://aoir.org/reports/ethics3.pdf. [Available free online]

To frame a conversation around research ethics, let's read this piece:


 * Kramer, Adam D. I., Jamie E. Guillory, and Jeffrey T. Hancock. 2014. “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.” Proceedings of the National Academy of Sciences 111 (24): 8788–90. https://doi.org/10.1073/pnas.1320040111. [Available through UW libraries]

And these pieces, all of which are in one way or another responses to it:


 * Carr, Nicholas. 2014. “The Manipulators: Facebook’s Social Engineering Project.” The Los Angeles Review of Books, September 14, 2014. http://lareviewofbooks.org/essay/manipulators-facebooks-social-engineering-project/. [Available free online]
 * [Skim page to get a sense of the backlash] Grimmelmann, James. 2014. “The Facebook Emotional Manipulation Study: Sources.” The Laboratorium (blog). June 30, 2014. http://laboratorium.net/archive/2014/06/30/the_facebook_emotional_manipulation_study_source. [Available free online]
 * Bernstein, Michael. 2014. “The Destructive Silence of Social Computing Researchers.” Medium (blog). July 7, 2014. https://medium.com/@msbernst/the-destructive-silence-of-social-computing-researchers-9155cdff659. [Available free online]
 * Lampe, Clifford. 2014. “Facebook Is Good for Science.” The Chronicle of Higher Education Blogs: The Conversation (blog). July 8, 2014. http://chronicle.com/blogs/conversation/2014/07/08/facebook-is-good-for-science/. [Available free online]
 * Hancock, Jeffrey T. 2018. “The Ethics of Digital Research.” The Oxford Handbook of Networked Communication, April. https://doi.org/10.1093/oxfordhb/9780190460518.013.25. [Available through UW libraries]

Optional Readings:


 * Department of Health, Education, and Welfare, and National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. 2014. “The Belmont Report. Ethical Principles and Guidelines for the Protection of Human Subjects of Research.” http://www.hhs.gov/ohrp/policy/belmont.html. [Available free online]
 * Frankel, Mark S., and Sanyin Siang. 1999. “Ethical and Legal Aspects of Human Subject Research on the Internet.” Workshop Report. Washington, DC: American Association for the Advancement of Science. [Available free online]

Part I: Internet Data Collection
Required Readings:


 * Mislove, Alan, and Christo Wilson. 2018. “A Practitioner’s Guide to Ethical Web Data Collection.” In The Oxford Handbook of Networked Communication, edited by Brooke Foucault Welles and Sandra González-Bailón. London, UK: Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190460518.001.0001. [Available through UW libraries]
 * Brügger, Niels. 2018. “Web History and Social Media.” In The SAGE Handbook of Social Media, edited by Jean Burgess, Alice Marwick, and Thomas Poell, 196–212. London, UK: SAGE Publications Ltd. https://doi.org/10.4135/9781473984066. [Available through UW Libraries]
 * Shumate, Michelle, and Matthew S. Weber. 2015. “The Art of Web Crawling for Social Science Research.” In Digital Research Confidential: The Secrets of Studying Behavior Online, edited by Eszter Hargittai and Christian Sandvig, 234–59. Cambridge, MA: The MIT Press. [Available in Canvas]
 * Freelon, Deen. 2018. “Computational Research in the Post-API Age.” Political Communication 35 (4): 665–68. https://doi.org/10.1080/10584609.2018.1477506. [Available through UW Libraries]
 * [Example] Graeff, Erhardt, Matt Stempeck, and Ethan Zuckerman. 2014. “The Battle for ‘Trayvon Martin’: Mapping a Media Controversy Online and Off-Line.” First Monday 19 (2). http://firstmonday.org/ojs/index.php/fm/article/view/4947. [Available free online]

Optional Readings:


 * Ankerson, Megan Sapnar. 2015. “Read/Write the Digital Archive: Strategies for Historical Web Research.” In Digital Research Confidential: The Secrets of Studying Behavior Online, edited by Eszter Hargittai and Christian Sandvig, 29–54. Cambridge, MA: MIT Press. [Available in Canvas]
 * Spaniol, Marc, Dimitar Denev, Arturas Mazeika, Gerhard Weikum, and Pierre Senellart. 2009. “Data Quality in Web Archiving.” In Proceedings of the 3rd Workshop on Information Credibility on the Web, 19–26. WICOW ’09. New York, NY, USA: ACM. https://doi.org/10.1145/1526993.1526999. [Available through UW Libraries]
 * Schneider, Steven M., and Kirsten A. Foot. 2004. “The Web as an Object of Study.” New Media & Society 6 (1): 114–22. https://doi.org/10.1177/1461444804039912. [Available through UW Libraries]
 * Weber, Matthew S. 2014. “Observing the Web by Understanding the Past: Archival Internet Research.” In Proceedings of the Companion Publication of the 23rd International Conference on World Wide Web Companion, 1031–1036. WWW Companion ’14. Republic and Canton of Geneva, Switzerland: International World Wide Web Conferences Steering Committee. https://doi.org/10.1145/2567948.2579213. [Available through UW Libraries]

Optional readings related to the ethics of data collection online:


 * Amy Bruckman's two 2016 blog posts about researchers violating Terms of Service (TOS) while doing academic research: Do Researchers Need to Abide by Terms of Service (TOS)? An Answer. and More on TOS: Maybe Documenting Intent Is Not So Smart
 * The Digital Millennium Copyright Act and these explanatory/commentary essays & sites:
 * The Electronic Frontier Foundation's page on the DMCA.
 * Brad Templeton's A Brief Intro to Copyright and 10 Big Myths about Copyright Explained
 * Sections on Copyright, Privacy, and Social Media in the “Internet Case Digest” of the Perkins Coie LLP “Case Digest” site.
 * Narayanan, A., and V. Shmatikov. 2008. “Robust De-Anonymization of Large Sparse Datasets.” In IEEE Symposium on Security and Privacy, 2008. SP 2008, 111–25. https://doi.org/10.1109/SP.2008.33. [Available through UW Libraries]

Two useful resources for data collection:


 * Archive Team is an online community that archives websites. They are a fantastic resource and include many pieces of detailed technical documentation on the practice of web archiving. For example, here are detailed explanations of mirroring a website with GNU wget, which is the piece of free software I usually use to archive websites.
 * OpenHumans is an online community where people share personal data with each other and with researchers.
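Mirroring a website boils down to fetching pages, saving them, and following the links they contain until none remain. As a rough sketch of the link-extraction step at the heart of that loop (wget handles recursion, rate limiting, and link rewriting far more robustly), here is what it looks like with Python's standard-library HTML parser; the sample HTML is invented for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag seen."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Invented sample page standing in for a fetched document.
page = """
<html><body>
  <a href="/about.html">About</a>
  <a href="https://example.org/archive">Archive</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # the URLs a crawler would visit next
```

A real mirroring job would fetch each of these URLs in turn (e.g., with urllib.request), save the responses, and repeat until the frontier of unvisited links is exhausted.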

Part II: Textual Analyses
Required Readings:


 * McMillan, Sally J. 2000. “The Microscope and the Moving Target: The Challenge of Applying Content Analysis to the World Wide Web.” Journalism & Mass Communication Quarterly 77 (1): 80–98. https://doi.org/10.1177/107769900007700107. [Available through UW Libraries]
 * Shah, Dhavan V., Joseph N. Capella, W. Russell Neuman, Rodrigo Zamith, and Seth C. Lewis. 2015. “Content Analysis and the Algorithmic Coder: What Computational Social Science Means for Traditional Modes of Media Analysis.” The ANNALS of the American Academy of Political and Social Science 659 (1): 307–18. https://doi.org/10.1177/0002716215570576. [Available in UW libraries]
 * Grimmer, Justin, and Brandon M. Stewart. 2013. “Text as Data: The Promise and Pitfalls of Automatic Content Analysis Methods for Political Texts.” Political Analysis, January, mps028. https://doi.org/10.1093/pan/mps028. [Available through UW Libraries]
 * DiMaggio, Paul, Manish Nag, and David Blei. 2013. “Exploiting Affinities between Topic Modeling and the Sociological Perspective on Culture: Application to Newspaper Coverage of U.S. Government Arts Funding.” Poetics, Topic Models and the Cultural Sciences, 41 (6): 570–606. https://doi.org/10.1016/j.poetic.2013.08.004. [Available through UW Libraries]
 * Feldman, Ronen. 2013. “Techniques and Applications for Sentiment Analysis.” Communications of the ACM 56 (4): 82–90. https://doi.org/10.1145/2436256.2436274. [Available in UW libraries]

Optional Readings:


 * Trilling, Damian, and Jeroen G. F. Jonkman. 2018. “Scaling up Content Analysis.” Communication Methods and Measures 12 (2–3): 158–74. https://doi.org/10.1080/19312458.2018.1447655. [Available in UW libraries]
 * Leetaru, Kalev Hannes. 2011. Data Mining Methods for the Content Analyst: An Introduction to the Computational Analysis of Content. Routledge Communication Series. New York, NY: Taylor and Francis. [Available through UW libraries].

I'm assuming you have at least a rough familiarity with content analysis as a methodology. If you're not as comfortable with it, check out the Wikipedia article to start. These readings provide more background on content analysis (in general and online):


 * Van Selm, Martine & Jankowski, Nick, (2005) "Content Analysis of Internet-Based Documents." Unpublished Manuscript. [Available in Canvas]
 * Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, Calif.: Sage Publications. [Available from Instructor]
 * Krippendorff, K. (2005). Content analysis: an introduction to its methodology. Thousand Oaks; London; New Delhi: Sage. [Available from Instructor]

Examples of more traditional content analysis using online content:


 * Trammell, K. D., Tarkowski, A., Hofmokl, J., & Sapp, A. M. (2006). Rzeczpospolita blogów (Republic of Blog): Examining Polish Bloggers Through Content Analysis. Journal of Computer-Mediated Communication, 11(3), 702–722. [Available Free Online]
 * Woolley, J. K., Limperos, A. M., & Oliver, M. B. (2010). The 2008 Presidential Election, 2.0: A Content Analysis of User-Generated Political Facebook Groups. Mass Communication and Society, 13(5), 631–652. [Available from UW Libraries]
 * Maier, Daniel, A. Waldherr, P. Miltner, G. Wiedemann, A. Niekler, A. Keinert, B. Pfetsch, et al. 2018. “Applying LDA Topic Modeling in Communication Research: Toward a Valid and Reliable Methodology.” Communication Methods and Measures 12 (2–3): 93–118. https://doi.org/10.1080/19312458.2018.1430754.

Another example of topic modeling from political science:


 * Barberá, P., Bonneau, R., Egan, P., Jost, J. T., Nagler, J., & Tucker, J. (2014). Leaders or Followers? Measuring Political Responsiveness in the US Congress Using Social Media Data. Presented at the Annual Meeting of the American Political Science Association. [Free Online]

Week 2: Friday January 17: CDSW Session 0
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Winter 2020), which I am running concurrently with this class.

This session will run from 6-9pm and is the only session that can safely be missed. Please do contact me, however, if you will not be able to attend it.

Week 2: Saturday January 18: CDSW Session 1
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Winter 2020), which I am running concurrently with this class.

This session will run from 9am-3pm. Details on the CDSW Winter 2020 page.

Part I: Network Analysis
Required Readings:


 * Lazer, David. 2018. “Networks and Information Flow.” The Oxford Handbook of Networked Communication, April. https://doi.org/10.1093/oxfordhb/9780190460518.013.2. [Available through UW libraries]
 * Garton, Laura, Caroline Haythornthwaite, and Barry Wellman. 1997. “Studying Online Social Networks.” Journal of Computer-Mediated Communication 3 (1): 0–0. https://doi.org/10.1111/j.1083-6101.1997.tb00062.x. [Available free online]
 * Mislove, Alan, Massimiliano Marcon, Krishna P. Gummadi, Peter Druschel, and Bobby Bhattacharjee. 2007. “Measurement and Analysis of Online Social Networks.” In Proceedings of the 7th ACM SIGCOMM Conference on Internet Measurement, 29–42. IMC ’07. New York, NY, USA: ACM. https://doi.org/10.1145/1298306.1298311. [Available through UW Libraries]
 * Shumate, Michelle, and Edward T. Palazzolo. 2010. “Exponential Random Graph (P*) Models as a Method for Social Network Analysis in Communication Research.” Communication Methods and Measures 4 (4): 341–71. https://doi.org/10.1080/19312458.2010.527869. [Available through Canvas]
 * Foucault Welles, Brooke, Anthony Vashevko, Nick Bennett, and Noshir Contractor. 2014. “Dynamic Models of Communication in an Online Friendship Network.” Communication Methods and Measures 8 (4): 223–43. https://doi.org/10.1080/19312458.2014.967843. [Available through Canvas]
 * Freelon, Deen. 2018. “Partition-Specific Network Analysis of Digital Trace Data.” The Oxford Handbook of Networked Communication, April. https://doi.org/10.1093/oxfordhb/9780190460518.013.3. [Available through UW libraries]

Optional Readings:


 * Wellman, Barry. 2007. “Challenges in Collecting Personal Network Data: The Nature of Personal Network Analysis.” Field Methods. http://journals.sagepub.com/doi/10.1177/1525822X06299133. [Available through UW libraries]
 * Yang, Jaewon, and Jure Leskovec. 2015. “Defining and Evaluating Network Communities Based on Ground-Truth.” Knowledge and Information Systems 42 (1): 181–213. https://doi.org/10.1007/s10115-013-0693-z. [Available through Canvas]
 * Centola, Damon. 2010. “The Spread of Behavior in an Online Social Network Experiment.” Science 329 (5996): 1194–97. https://doi.org/10.1126/science.1185231. [Available through UW Libraries]
 * [Example] Jackson, Sarah J., and Brooke Foucault Welles. 2015. “Hijacking #myNYPD: Social Media Dissent and Networked Counterpublics.” Journal of Communication 65 (6): 932–52. https://doi.org/10.1111/jcom.12185. [Available through UW Libraries]

Network datasets:


 * Stanford Large Network Dataset Collection, which contains a variety of network datasets. Many, but certainly not all, are social networks.
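None of these readings require code, but if you want to poke at one of the SNAP datasets, here is a minimal, library-free sketch of computing a degree distribution from the whitespace-separated edge-list format most of them use. The toy edge list is invented for illustration; for real analyses you would use a package like networkx or igraph.

```python
from collections import Counter

# A toy undirected edge list in the format used by many SNAP datasets
# (each line: "source target"); a real file would be read from disk.
edge_list_text = """\
1 2
1 3
2 3
3 4
"""

# Build an adjacency set for each node, skipping comment lines
# (SNAP files begin with lines starting with '#').
adjacency = {}
for line in edge_list_text.splitlines():
    if not line or line.startswith("#"):
        continue
    u, v = line.split()
    adjacency.setdefault(u, set()).add(v)
    adjacency.setdefault(v, set()).add(u)

# Degree of each node, then the degree distribution across the network.
degrees = {node: len(neighbors) for node, neighbors in adjacency.items()}
degree_distribution = Counter(degrees.values())

print(degrees)              # {'1': 2, '2': 2, '3': 3, '4': 1}
print(degree_distribution)  # two nodes of degree 2, one each of degree 3 and 1
```

The same adjacency structure is the starting point for most of the descriptive measures discussed in the readings (density, components, clustering, and so on).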

Part II: Hyperlink Networks

 * Park, Han Woo. 2003. “Hyperlink Network Analysis: A New Method for the Study of Social Structure on the Web.” Connections 25 (1): 49–61. [Available through Canvas]
 * González-Bailón, Sandra. 2009. “Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks 31 (4): 271–80. https://doi.org/10.1016/j.socnet.2009.07.003. [Available through UW libraries]
 * [Example] Elgin, Dallas J. 2015. “Utilizing Hyperlink Network Analysis to Examine Climate Change Supporters and Opponents.” Review of Policy Research 32 (2): 226–45. https://doi.org/10.1111/ropr.12118. [Available through UW libraries]

Optional readings:


 * Jackson, Michele H. 1997. “Assessing the Structure of Communication on the World Wide Web.” Journal of Computer-Mediated Communication 3 (1). https://doi.org/10.1111/j.1083-6101.1997.tb00063.x. [Available free online]
 * Ackland, Robert. 2016. “WWW Hyperlink Networks.” Lecture Slides presented at the SOCR8006 Online Research Methods, Canberra, Australia. http://vosonlab.net/papers/ACSPRISummer2017/Lecture_Hyperlink_Networks.pdf. [Available free online]
 * Lusher, Dean, and Robert Ackland. 2011. “A Relational Hyperlink Analysis of an Online Social Movement.” Journal of Social Software 12 (5). https://www.cmu.edu/joss/content/articles/volume12/Lusher/. [Available free online]
 * [Example] Shumate, Michelle, and Lori Dewitt. 2008. “The North/South Divide in NGO Hyperlink Networks.” Journal of Computer-Mediated Communication 13 (2): 405–28. https://doi.org/10.1111/j.1083-6101.2008.00402.x. [Available free online]

Tools for collecting hyperlink network data:


 * Issue Crawler — network mapping software from the Govcom.org Foundation, Amsterdam, developed in a group run by Richard Rogers
 * Virtual Observatory for the Study of Online Networks (VOSON) — "web-based software incorporating web mining, data visualisation, and traditional empirical social science methods (e.g. social network analysis, SNA). Text analysis, dataset manipulation and visualisation, and social network analysis (SNA) are available within an integrated environment."
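As a rough illustration of the core step these tools automate, here is a small stdlib-only sketch that extracts outbound links from one (made-up) HTML page and turns them into site-to-site edges. A real hyperlink crawler would fetch pages over HTTP, iterate over a seed list, and follow links to a fixed depth.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page standing in for HTML fetched from a site in your seed list.
html_page = '<p><a href="https://example.org/a">x</a> <a href="https://example.com/">y</a></p>'
source_site = "example.net"  # hypothetical domain the page was fetched from

parser = LinkExtractor()
parser.feed(html_page)

# One (source, target) edge per outbound link to a different domain.
edges = [
    (source_site, urlparse(link).netloc)
    for link in parser.links
    if urlparse(link).netloc and urlparse(link).netloc != source_site
]
print(edges)  # [('example.net', 'example.org'), ('example.net', 'example.com')]
```

Aggregating edges at the domain level, as here, is one common design choice; Park's article discusses when page-level versus site-level networks are appropriate.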

Part I: Digital & Trace Ethnography
Required Readings:

More traditional ethnographic research in online settings:


 * Hine, Christine. 2017. “Ethnographies of Online Communities and Social Media: Modes, Varieties, Affordances.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 401–15. London, UK: SAGE. [Available in Canvas]
 * [Selections] Jemielniak, Dariusz. 2014. Common Knowledge?: An Ethnography of Wikipedia. Stanford, California: Stanford University Press. ["Introduction" and "Appendix A: Methodology"; Available in Canvas]

Material on "Trace" and "network" ethnographies:


 * Geiger, R. Stuart, and David Ribes. 2011. “Trace Ethnography: Following Coordination Through Documentary Practices.” In Proceedings of the 2011 44th Hawaii International Conference on System Sciences, 1–10. HICSS ’11. Washington, DC, USA: IEEE Computer Society. https://doi.org/10.1109/HICSS.2011.455. [Available in Canvas]
 * Geiger, R. Stuart, and Aaron Halfaker. 2017. “Operationalizing Conflict and Cooperation between Automated Software Agents in Wikipedia: A Replication and Expansion of ‘Even Good Bots Fight.’” Proceedings of the ACM on Human-Computer Interaction 1 (CSCW): 49:1–49:33. https://doi.org/10.1145/3134684. [Available through UW libraries]
 * Howard, Philip N. 2002. “Network Ethnography and the Hypermedia Organization: New Media, New Organizations, New Methods.” New Media & Society 4 (4): 550–74. https://doi.org/10.1177/146144402321466813. [Available through UW libraries]

Optional Readings:


 * Hine, Christine. 2000. Virtual Ethnography. London, UK: SAGE Publications. [Available from the Instructor]
 * This is the canonical book-length account and the main citation in this space.


 * Coleman, E. Gabriella. 2010. “Ethnographic Approaches to Digital Media.” Annual Review of Anthropology 39 (1): 487–505. https://doi.org/10.1146/annurev.anthro.012809.104945. [Available through UW libraries]
 * Response by danah boyd to Hine's "Question One: How Can Qualitative Internet Researchers Define the Boundaries of Their Projects?" from Internet Inquiry: Conversations About Method, Annette Markham and Nancy Baym (Eds.), Sage, 2009, pp. 1-32. [Available in Canvas]
 * Note: You may also be interested in reading the essay by Hine that boyd is responding to. [Available in Canvas]


 * Hjorth, Larissa, Heather Horst, Anne Galloway, and Genevieve Bell, eds. 2016. The Routledge Companion to Digital Ethnography. New York, NY: Routledge. [Available from the instructor]
 * Sinanan, Jolynna, and Tom McDonald. 2018. “Ethnography.” In The SAGE Handbook of Social Media, 179–95. 55 City Road: SAGE Publications Ltd. https://doi.org/10.4135/9781473984066. [Available through UW libraries]
 * Maxwell, Joseph A. 2002. “Understanding and Validity in Qualitative Research.” In The Qualitative Researcher’s Companion, edited by A. M. Huberman and Matthew B. Miles, 37–64. London, UK: SAGE. [Available in Canvas]
 * Champion, Kaylea, Nora McDonald, Stephanie Bankes, Joseph Zhang, Rachel Greenstadt, Andrea Forte, and Benjamin Mako Hill. 2019. “A Forensic Qualitative Analysis of Contributions to Wikipedia from Anonymity Seeking Users.” Proceedings of the ACM on Human-Computer Interaction 3 (CSCW): 53:1–53:26. https://doi.org/10.1145/3359155. [Available through UW libraries]

These are all other interesting and/or frequently cited examples of Internet-based ethnographies:


 * Geiger, R. Stuart, and David Ribes. 2010. “The Work of Sustaining Order in Wikipedia:The Banning of a Vandal.” In Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work, 117–126. CSCW ’10. New York, NY, USA: ACM. https://doi.org/10.1145/1718918.1718941. [Available through UW libraries] — A trace ethnography and sort of the companion paper/substantive paper for the methods piece included in the required readings above.
 * Brotsky, Sarah R., and David Giles. 2007. “Inside the ‘Pro-Ana’ Community: A Covert Online Participant Observation.” Eating Disorders 15 (2): 93–109. https://doi.org/10.1080/10640260701190600. [Available through UW libraries]
 * Note: To conduct the study reported in this paper, the authors created and used a fake profile in order to observe the psychological support offered to participants.


 * Williams, Matthew. 2007. “Avatar Watching: Participant Observation in Graphical Online Environments.” Qualitative Research 7 (1): 5–24. https://doi.org/10.1177/1468794107071408. [Available through UW libraries]
 * Note: A fantastic general introduction, with takeaways targeted more specifically toward people studying virtual-reality-style environments with virtual physicality.

Charlie's optional readings (virtual world ethnographies):


 * Bainbridge, William Sims. 2010. The Warcraft Civilization: Social Science in a Virtual World. Cambridge, Massachusetts: MIT Press. https://mitpress.mit.edu/books/warcraft-civilization
 * Nardi, Bonnie A. 2009. My Life as a Night Elf Priest: An Anthropological Account of World of Warcraft. Ann Arbor, Michigan: University of Michigan Press. [Free PDFs: https://muse.jhu.edu/book/1093]
 * Pearce, Celia, Tom Boellstorff, and Bonnie A. Nardi. 2011. Communities of Play: Emergent Cultures in Multiplayer Games and Virtual Worlds. Cambridge, Massachusetts: The MIT Press. https://mitpress.mit.edu/books/communities-play
 * Boellstorff, Tom, Bonnie Nardi, Celia Pearce, T. L. Taylor, and George E. Marcus. 2012. Ethnography and Virtual Worlds: A Handbook of Method. Princeton: Princeton University Press.

Part II: Online Interviewing
Required Readings:


 * O’Connor, Henrietta, and Clare Madge. 2017. “Internet-Based Interviewing.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 416–34. London, UK: SAGE. [Available through Canvas]
 * Abrams, Katie M., and Ted J. Gaiser. 2017. “Online Focus Groups.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 435–50. London, UK: SAGE. [Available through Canvas]
 * Hanna, Paul. 2012. “Using Internet Technologies (Such as Skype) as a Research Medium: A Research Note.” Qualitative Research 12 (2): 239–42. https://doi.org/10.1177/1468794111426607. [Available through UW libraries]
 * Note: Short article you can basically skim. Read it quickly so you can cite it later.


 * Dowling, Sally. 2012. “Online Asynchronous and Face-to-Face Interviewing: Comparing Methods for Exploring Women’s Experiences of Breastfeeding Long Term.” In Cases in Online Interview Research, edited by Janet Salmons, 277–303. Thousand Oaks, CA: SAGE Publications. http://srmo.sagepub.com/view/cases-in-online-interview-research/n11.xml. [Available through UW libraries]

Optional Readings:


 * boyd, danah. 2015. “Making Sense of Teen Life: Strategies for Capturing Ethnographic Data in a Networked Era.” In Digital Research Confidential: The Secrets of Studying Behavior Online, edited by Eszter Hargittai and Christian Sandvig. Cambridge, Massachusetts: MIT Press. [Available in Canvas]
 * Note: Strongly focused on ethnographic interviews with tons of very specific details. Fantastic article on interviewing, although perhaps a bit weak on Internet-specific advice.


 * Markham, Annette N. 1998. “The Shifting Project, The Shifting Self.” In Life Online: Researching Real Experience in Virtual Space, 61–83. Rowman Altamira. [Available from instructor]
 * Note: One of the earliest books on online life and one of the earliest attempts to do online interviewing. This is dated, but it highlights some important challenges.


 * Hutchinson, Emma. 2016. “Digital Methods and Perpetual Reinvention? Asynchronous Interviewing and Photo Elicitation.” In Digital Methods for Social Science: An Interdisciplinary Guide to Research Innovation, edited by Helene Snee, Christine Hine, Yvette Morey, Steven Roberts, and Hayley Watson, 143–56. London: Palgrave Macmillan UK. https://doi.org/10.1057/9781137453662_9. [Available through UW libraries]
 * Hawkins, Janice. 2018. “The Practical Utility and Suitability of Email Interviews in Qualitative Research.” The Qualitative Report 23 (2). https://digitalcommons.odu.edu/nursing_fac_pubs/24. [Available free online]

Alternate Accounts:

These texts are largely redundant with the required texts above but do provide a different perspective and examples:


 * Salmons, Janet. 2014. Qualitative Online Interviews: Strategies, Design, and Skills. SAGE Publications. [Preface, TOC, and Chapter 1; Available in Canvas]
 * This book lays out what claims to be a comprehensive account of online interviewing. I have the book and am happy to loan my copy to anybody in the class who thinks this will be a core part of their research.

Optional readings related to the ethics of identifying subjects:


 * Markham, Annette. 2012. “Fabrication as Ethical Practice.” Information, Communication & Society 15 (3): 334–53. https://doi.org/10.1080/1369118X.2011.641993. [Available through UW libraries]
 * Trevisan, Filippo, and Paul Reilly. 2014. “Ethical Dilemmas in Researching Sensitive Issues Online: Lessons from the Study of British Disability Dissent Networks.” Information, Communication & Society 17 (9): 1131–46. https://doi.org/10.1080/1369118X.2014.889188. [Available through UW libraries]

Week 4: Saturday February 1: CDSW Session 2
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Winter 2020), which I am running concurrently with this class.

This session will run from 10am-4pm. Details on the CDSW Winter 2020 page.

Part I: Surveys
Required Readings:


 * Fricker, Jr., Ronald D., and Katja Lozar Manfreda. 2017. “Sampling Methods for Online Surveys.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 162–83. London, UK: SAGE. [Available in Canvas]
 * Walejko, Gina. 2009. “Online Survey: Instant Publication, Instant Mistake, All of the Above.” In Research Confidential: Solutions to Problems Most Social Scientists Pretend They Never Have, edited by Eszter Hargittai, 101–21. Ann Arbor, MI: University of Michigan Press. [Available in Canvas]
 * Konstan, Joseph A., B. R. Simon Rosser, Michael W. Ross, Jeffrey Stanton, and Weston M. Edwards. 2005. “The Story of Subject Naught: A Cautionary but Optimistic Tale of Internet Survey Research.” Journal of Computer-Mediated Communication 10 (2): 00–00. https://doi.org/10.1111/j.1083-6101.2005.tb00248.x. [Free online]
 * Hill, Benjamin Mako, and Aaron Shaw. 2013. “The Wikipedia Gender Gap Revisited: Characterizing Survey Response Bias with Propensity Score Estimation.” PLoS ONE 8 (6): e65782. https://doi.org/10.1371/journal.pone.0065782. [Free online]
 * Salganik, Matthew J., and Karen E. C. Levy. 2015. “Wiki Surveys: Open and Quantifiable Social Data Collection.” PLOS ONE 10 (5): e0123483. https://doi.org/10.1371/journal.pone.0123483. [Free online]
 * Note: This journalistic account of the research may also be useful.


 * Alperin, Juan Pablo, Erik Warren Hanson, Kenneth Shores, and Stefanie Haustein. 2017. “Twitter Bot Surveys: A Discrete Choice Experiment to Increase Response Rates.” In Proceedings of the 8th International Conference on Social Media & Society, 1–4. #SMSociety17. Toronto, ON, Canada: Association for Computing Machinery. https://doi.org/10.1145/3097286.3097313. [Available through UW libraries]
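To make the sampling vocabulary in these readings concrete, here is a small illustrative sketch (the frame, strata, and sample sizes are all invented) contrasting a simple random sample with a proportionate stratified sample:

```python
import random

# A toy sampling frame: user IDs with an activity stratum, standing in
# for the kinds of frames discussed in Fricker & Manfreda's chapter.
frame = [{"id": i, "stratum": "active" if i % 5 == 0 else "lurker"}
         for i in range(100)]

rng = random.Random(42)  # fixed seed so the draw is reproducible

# Simple random sample: every unit has equal inclusion probability.
srs = rng.sample(frame, 10)

# Proportionate stratified sample: draw from each stratum separately,
# in proportion to its share of the frame. (Real designs handle the
# rounding of per-stratum sizes more carefully than this sketch.)
def stratified_sample(frame, n, rng):
    strata = {}
    for unit in frame:
        strata.setdefault(unit["stratum"], []).append(unit)
    sample = []
    for units in strata.values():
        k = round(n * len(units) / len(frame))
        sample.extend(rng.sample(units, k))
    return sample

strat = stratified_sample(frame, 10, rng)
print(len(srs), len(strat))  # 10 10
```

With 20% of the toy frame in the "active" stratum, the stratified draw guarantees that exactly 2 of the 10 sampled units are active users, whereas the simple random sample only hits that proportion in expectation.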

Optional Readings:


 * Van Selm, Martine, and Nicholas W. Jankowski. 2006. “Conducting Online Surveys.” Quality and Quantity 40 (3): 435–56. https://doi.org/10.1007/s11135-005-8081-8. [Available through UW Libraries]
 * Vehovar, Vasja, and Katja Lozar Manfreda. 2017. “Overview: Online Surveys.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 143–61. London, UK: SAGE. [Available in Canvas]
 * Kaczmirek, Lars. 2017. “Online Survey Software.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 203–19. London, UK: SAGE. [Available in Canvas]
 * Toepoel, Vera. 2017. “Online Survey Design.” In The SAGE Handbook of Online Research Methods, edited by Nigel G. Fielding, Raymond M. Lee, and Grant Blank, 2 edition, 184–202. London, UK: SAGE. [Available in Canvas]
 * Mavletova, Aigul, and Mick P. Couper. 2014. “Mobile Web Survey Design: Scrolling versus Paging, SMS versus E-Mail Invitations.” Journal of Survey Statistics and Methodology 2 (4): 498–518. https://doi.org/10.1093/jssam/smu015. [Available through UW libraries]
 * Yun, Gi Woong, and Craig W. Trumbo. 2000. “Comparative Response to a Survey Executed by Post, e-Mail, & Web Form.” Journal of Computer-Mediated Communication 6 (1): 0–0. https://doi.org/10.1111/j.1083-6101.2000.tb00112.x. [Free online]
 * Hargittai, Eszter, and Chris Karr. 2009. “WAT R U DOIN? Studying the Thumb Generation Using Text Messaging.” In Research Confidential: Solutions to Problems Most Social Scientists Pretend They Never Have, edited by Eszter Hargittai, 192–216. Ann Arbor, MI: University of Michigan Press. [Available in Canvas]

If you don't have a background in survey design, these two have been recommended by our guest speaker as good basic things to read:


 * Krosnick, Jon A. 1999. “Maximizing Measurement Quality: Principles of Good Questionnaire Design.” In Measures of Political Attitudes, edited by John P. Robinson, Phillip R. Shaver, and Lawrence S. Wrightsman. New York: Academic Press.
 * Krosnick, Jon A. 1999. “Survey Research.” Annual Review of Psychology 50 (1): 537–67. https://doi.org/10.1146/annurev.psych.50.1.537. [Available through UW libraries]

Tools for doing mobile surveys:


 * RapidSMS
 * Twilio

Part II: Experiments
Required Readings:


 * Reips, Ulf-Dietrich. 2002. “Standards for Internet-Based Experimenting.” Experimental Psychology 49 (4): 243–56. https://doi.org/10.1026//1618-3169.49.4.243. [Available through UW libraries]
 * Salganik, Matthew J., Peter Sheridan Dodds, and Duncan J. Watts. 2006. “Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market.” Science 311 (5762): 854–56. https://doi.org/10.1126/science.1121066. [Available through UW libraries]
 * Hergueux, Jérôme, and Nicolas Jacquemet. 2014. “Social Preferences in the Online Laboratory: A Randomized Experiment.” Experimental Economics 18 (2): 251–83. https://doi.org/10.1007/s10683-014-9400-5. [Available in Canvas]
 * Rijt, Arnout van de, Soong Moon Kang, Michael Restivo, and Akshay Patil. 2014. “Field Experiments of Success-Breeds-Success Dynamics.” Proceedings of the National Academy of Sciences 111 (19): 6934–39. https://doi.org/10.1073/pnas.1316836111. [Available in Canvas]
 * Narayan, Sneha, Nathan TeBlunthuis, Wm Salt Hale, Benjamin Mako Hill, and Aaron Shaw. 2019. “All Talk: How Increasing Interpersonal Communication on Wikis May Not Enhance Productivity.” Proceedings of the ACM on Human-Computer Interaction 3 (CSCW): 101:1–101:19. https://doi.org/10.1145/3359203. [Available through UW libraries]
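As a toy illustration of the random assignment underlying studies like these (the participant names and seed are made up), here is a sketch of complete, balanced assignment to two conditions:

```python
import random

# Toy participant pool; in a field experiment these might be account names.
participants = [f"user_{i}" for i in range(20)]
conditions = ["control", "treatment"]

rng = random.Random(7)  # seed the generator so the assignment is auditable

# Complete (balanced) random assignment: shuffle, then deal participants
# into conditions round-robin so group sizes differ by at most one.
shuffled = participants[:]
rng.shuffle(shuffled)
assignment = {p: conditions[i % len(conditions)] for i, p in enumerate(shuffled)}

counts = {c: sum(1 for v in assignment.values() if v == c) for c in conditions}
print(counts)  # {'control': 10, 'treatment': 10}
```

Recording the seed and the assignment table is a cheap way to make the randomization reproducible and checkable later, in the spirit of Reips's standards for Internet-based experimenting.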

Optional Readings:


 * Eckles, Dean, Brian Karrer, and Johan Ugander. 2017. “Design and Analysis of Experiments in Networks: Reducing Bias from Interference.” Journal of Causal Inference 5 (1). https://doi.org/10.1515/jci-2015-0021. [Available through UW libraries]
 * This piece sits at the intersection of networks and experiments. It's very important but is probably too technical to assign for the whole class.


 * Kohavi, Ron, Alex Deng, Brian Frasca, Toby Walker, Ya Xu, and Nils Pohlmann. 2013. “Online Controlled Experiments at Large Scale.” In Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1168–1176. KDD ’13. Chicago, Illinois, USA: Association for Computing Machinery. https://doi.org/10.1145/2487575.2488217. [Available through UW libraries]
 * Reinecke, Katharina, and Krzysztof Z. Gajos. 2015. “LabintheWild: Conducting Large-Scale Online Experiments With Uncompensated Samples.” In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, 1364–1378. CSCW ’15. New York, NY, USA: ACM. https://doi.org/10.1145/2675133.2675246. [Available through UW libraries]
 * Zhu, Haiyi, Amy Zhang, Jiping He, Robert E. Kraut, and Aniket Kittur. 2013. “Effects of Peer Feedback on Contribution: A Field Experiment in Wikipedia.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI ’13. New York, NY, USA: ACM. https://doi.org/10.1145/2470654.2481311. [Available through UW libraries]
 * Zhang, Xiaoquan (Michael), and Feng Zhu. 2011. “Group Size and Incentives to Contribute: A Natural Experiment at Chinese Wikipedia.” American Economic Review 101 (4): 1601–15. https://doi.org/10.1257/aer.101.4.1601. [Available through UW libraries]
 * Weninger, Tim, Thomas James Johnston, and Maria Glenski. 2015. “Random Voting Effects in Social-Digital Spaces: A Case Study of Reddit Post Submissions.” In Proceedings of the 26th ACM Conference on Hypertext & Social Media, 293–297. HT ’15. Guzelyurt, Northern Cyprus: Association for Computing Machinery.

Week 6: Tuesday February 11: Crowdsourcing, Digital Labor Markets, and Human Computation

 * Note: I've marked individual items below as [Required] because I thought it made more sense to keep the topical groups of articles intact.

MTurk documentation and guidelines:


 * [Required] Amazon Mechanical Turk Requester UI Guide — Skim, but make sure you're ready to submit tasks.
 * [Required] Amazon Mechanical Turk Best Practices Guide — Skim, but make sure you're ready to submit tasks.
 * [Required] Shaw, Aaron. 2015. “Hired Hands and Dubious Guesses: Adventures in Crowdsourced Data Collection.” In Digital Research Confidential: The Secrets of Studying Behavior Online, edited by Eszter Hargittai and Christian Sandvig. The MIT Press. [Available in Canvas]
 * [Required] Tutorials Posted on the MTurk blog — Skim and browse and pay attention to things that are like what you'd like to do in the class session.
 * [Required] Guidelines for Academic Requesters and Basics of How to be a good Requester from We Are Dynamo — These guidelines were created by Turkers as part of We Are Dynamo, an effort to support collective action and organizing among Turkers led by Niloufar Salehi and described in the paper below.
 * Mason, Winter, and Siddharth Suri. 2011. “Conducting Behavioral Research on Amazon’s Mechanical Turk.” Behavior Research Methods 44 (1): 1–23. https://doi.org/10.3758/s13428-011-0124-6. — Dated but still somewhat useful.

Overviews of MTurk and issues of data quality:


 * Horton, John J., David G. Rand, and Richard J. Zeckhauser. 2011. “The Online Laboratory: Conducting Experiments in a Real Labor Market.” Experimental Economics 14 (3): 399–425. https://doi.org/10.1007/s10683-011-9273-9.
 * Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. 2011. “Amazon’s Mechanical Turk: A New Source of Inexpensive, yet High-Quality, Data?” Perspectives on Psychological Science, February. https://doi.org/10.1177/1745691610393980.
 * Casler, Krista, Lydia Bickel, and Elizabeth Hackett. 2013. “Separate but Equal? A Comparison of Participants and Data Gathered via Amazon’s MTurk, Social Media, and Face-to-Face Behavioral Testing.” Computers in Human Behavior 29 (6): 2156–60. https://doi.org/10.1016/j.chb.2013.05.009.
 * [Required] Weinberg, Jill, Jeremy Freese, and David McElhattan. 2014. “Comparing Data Characteristics and Results of an Online Factorial Survey between a Population-Based and a Crowdsource-Recruited Sample.” Sociological Science 1: 292–310. https://doi.org/10.15195/v1.a19.
 * Kees, Jeremy, Christopher Berry, Scot Burton, and Kim Sheehan. 2017. “An Analysis of Data Quality: Professional Panels, Student Subject Pools, and Amazon’s Mechanical Turk.” Journal of Advertising 46 (1): 141–55. https://doi.org/10.1080/00913367.2016.1269304.
 * [Required] Kennedy, Ryan, Scott Clifford, Tyler Burleigh, Ryan Jewell, and Philip Waggoner. 2018. “The Shape of and Solutions to the MTurk Quality Crisis.” SSRN Scholarly Paper ID 3272468. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3272468. [Available free online]

Culture and work conditions for Turkers:


 * Irani, Lilly. 2015. “The Cultural Work of Microwork.” New Media & Society 17 (5): 720–39. https://doi.org/10.1177/1461444813511926.
 * Kittur, Aniket, Jeffrey V. Nickerson, Michael Bernstein, Elizabeth Gerber, Aaron Shaw, John Zimmerman, Matt Lease, and John Horton. 2013. “The Future of Crowd Work.” In Proceedings of the 2013 Conference on Computer Supported Cooperative Work, 1301–1318. CSCW ’13. San Antonio, Texas, USA: Association for Computing Machinery. https://doi.org/10.1145/2441776.2441923.
 * Gray, Mary L., Siddharth Suri, Syed Shoaib Ali, and Deepti Kulkarni. 2016. “The Crowd Is a Collaborative Network.” In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 134–147. CSCW ’16. San Francisco, California, USA: Association for Computing Machinery. https://doi.org/10.1145/2818048.2819942.
 * [Required] Semuels, Alana. 2018. “The Internet Is Enabling a New Kind of Poorly Paid Hell.” The Atlantic. January 23, 2018. https://www.theatlantic.com/business/archive/2018/01/amazon-mechanical-turk/551192/.

Systems to improve Turker experiences:


 * Salehi, Niloufar, Lilly C. Irani, Michael S. Bernstein, Ali Alkhatib, Eva Ogbe, Kristy Milland, and Clickhappier. 2015. “We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers.” In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 1621–1630. CHI ’15. Seoul, Republic of Korea: Association for Computing Machinery. https://doi.org/10.1145/2702123.2702508.
 * Irani, Lilly C., and M. Six Silberman. 2013. “Turkopticon: Interrupting Worker Invisibility in Amazon Mechanical Turk.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 611–620. CHI ’13. Paris, France: Association for Computing Machinery. https://doi.org/10.1145/2470654.2470742.

Assignments to complete before class:

The first task is to complete tasks as a crowd worker:


 * If you are a US citizen: Sign up for a worker account on MTurk. Then find and complete at least 5 "HITs" as a worker on Amazon Mechanical Turk.
 * If you are not a US citizen, or if you cannot sign up on MTurk for some other reason: Complete at least 3-4 classification tasks in at least 2 different Zooniverse projects of your choice. Also complete at least one "study" in Lab in the Wild.
 * In either case: Record (write down) details and notes about your tasks: What did you do? Who was the requester? What was the purpose of the task (as best you could tell)? What was the experience like? What research applications can you (not) imagine for this kind of system?

The second task is to get ready to launch a task as a requester. We will design and launch tasks in class, but I want you to do the following ahead of time:


 * Create a "requester" account on Amazon Mechanical Turk. Approval may take up to 48 hours, so please do this immediately so you have it ready to go in class.
 * Put money into your requester account to pay workers. A $5 budget should be sufficient for our class. MTurk accepts any payment method that Amazon does.
 * Think of at least one small classification or coding task (e.g., of Tweets, images, etc.) and one human subjects data collection task like a survey, a survey experiment, etc., that you would like to run. You will have a budget of $5 to run the task!
 * If running this task will involve some data (e.g., a set of images or URLs, a set of Tweets, etc), collect that material in a spreadsheet before class. If it will involve a survey, create your survey in a Google Form and/or a Survey Monkey or Qualtrics survey before class.
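If your task will be a batch of classification items, the input spreadsheet is just a CSV whose column names match the placeholders (e.g., ${tweet_text}) in your HIT template. Here is an illustrative sketch (the tweet data and column names are hypothetical) that writes such a file with Python's csv module:

```python
import csv
import io

# Hypothetical classification task: each row is one tweet to be labeled.
rows = [
    {"tweet_id": "1", "tweet_text": "The new park downtown is great!"},
    {"tweet_id": "2", "tweet_text": "Bus was late again this morning."},
]

buffer = io.StringIO()  # stands in for open("batch_input.csv", "w", newline="")
writer = csv.DictWriter(buffer, fieldnames=["tweet_id", "tweet_text"])
writer.writeheader()
writer.writerows(rows)

print(buffer.getvalue())
```

Keeping a stable ID column (here, the hypothetical tweet_id) makes it much easier to join workers' answers back to your original data when the results come back.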

Week 6: Saturday February 15: CDSW Session 3
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Winter 2020), which I am running concurrently with this class.

This session will run from 9am-3pm. Details on the CDSW Winter 2020 page.

Week 7: Tuesday February 18: Consulting Week (i.e., no group meeting)
During this week, we will not meet together. Instead, I will schedule one-on-one in-person meetings of an hour with each student individually to catch up with you about your project and to work directly with you to resolve any technical issues you have run into with data collection, etc.

Part I: Discourse Analysis
Required Readings:


 * Mitra, Ananda. 1999. “Characteristics of the WWW Text: Tracing Discursive Strategies.” Journal of Computer-Mediated Communication 5 (1): 0–0. https://doi.org/10.1111/j.1083-6101.1999.tb00330.x.
 * Thurlow, Crispin. 2018. “Digital Discourse: Locating Language in New/Social Media.” In The SAGE Handbook of Social Media, edited by Jean Burgess, Alice Marwick, and Thomas Poell, 135–45. London, UK: SAGE. https://doi.org/10.4135/9781473984066.
 * Brock, André. 2018. “Critical Technocultural Discourse Analysis.” New Media & Society 20 (3): 1012–30. https://doi.org/10.1177/1461444816677532.

Optional Readings:


 * Kaun, Anne. 2010. “Open-Ended Online Diaries: Capturing Life as It Is Narrated.” International Journal of Qualitative Methods 9 (2): 133–48. https://doi.org/10.1177/160940691000900202.

Part II: Visual Analysis
Required Readings:


 * Faulkner, Simon, Farida Vis, and Francesco D’Orazio. 2018. “Analysing Social Media Images.” In The SAGE Handbook of Social Media, edited by Jean Burgess, Alice Marwick, and Thomas Poell, 160–78. London, UK: SAGE. https://doi.org/10.4135/9781473984066.
 * Casas, Andreu, and Nora Webb Williams. 2019. “Images That Matter: Online Protests and the Mobilizing Role of Pictures.” Political Research Quarterly 72 (2): 360–75. https://doi.org/10.1177/1065912918786805.
 * Casas, Andreu, and Nora Webb Williams. 2017. “Computer Vision for Political Science Research: A Study of Online Protest Images.” Working paper. Pennsylvania State University. http://andreucasas.com/casas_webb_williams_NewFaces2017_images_as_data.pdf.
 * Hochman, Nadav, and Raz Schwartz. 2012. “Visualizing Instagram: Tracing Cultural Visual Rhythms.” In Sixth International AAAI Conference on Weblogs and Social Media. http://www.aaai.org/ocs/index.php/ICWSM/ICWSM12/paper/view/4782.
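To make the "visual rhythms" idea concrete, here is a toy sketch in the spirit of Hochman and Schwartz's approach (the pixel values and hours are invented, and real work would decode images with a library such as Pillow): reduce each image to one summary number, then order those summaries in time.

```python
# Each entry stands in for one decoded grayscale image (values 0-255),
# keyed by the hour it was posted.
images_by_hour = {
    8:  [[30, 40], [50, 60]],      # darker morning photo
    13: [[200, 210], [190, 220]],  # bright midday photo
    21: [[20, 25], [15, 30]],      # dark night photo
}

def mean_brightness(pixels):
    """Collapse a 2D pixel grid to its average intensity."""
    flat = [value for row in pixels for value in row]
    return sum(flat) / len(flat)

# The "rhythm": one summary value per time slice, in temporal order.
rhythm = {hour: mean_brightness(px) for hour, px in sorted(images_by_hour.items())}
print(rhythm)  # {8: 45.0, 13: 205.0, 21: 22.5}
```

The same pattern generalizes: swap mean brightness for hue, saturation, or a classifier's output, and you have the kind of aggregate visual summaries these papers build at the scale of whole cities.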

Optional Readings:


 * Torralba, Antonio. 2009. “Understanding Visual Scenes.” Tutorial presented at the NIPS, Vancouver, BC, Canada. http://videolectures.net/nips09_torralba_uvs/.
 * Note: This is a two-part lecture and tutorial (each part is one hour) by an expert in computer vision. I strongly recommend watching Part I. I think it gives you a good sense of the kinds of challenges that were (and still are) facing the field of computer vision and anybody trying to have their computer look at images.


 * Hochman, Nadav, and Lev Manovich. 2013. “Zooming into an Instagram City: Reading the Local through Social Media.” First Monday 18 (7). https://firstmonday.org/article/view/4711/3698.

These five papers are all technical approaches to image classification using Internet-based datasets of images like Flickr, Google Image Search, Google Street View, or Instagram. Each describes interesting and challenging technical issues. If you're interested, it would be a great idea to read these to get a sense of the state of the art and what is and isn't possible:


 * Jaffe, Alexandar, Mor Naaman, Tamir Tassa, and Marc Davis. 2006. “Generating Summaries and Visualization for Large Collections of Geo-Referenced Photographs.” In Proceedings of the 8th ACM International Workshop on Multimedia Information Retrieval, 89–98. MIR ’06. New York, NY, USA: ACM. https://doi.org/10.1145/1178677.1178692.
 * Simon, Ian, Noah Snavely, and Steven M. Seitz. 2007. “Scene Summarization for Online Image Collections.” In Computer Vision, IEEE International Conference On, 0:1–8. Los Alamitos, CA, USA: IEEE Computer Society. https://doi.org/10.1109/ICCV.2007.4408863.
 * Crandall, David J., Lars Backstrom, Daniel Huttenlocher, and Jon Kleinberg. 2009. “Mapping the World’s Photos.” In Proceedings of the 18th International Conference on World Wide Web, 761–770. WWW ’09. New York, NY, USA: ACM. https://doi.org/10.1145/1526709.1526812.
 * San Pedro, Jose, and Stefan Siersdorfer. 2009. “Ranking and Classifying Attractiveness of Photos in Folksonomies.” In Proceedings of the 18th International Conference on World Wide Web, 771–780. WWW ’09. New York, NY, USA: ACM. https://doi.org/10.1145/1526709.1526813.
 * Doersch, Carl, Saurabh Singh, Abhinav Gupta, Josef Sivic, and Alexei A. Efros. 2012. “What Makes Paris Look like Paris?” ACM Trans. Graph. 31 (4): 101:1–101:9. https://doi.org/10.1145/2185520.2185597.

Week 9: Tuesday March 3: Consulting Week
During this week, we will not meet together. Instead, I will schedule one-on-one in-person meetings of an hour with each student individually to catch up with you about your project and to work directly with you to resolve any technical issues you have run into with data collection.

Week 10: Tuesday March 10: Final Presentations
<!--

Part I: Design Research
Today we'll have a guest visitor — Andrés Monroy-Hernández, who is director of HCI research at Snap and formerly of Microsoft Research's FUSE Labs. Andrés is affiliate faculty in the Department of Communication and the Department of Human-Centered Design and Engineering at UW. Monroy-Hernández's research involves studying people by designing and building systems. He has built a number of very large and successful socio-technical systems as part of his research. In his graduate work, he built the Scratch online community, which is now used by more than 10 million people.

I've asked him to come and talk to us about design research as a process. As a result, it will be helpful to read about two projects he has worked on recently that he will talked to us about. Those projects are called NewsPad and Eventful.

Required Readings:


 * Olsen, D. R., Jr. (2007). Evaluating User Interface Systems Research. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology (pp. 251–258). New York, NY, USA: ACM. [Available through UW Libraries]
 * J. Nathan Matias and Andres Monroy-Hernandez, NewsPad: Designing for Collaborative Storytelling in Neighborhoods. CHI Work in Progress Paper. ACM, March 2014.
 * Elena Agapie, Jaime Teevan, and Andrés Monroy-Hernández, Crowdsourcing in the Field: A Case Study Using Local Crowds for Event Reporting, in Human Computation (HCOMP), AAAI - Association for the Advancement of Artificial Intelligence, August 2015.
 * Two very short videos describing the systems: NewsPad by FUSE Labs and Eventful by FUSE Labs

Part II: Digital Trace and Sensor Data
Required Readings:

Read any 2 of these 4 chapters from the Handbook of Emerging Technology in Social Research:


 * Eagle, Nathan, "Mobile phones as sensors for social research," Ch. 22 in HET.
 * Visser, Albertine and Ingrid Mulder, "Emergent technologies for assessing social feelings and experiences," Ch. 16 in HET.
 * de Haan, Geert, et. al., "Bringing the research lab into everyday life: Exploiting sensitive environments to acquire data for social research," Ch. 23 in HET.
 * Fowler, Chris, et. al., "Living laboratories: Social research applications and evaluations," Ch. 27 in HET.
 * Holohan, Anne, et. al., "The digital home: A new locus of social science research," Ch. 28 in HET.

-->

Your Presence in Class
As detailed in the section on participation and in my page on assessment, class participation is a critical way that I will assess learning in the class. Obviously, you must be in class in order to participate. If you need to miss class for any reason, please contact me ahead of time (email is best). In the event of an absence, you are responsible for obtaining class notes, handouts, assignments, etc.

Office Hours
I will hold office hours on Thursdays 1-2pm in Communications (CMU) 333. In addition to my scheduled office hours, I am generally in my lab in CMU 306. Feel free to stop by at any time or to contact me to arrange a time to meet.

Religious Accommodations
Washington state law requires that UW develop a policy for accommodation of student absences or significant hardship due to reasons of faith or conscience, or for organized religious activities. The UW’s policy, including more information about how to request an accommodation, is available at Religious Accommodations Policy. Accommodations must be requested within the first two weeks of this course using the Religious Accommodations Request form.

Student Conduct
The University of Washington Student Conduct Code (WAC 478-121) defines prohibited academic and behavioral conduct and describes how the University holds students accountable as they pursue their academic goals. Allegations of misconduct by students may be referred to the appropriate campus office for investigation and resolution. More information can be found online at https://www.washington.edu/studentconduct/.

Safety
Call SafeCampus at 206-685-7233 anytime–no matter where you work or study–to anonymously discuss safety and well-being concerns for yourself or others. SafeCampus’s team of caring professionals will provide individualized support, while discussing short- and long-term solutions and connecting you with additional resources when requested.

Academic Dishonesty
This includes: cheating on assignments, plagiarizing (misrepresenting work by another author as your own, paraphrasing or quoting sources without acknowledging the original author, or using information from the internet without proper citation), and submitting the same or similar paper to meet the requirements of more than one course without instructor approval. Academic dishonesty in any part of this course is grounds for failure and further disciplinary action. The first incident of plagiarism will result in the student’s receiving a zero on the plagiarized assignment. The second incident of plagiarism will result in the student’s receiving a zero in the class.

Disability Resources
If you have already established accommodations with Disability Resources for Students (DRS), please communicate your approved accommodations to me at your earliest convenience so we can discuss your needs in this course.

If you have not yet established services through DRS, but have a temporary health condition or permanent disability that requires accommodations (conditions include but are not limited to: mental health, attention-related, learning, vision, hearing, physical, or health impacts), you are welcome to contact DRS at 206-543-8924, uwdrs@uw.edu, or disability.uw.edu. DRS offers resources and coordinates reasonable accommodations for students with disabilities and/or temporary health conditions. Reasonable accommodations are established through an interactive process between you, your instructor(s), and DRS. It is the policy and practice of the University of Washington to create inclusive and accessible learning environments consistent with federal and state law.

Other Student Support
Any student who has difficulty affording groceries or accessing sufficient food to eat every day, or who lacks a safe and stable place to live, and believes this may affect their performance in the course, is urged to contact the graduate program advisor for support. Furthermore, please notify me if you are comfortable doing so. This will enable me to provide any resources that I may possess (adapted from Sara Goldrick-Rab). Please also note the student food pantry, Any Hungry Husky at the ECC.

Credit and Notes
This will be the third time I have taught this course at UW in its current form. This syllabus draws heavily from previous versions. Syllabuses from earlier classes can be found online at:


 * Internet Research Methods (Spring 2016)
 * Internet Research Methods (Spring 2015)

This syllabus was inspired by, and borrows with permission from, a syllabus from an earlier version of this class taught by Kirsten Foot. Professor Foot last taught the course in Spring 2014.