Internet Research Methods (Spring 2016)


 * Designing Internet Research
 * COM528 - Department of Communication, University of Washington
 * Instructor: Benjamin Mako Hill (University of Washington)
 * Course Websites:
 * We will use Canvas for announcements, turning in assignments, and discussion
 * Everything else will be linked on this page.
 * Course Catalog Description:


 * Focuses on designing Internet research, assessing the adaptation of proven methods to Internet tools and environments, and developing new methods in view of particular capacities and characteristics of Internet applications. Legal and ethical aspects of Internet research receive ongoing consideration.

Overview and Learning Objectives
What new lines of inquiry and approaches to social research are made possible and necessary by the Internet? In what ways have established research methods been affected by the Internet? How does the Internet challenge established methods of social research? How are researchers responding to these challenges?

These are some of the key questions we will explore in this course. The course will focus on assessing the incorporation of Internet tools in established and emergent methods of social research, the adaptation of social research methods to study online phenomena, and the development of new methods and tools that correspond with the particular capacities and characteristics of the Internet. The readings will include both descriptions of Internet-related research methods, with an eye toward introducing skills, and examples of studies that use them. The legal and ethical aspects of Internet research will receive ongoing consideration throughout the course. The purpose of this course is to help prepare students to design high quality research projects that use the Internet to study online communicative, social, cultural, and political phenomena.

I will consider the course a complete success if every student is able to do all of these things at the end of the quarter:


 * Discuss and compare distinct types of Internet research including: web archiving; textual analysis; ethnography; interviews; network analyses of social and hyperlink networks; analysis of digital trace data; traditional, natural, and field experiments; design research; survey research; and narrative and visual analyses.
 * Describe particular challenges and threats to research validity associated with each method.
 * For at least one method, be able to provide a detailed description of a research project and feel comfortable embarking on a formative study using this methodology.
 * Given a manuscript (e.g., in the context of a request for peer review), be able to evaluate an Internet-based study in terms of its methodological choices.
 * Use a modern programming language (e.g., Python) to collect a dataset from a web API such as those provided by Twitter and Wikipedia.

Note About This Syllabus
You should expect this syllabus to be a dynamic document and you will notice that there are a few places marked "To Be Determined." Although the core expectations for this class are fixed, the details of readings and assignments will shift. As a result, there are three important things to keep in mind:


 * 1) Although details on this syllabus will change, I will not change readings or assignments less than one week before they are due. If I don't fill in a "To Be Determined" one week before it's due, it is dropped. If you plan to read more than one week ahead, contact me first.
 * 2) Closely monitor your email or the announcements section on the course website on Canvas. When I make changes, these changes will be recorded in the history of this page so that you can track what has changed and I will summarize these changes in an announcement on Canvas that will be emailed to everybody in the class.
 * 3) I will ask the class for voluntary anonymous feedback frequently — especially toward the beginning of the quarter. Please let me know what is working and what can be improved. In the past, I have made many adjustments based on this feedback.

Books
This class has no textbook and I am not requiring you to buy any books for this class. That said, several required readings, and many suggested readings, will come from a few excellent books that you might want to consider adding to your library:


 * 1) Hesse-Biber, S. N. (Ed.). (2011). The Handbook of Emergent Technologies in Social Research (1st edition). New York: Oxford University Press.
 * 2) Rogers, R. (2013). Digital Methods. Cambridge, Massachusetts: The MIT Press.
 * 3) Ackland, R. (2013). Web Social Science. SAGE Publications Ltd.
 * 4) Hargittai, E., & Sandvig, C. (2015). Digital Research Confidential: The Secrets of Studying Behavior Online (1st edition). The MIT Press.

Technical Skills
Nearly all of our structured in-person meetings and all of our readings will focus on teaching conceptual skills related to Internet research. These are the "softer" skills of understanding, designing, and critiquing research plans. They are harder to teach, evaluate, and learn but are ultimately what will make a research project interesting, useful, or valid. When the course has been taught in the past by other faculty, it has been entirely focused on these types of conceptual skills.

That said, I also believe that any skilled Internet researcher must be comfortable writing code to collect a dataset from the web or, at the very least, should have enough experience doing so that they know what is involved and what is possible and impossible. This is essential even if your only goal is to manage somebody else writing code and gathering data or to work productively with a collaborator who is doing so. As a result, being successful in this class will also require technical skills.

Because students are going to come to the class with different technical skillsets, we will devote only a relatively small chunk of class time to developing technical skills. Instead, I'm requiring that students build these skills outside of our meetings together if they do not have them already.

In particular, I want every student to have the following three things:


 * 1) Basic skills in a general purpose high-level programming language used for Internet-based data collection and analysis. I strongly recommend the Python programming language, although other programming languages like Ruby and Perl are also good choices. Generally speaking, statistical programming languages like R, Stata, and Matlab are not well suited for this.
 * 2) Familiarity with the technologies of web APIs. In particular, students should understand what APIs are, how they work, and should be able to read, interpret, and process data in JSON.
 * 3) Knowledge of how to process and move data from a website or API into a format that they will be able to use for analysis. The final format will depend on the nature of the result, but this might be a statistical programming environment like R, Stata, SAS, SPSS, etc., or a qualitative data analysis tool like ATLAS.ti, NVivo, or Dedoose.
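To make the three skills above concrete, here is a minimal Python sketch of the full pipeline: building a query URL for the Wikipedia API, flattening a JSON response into rows, and writing those rows out as CSV for import into a statistical or qualitative analysis tool. The sample payload and page lengths below are made-up stand-ins for what the live API returns; a real script would fetch the URL over the network with a library like urllib or requests.

```python
import csv
import io
import json
from urllib.parse import urlencode


def build_query_url(titles):
    """Construct a Wikipedia API query URL for a list of page titles."""
    params = {
        "action": "query",
        "prop": "info",
        "titles": "|".join(titles),
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)


def pages_to_rows(payload):
    """Flatten the API's nested JSON structure into (title, length) rows."""
    data = json.loads(payload)
    pages = data["query"]["pages"]
    return [(p["title"], p["length"]) for p in pages.values()]


# A hand-made stand-in for a live API response (same nested shape;
# the page IDs and lengths here are invented for illustration).
sample = json.dumps({
    "query": {"pages": {
        "1": {"title": "Content analysis", "length": 24310},
        "2": {"title": "Netnography", "length": 30125},
    }}
})

rows = pages_to_rows(sample)

# Write the rows as CSV, ready to load into R, SPSS, ATLAS.ti, etc.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["title", "length"])
writer.writerows(rows)
```

This is exactly the kind of small end-to-end exercise the Community Data Science Workshops walk through, and the JSON-to-rows step is where most of the real work of Internet data collection happens.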

If you are already comfortable doing these things, great.

If you are not yet comfortable, I am going to be organizing a series of free workshops called the Community Data Science Workshops in April and May, and I very strongly recommend that you attend them. The workshops will teach exactly the skills I'm expecting you to have, and attending the full series of workshops will be enough to fulfill this requirement.

The workshops will meet four times so please block these out on your calendar now:


 * 1) Friday 4/8 6-9pm
 * 2) Saturday 4/9 9:45am-4pm
 * 3) Saturday 4/23 9:45am-4pm
 * 4) Saturday 5/7 9:45am-4pm

I've offered these workshops four times previously and they have always been oversubscribed. As a result, you should register for these workshops immediately. You can find the registration link on this page. Please mention that you are in this class when you register so that we make sure to accept your application.

I have taught these workshops four times before in 2014 and 2015. If you have taken them in the past, you do not need to take them again. If you took them before but are feeling unsure about your skills, you will be welcome to come back to review and brush up on the material.

If you do not have the technical skills described above and you will not attend the workshops, you're going to be responsible for learning this material on your own. Although this is totally fine, I suspect it will present a major challenge to success in this class. If you will be in this situation, contact me before the quarter starts.

Assignments
The assignments in this class are designed to give you an opportunity to try your hand at using the conceptual material taught in the class. There will be no exams or quizzes. Unless otherwise noted, all assignments are due at the end of the day (i.e., 11:59pm on the day they are due).

Method Presentation
Related to participation, every student will be assigned a research method and asked to investigate how it is being adapted to or developing within Internet studies and to report on these results in a new Wikipedia article or in a major revision of an existing article.

The article should include several links to, and examples of, the method from published literature, an assessment of the potential affordances and constraints of this method for Internet research, a neutral and even-handed critique of some of the ways it has been employed in Internet research to date, and a list of references. All of these should be formatted according to Wikipedia policies.

Links to articles will be distributed ahead of class and all students will be expected to read them before we meet.

Wikipedia Task #1 - Create an account and Wikipedia orientation

 * Due: April 8
 * Deliverables: Make contributions in Wikipedia


 * Finish the online student orientation for our Wikipedia course. During this training, you will create an account, make edits in a sandbox, and learn the basic rules of the Wikipedia community.
 * Create a user page, and sign up on the list of students on the course dashboard.
 * To practice editing and communicating on Wikipedia, introduce yourself to me and at least one classmate on Wikipedia.

Wikipedia Task #2 - Draft of Wikipedia Article

 * Maximum Length: 2000 words
 * Deliverables: Make contributions in Wikipedia and share link in Canvas discussion
 * Due Date: 11:59pm on the day before the class session in which we will discuss the method


 * Compile a bibliography of relevant research.
 * Write or expand a Wikipedia article on the method you have selected — with citations — in your Wikipedia sandbox.
 * Add your sandboxed article to the class's course page with the template.

Wikipedia Task #3 - Finalize and Peer Review Your Classmates' Articles

 * Deliverables: Make contributions in Wikipedia
 * Due Date: June 12


 * Move sandbox articles into the main namespace.
 * Peer review two of your classmates' articles. Leave suggestions on the article talk pages.
 * Copy-edit the two reviewed articles.
 * Make edits to your article based on peers' feedback. If you disagree with a suggestion, use talk pages to politely discuss and come to a consensus on your edit.

Discussion Facilitation

 * Due Date: Class session in which we will discuss the method


 * In addition to the essay, you will be responsible for facilitating the discussion of your assigned method in class. This means you should come prepared with questions and notes.

Research Project
As a demonstration of your learning in this course, you will design a plan for an Internet research project and will, if possible, also collect (at least) an initial sample of a dataset that you will use to complete the project.

The genre of the paper you produce can be one of the following three things:


 * 1) A draft of a manuscript for submission to a conference or journal.
 * 2) A proposal for funding (e.g., for submission to the NSF for a graduate student fellowship).
 * 3) A draft of the methods chapter of your dissertation.

In any of the three paths, I expect you to take this opportunity to produce a document that will further your academic career outside of the class.

Project Identification

 * Due Date: April 10
 * Maximum paper length: 500 words (~1-2 pages)
 * Deliverables: Turn in in Canvas

Early on, I want you to identify your final project. Your proposal should be short and can be either paragraphs or bullets. It should include the following things:


 * The genre of the project and a short description of how it fits into your career trajectory.
 * A one paragraph abstract of the proposed study and research question, theory, community, and/or groups you plan to study.
 * A short description of the type of data you plan to collect as part of your final project.

Final Project

 * Outline Due Date: May 8
 * Maximum outline length: 2 pages
 * Paper Due Date: June 12
 * Maximum paper length: 6000 words (~20 pages)
 * Presentation Date: June 2
 * All Deliverables: Turn in in Canvas

Because the emphasis in this class is on methods and because I'm not an expert in each of your areas or fields, I'm happy to assume that your paper, proposal, or thesis chapter has already established the relevance and significance of your study and has a comprehensive literature review, a well-grounded conceptual approach, and a compelling reason why this research is important. Instead of providing all of these details, feel free to start with a brief summary of the purpose and importance of this research and an introduction of your research questions or hypotheses. If you provide more detail, that's fine, but I won't give you detailed feedback on those parts.

The final paper should include:


 * a statement of the purpose, central focus, relevance and significance of this research;
 * a description of the specific Internet application(s) and/or environment(s) and/or objects to be studied and employed in the research;
 * key research questions or hypotheses;
 * operationalization of key concepts;
 * a description and rationale of the specific method(s), (if more than one method will be used, explain how the methods will produce complementary findings);
 * a description of the step-by-step plan for data collection;
 * description and rationale of the level(s), unit(s) and process of analysis (if more than one kind of data are generated, explain how each kind will be analyzed individually and/or comparatively);
 * an explanation of how these analyses will enable you to answer the research questions;
 * a sample instrument (as appropriate);
 * a sample dataset and description of a formative analysis you have completed;
 * a description of actual or anticipated results and any potential problems with their interpretation;
 * a plan for publishing/disseminating the findings from this research;
 * a summary of technical, ethical, human subjects and legal issues that may be encountered in this research, and how you will address them;
 * a schedule (using specific dates) and proposed budget.

I also expect each student to begin collecting data for your project (i.e., using the technical skills you learn in the class) and to describe your progress in this regard in your paper. If collecting data for a proposed project is impractical (e.g., because of IRB applications, funding, etc.), I would love for you to engage in the collection of a public dataset as part of a pilot or formative study. If this is not feasible or useful, we can discuss other options.

I have a strong preference for you to write this paper individually but I'm open to the idea that you may want to work with others in the class.

Participation
The course relies heavily on participation and discussion. It is important to realize that we will not summarize the readings in class and I will not cover them in lecture. I expect you all to have read them, and we will jump in and start discussing. The "Participation Rubric" section of my detailed page on assessment gives the rubric I will use in evaluating participation.

Grading
I have put together a very detailed page that describes the grading rubric I will be using in this course. Please read it carefully. I will assign grades for each of the following items on the UW 4.0 grade scale according to the weights below:


 * Participation: 25%
 * Presentation of method/approach: 15%
 * Project identification: 5%
 * Final paper outline: 5%
 * Final Presentation: 10%
 * Final Paper: 40%

Week 1: Monday March 28: Introduction and Framing
Resources:


 * Week 1 Reading Note — Read this first!

Required Readings:


 * Agre, Philip, “Internet Research: For and Against.” in Mia Consalvo, Nancy Baym, Jeremy Hunsinger, Klaus Bruhn Jensen, John Logie, Monica Murero, and Leslie Regan Shade, eds, Internet Research Annual, Volume 1: Selected Papers from the Association of Internet Researchers Conferences 2000-2002, New York: Peter Lang, 2004. [Free Online]
 * Sandvig, Christian, 2010, "Why the Internet is On the Verge of Blowing Up All Our Methods Courses." [Free Online]
 * Lazer, D., Pentland, A., Adamic, L., Aral, S., Barabasi, A. L., Brewer, D., … Van Alstyne, M. (2009). Computational Social Science. Science, 323(5915), 721–723. [Available through UW Libraries]

Optional Reading:


 * Gane, Nicholas, and Beer, David, 2008, "Introduction: Concepts and Media." from New Media: The Key Concepts, Berg, pp. 1-13. [Available in Canvas]
 * Bruhn Jensen, Klaus, 2011, "New media, old methods — Internet methodologies and the online/offline divide," in Consalvo & Ess (Eds.), The Handbook of Internet Studies, Blackwell, pp. 43-58. [Available in Canvas]
 * Hesse-Biber, Sharlene Nagy, "Emergent Technologies in Social Research: Pushing Against the Boundaries of Research Praxis," [HET], pp. 3-24. [Available in Canvas]
 * December, John. (March, 1996). "Units of Analysis for Internet Communication." Journal of Computer-Mediated Communication, V.1, N.4. [Available through UW libraries]
 * Steven M. Schneider & Kirsten A. Foot, "The Web as an Object of Study." New Media and Society, V. 6, N.1, 114-122, 2004. [Free Online]
 * Gunkel, David, "To Tell the Truth: The Internet and Emergent Epistemological Challenges in Social Research," [HET], pp. 47-64. [Available in Canvas]
 * Baym, Nancy. (2006). "Finding the Quality in Qualitative Internet Research," in Critical Cyberculture Studies, David Silver and Adrienne Massanari, eds., New York University Press, NY. pp. 79-87. [Available in Canvas]
 * Hackett, Edward, "Possible dreams: Research technologies and the transformation of the human sciences," Ch 1 in HET. [Available in Canvas]

Week 1: Wednesday March 30: Ethics
Resources:


 * Week 1 Reading Note — Read this first!

Required Readings:


 * Association of Internet Researchers, Ethics Working Committee, 2011, “Ethics Guidelines Review Draft.” [Free Online] (Browseable Web Version)
 * Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. [Available through UW Libraries]
 * [Look Over Briefly] Grimmelmann, James. (2014) The Facebook Emotional Manipulation Study: Sources. [Free Online]
 * Carr, N. (2014, September 14). The Manipulators: Facebook’s Social Engineering Project. Retrieved March 26, 2015. [Free Online]
 * Bernstein, M. (2014, July 7). The Destructive Silence of Social Computing Researchers. Retrieved March 26, 2015. [Free Online]
 * Lampe, C. (2014, July 8). Facebook Is Good for Science. [Free Online]
 * Monroy-Hernández, Andrés, and Benjamin Mako Hill. (2016) "The Scratch Online Community Dataset." Working Paper. Seattle, Washington. — Read at least the Background & Summary, Setting, Defining Public Data, and Research Ethics sections and skim the rest. Please also read the Scratch Data Sharing Agreement (Draft as of 2016-03-28).

Optional Readings:


 * The Belmont Report. (1979).
 * American Association for the Advancement of Science, 1999, “Ethical and Legal Aspects of Human Subjects Research in Cyberspace.” [Free Online]
 * Digital Millennium Copyright Act and these explanatory/commentary essays & sites:
 * The Electronic Frontier Foundation's page on the DMCA.
 * Brad Templeton's A Brief Intro to Copyright & 10 Big Myths about Copyright Explained
 * Sections on Copyright, Privacy, and Social Media in the “Internet Case Digest” of the Perkins Coie LLP “Case Digest” site.
 * Amy Bruckman's two 2016 blog posts about researchers violating terms of Service (TOS) while doing academic research: Do Researchers Need to Abide by Terms of Service (TOS)? An Answer. and More on TOS: Maybe Documenting Intent Is Not So Smart
 * Narayanan, A., & Shmatikov, V. (2008). Robust De-anonymization of Large Sparse Datasets. In IEEE Symposium on Security and Privacy, 2008. SP 2008 (pp. 111–125). http://doi.org/10.1109/SP.2008.33 [Available through UW Libraries]
 * Markham, A. (2012). Fabrication as Ethical Practice. Information, Communication & Society, 15(3), 334–353. [Available through UW Libraries]
 * Trevisan, F., & Reilly, P. (2014). Ethical dilemmas in researching sensitive issues online: lessons from the study of British disability dissent networks. Information, Communication & Society, 17(9), 1131–1146. [Available through UW Libraries]
 * Bond, R. M., Fariss, C. J., Jones, J. J., Kramer, A. D. I., Marlow, C., Settle, J. E., & Fowler, J. H. (2012). A 61-million-person experiment in social influence and political mobilization. Nature, 489(7415), 295–298. [Available through UW Libraries]
 * Bruckman, A., Luther, K., & Fiesler, C. (2015). When Should We Use Real Names in Published Accounts of Internet Research? In Digital Research Confidential: The Secrets of Studying Behavior Online (pp. 243–259). Cambridge, Massachusetts: MIT Press.

Week 2: Wednesday April 6: Web Archiving
Facilitator: Janny

Required Readings:


 * Shumate, M., & Weber, M. S. (2015). The Art of Web Crawling for Social Science Research. In E. Hargittai & C. Sandvig (Eds.), Digital Research Confidential: The Secrets of Studying Behavior Online (1 edition). The MIT Press. [Available in Canvas]
 * Schneider, Steven, Kirsten Foot, and Paul Wouters, 2009, “Web Archiving as E-Research,” in e-Research: Transformation in Scholarly Practice, Nicholas Jankowski (Ed.), Routledge, pp. 205-221. [Available in Canvas]
 * Brügger, N. (2011). Web archiving—Between past, present, and future. In M. Consalvo & C. Ess (Eds.), The Handbook of Internet Studies (pp. 24–42). Chichester, West Sussex: Blackwell. [Available in Canvas]
 * Rogers, Richard, Chapter 3 "The Website as Archived Object" from Digital Methods, pp. 61-82. [Available through Canvas]
 * Graeff, E., Stempeck, M., & Zuckerman, E. (2014). The battle for “Trayvon Martin”: Mapping a media controversy online and off-line. First Monday, 19(2). [Free Online]

Optional Readings:


 * Gherab-Martin, Karim, "Digital repositories, folksonomies, and interdisciplinary research: New social epistemology tools," Ch. 10 in HET. [Available in Canvas]
 * Digital Methods Initiative. (2009). The Spheres. [Free Online]
 * Rogers, Richard, Chapter 4 "Googlization and the Inculpable Search Engine" from Digital Methods. [Available through Canvas]
 * Schneider, S. M., & Foot, K. A. (2004). The Web as an Object of Study. New Media & Society, 6(1), 114–122. [Available through UW Libraries]
 * Spaniol, M., Denev, D., Mazeika, A., Weikum, G., & Senellart, P. (2009). Data Quality in Web Archiving. In Proceedings of the 3rd Workshop on Information Credibility on the Web (pp. 19–26). New York, NY, USA: ACM. [Available through UW Libraries]
 * Archive Team is an online community that archives websites. They are a fantastic resource and include many pieces of detailed technical documentation on the practice of engaging in web archiving. For example, here are detailed explanations of mirroring a website with GNU wget which is the piece of free software I usually use to archive websites.
 * Weber, M. S. (2014). Observing the Web by Understanding the Past: Archival Internet Research. In Proceedings of the Companion Publication of the 23rd International Conference on World Wide Web Companion (pp. 1031–1036). Republic and Canton of Geneva, Switzerland: International World Wide Web Conferences Steering Committee. [Available through UW Libraries]

Week 2: Friday April 8: CDSW Session 0
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Spring 2016) which I am running concurrently with this class.

This session will run from 6-9pm and is the only session that can probably be safely missed. Please do contact me, however, if you will not be able to attend it.

Week 2: Saturday April 9: CDSW Session 1
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Spring 2016) which I am running concurrently with this class.

This session will run from 9am-3pm. Details on the CDSW Spring 2016 page.

Week 3: Monday April 11: Textual Analyses
Facilitator: Adam

Required Readings:


 * McMillan, S. J. (2000). The microscope and the moving target: The challenge of applying content analysis to the World Wide Web. Journalism and Mass Communication Quarterly, 77(1), 80-98. [Available through UW Libraries]
 * Mishne, Gilad and Natalie Glance (2006), “Leave a reply: An analysis of weblog comments.” Third Annual Conference on the Weblogging Ecosystem, at WWW 2006. [Free Online]
 * Grimmer, J., & Stewart, B. M. (2013). Text as Data: The Promise and Pitfalls of Automatic Content Analysis Methods for Political Texts. Political Analysis. [Available through UW Libraries]
 * DiMaggio, P., Nag, M., & Blei, D. (2013). Exploiting affinities between topic modeling and the sociological perspective on culture: Application to newspaper coverage of U.S. government arts funding. Poetics, 41(6), 570–606. [Available through UW Libraries]

Optional Readings:

I'm assuming you have at least a rough familiarity with content analysis as a methodology. If you're not as comfortable with this, check out the Wikipedia article to start. These readings provide more background on content analysis (in general, and online):


 * Van Selm, Martine & Jankowski, Nick, (2005) "Content Analysis of Internet-Based Documents." Unpublished Manuscript. [Available in Canvas]
 * Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, Calif.: Sage Publications. [Available from Instructor]
 * Krippendorff, K. (2005). Content analysis: an introduction to its methodology. Thousand Oaks; London; New Delhi: Sage. [Available from Instructor]

Examples of more traditional content analysis using online content:


 * Trammell, K. D., Tarkowski, A., Hofmokl, J., & Sapp, A. M. (2006). Rzeczpospolita blogów (Republic of Blog): Examining Polish Bloggers Through Content Analysis. Journal of Computer-Mediated Communication, 11(3), 702–722. [Free Online]
 * Woolley, J. K., Limperos, A. M., & Oliver, M. B. (2010). The 2008 Presidential Election, 2.0: A Content Analysis of User-Generated Political Facebook Groups. Mass Communication and Society, 13(5), 631–652. [Available from UW Libraries]

Another example of topic modeling, but from political science:


 * Barberá, P., Bonneau, R., Egan, P., Jost, J. T., Nagler, J., & Tucker, J. (2014). Leaders or Followers? Measuring Political Responsiveness in the US Congress Using Social Media Data. Presented at the Annual Meeting of the American Political Science Association. [Free Online]

Week 3: Wednesday April 13: Digital Ethnography & Trace Ethnography
Session Coordinator: Nate

Required Readings:


 * Robinson, Laura and Jeremy Schulz, "New fieldsites, new methods: New ethnographic opportunities," Ch. 8 in HET. [Available in Canvas]
 * [Selections] Jemielniak, D. (2014). Common Knowledge?: An Ethnography of Wikipedia. Stanford, California: Stanford University Press. "Introduction" and "Appendix A: Methodology." [Available in Canvas]
 * Geiger, R. S., & Ribes, D. (2010). The Work of Sustaining Order in Wikipedia: The Banning of a Vandal. In Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work (pp. 117–126). New York, NY, USA: ACM. [Available through UW Libraries]
 * Geiger, R. S., & Ribes, D. (2011). Trace Ethnography: Following Coordination Through Documentary Practices. In Proceedings of the 2011 44th Hawaii International Conference on System Sciences (pp. 1–10). Washington, DC, USA: IEEE Computer Society. [Available through UW Libraries]

Optional Readings:


 * Coleman, E. G. (2010). Ethnographic Approaches to Digital Media. Annual Review of Anthropology, 39(1), 487–505. [Available through UW Libraries]
 * Response by danah boyd To Hine's "Question One: How Can Qualitative Internet Researchers Define the Boundaries of Their Projects?" from Internet Inquiry: Conversations About Method, Annette Markham and Nancy Baym (Eds.), Sage, 2009, pp. 1-32. [Available in Canvas]
 * Note: You may also be interested in reading the essay by Hine that boyd is responding to. [Available in Canvas]

This is the canonical book-length account and the main citation in this space:


 * Hine, C. (2000). Virtual ethnography. London, UK: SAGE Publications. [Available from Instructor]

These are all other interesting and/or frequently cited examples of Internet-based ethnographies:


 * Humphreys, L. (2007). Mobile Social Networks and Social Practice: A Case Study of Dodgeball. Journal of Computer-Mediated Communication, 13(1), 341–360. [Available through UW Libraries]
 * Note: Dodgeball is a mobile social network system (MSNS) that allows groups of friends to connect and meet up via mobile phone. The author employed participant observation in order to understand norms of interaction in the MSNS "space".


 * Brotsky, S. R., & Giles, D. (2007). Inside the “Pro-ana” Community: A Covert Online Participant Observation. Eating Disorders, 15(2), 93–109. [Available through UW Libraries]
 * Note: To conduct the study reported in this paper, the authors created and used a fake profile in order to observe the psychological support offered to participants.


 * Williams, M. (2007). Avatar watching: participant observation in graphical online environments. Qualitative Research, 7(1), 5–24. [Available through UW Libraries]
 * Note: A fantastic general introduction, but with takeaways that are more specifically targeted toward people studying virtual-reality-type environments with virtual physicality.

Apropos of class discussion:


 * Borges, J. L. (1998). Pierre Menard, author of the Quixote. In A. Hurley (Trans.), Collected Fictions (pp. 88–95). New York, N.Y., U.S.A: Viking Press. [Available in Canvas]
 * Maxwell, J. A. (2002). Understanding and validity in qualitative research. In A. M. Huberman & M. B. Miles (Eds.), The Qualitative Researcher’s Companion (pp. 37–64). SAGE. [Available in Canvas]

Week 4: Monday April 18: Online Interviews
Facilitator: Julia

Required Readings:


 * O’Connor, H., Madge, C., Shaw, R., & Wellens, J. (2008). Internet-based Interviewing. In N. G. Fielding, R. M. Lee, & G. Blank (Eds.), The SAGE Handbook of Online Research Methods (pp. 271–289). London, UK: SAGE Publications, Ltd. [Available through UW Libraries]
 * Stewart, K., & Williams, M. (2005). Researching online populations: the use of online focus groups for social research. Qualitative Research, 5(4), 395–416.
 * Hanna, P. (2012). Using internet technologies (such as Skype) as a research medium: a research note. Qualitative Research, 12(2), 239–242. [Available through UW Libraries]
 * Note: Short article you can basically skim. Read it quickly so you can cite it later.


 * Dowling, S. (2012). Online Asynchronous and Face-to-Face Interviewing: Comparing Methods for Exploring Women’s Experiences of Breastfeeding Long Term. In Salmons, J. (Ed.), Cases in Online Interview Research (pp. 277–303). Thousand Oaks, CA: SAGE Publications, Inc. [Available through UW Libraries]

Optional Readings:


 * boyd, danah. (2015). Making sense of teen life: Strategies for capturing ethnographic data in a networked era. In E. Hargittai & C. Sandvig (Eds.), Digital Research Confidential: The Secrets of Studying Behavior Online. Cambridge, Massachusetts: The MIT Press. [Available in Canvas]
 * Note: Strongly focused on ethnographic interviews, with many very specific details. Fantastic article on interviewing, although perhaps a bit weak on Internet-specific advice.


 * Markham, Annette (1998), "The Shifting Project, the Shifting Self," from Life Online, Altamira Press, 1998, pp. 61-84. [Available in Canvas]
 * Note: One of the earliest books on online life and one of the earliest attempts to do online interviewing. This is dated, but it highlights some important challenges.


 * Stromer-Galley, Jennifer (2003), "Depth Interviews for the Study of Motives and Perceptions of Internet Use," International Communication Association, San Diego, May. [Available in Canvas]
 * Note: Start reading on page 8 at "The Internet and the Interview." The beginning is a theoretical argument that's not really relevant to this class.


 * Chou, C. (2001). Internet heavy use and addiction among Taiwanese college students: an online interview study. CyberPsychology & Behavior, 4(5), 573–585. [Available through UW Libraries]

Alternate Accounts:

These texts are largely redundant with the required texts above but provide a different perspective and additional examples:


 * Salmons, J. (2014). Qualitative Online Interviews: Strategies, Design, and Skills. SAGE Publications.
 * This book lays out what claims to be a comprehensive account of online interviewing. Take a quick look through the preface and table of contents and read Chapter 1. [Both available in Canvas.]
 * I have the book and am happy to loan my copy to anybody in the class who thinks this will be a core part of their research.


 * Morgan, David L. and Bojana Lobe, "Online focus groups," Ch. 9 in HET. [Available in Canvas]
 * Gaiser, T. J. (2008). Online Focus Groups. In N. G. Fielding, R. M. Lee, & G. Blank (Eds.), The SAGE Handbook of Online Research Methods (pp. 290–307). London, UK: SAGE Publications, Ltd. [Available through UW Libraries]

Week 4: Wednesday April 20: Social Network Analysis
Facilitator: Mengjun

Required Readings:


 * Garton, Laura, Caroline Haythornthwaite, and Barry Wellman, "Studying Online Social Networks," Journal of Computer-Mediated Communication, V. 3, N. 1, June, 1997. [Free Online]
 * Mislove, Alan, et al (2007), "Measurement and Analysis of Online Social Networks," IMC 2007, October 24-27, San Diego, CA [Available through UW Libraries]
 * Howard, Phil, "Network Ethnography and Hypermedia Organization: New Organizations, New Media, New Myths," New Media and Society, December 2002, 4(4), pp. 550-574. [Available through UW Libraries]
 * Keegan, B., Gergle, D., & Contractor, N. (2013). Hot Off the Wiki Structures and Dynamics of Wikipedia’s Coverage of Breaking News Events. American Behavioral Scientist, 57(5), 595–622. [Available through UW Libraries]

Week 4: Saturday April 23: CDSW Session 2
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Spring 2016), which I am running concurrently with this class.

This session will run from 10am-4pm. Details on the CDSW Spring 2016 page.

Week 5: Monday April 25: Experiments
Facilitator: Emma

Required Readings:


 * Reips, U.-D. (2002). Standards for Internet-based experimenting. Experimental Psychology, 49(4), 243–256. [Alternate Link]
 * Hergueux, J., & Jacquemet, N. (2014). Social preferences in the online laboratory: a randomized experiment. Experimental Economics, 18(2), 251–283. [Available through UW Libraries]
 * Salganik, M. J., Dodds, P. S., & Watts, D. J. (2006). Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market. Science, 311(5762), 854–856. [Available through UW Libraries]
 * Rijt, A. van de, Kang, S. M., Restivo, M., & Patil, A. (2014). Field experiments of success-breeds-success dynamics. Proceedings of the National Academy of Sciences, 111(19), 6934–6939. [Available through UW Libraries] [Alternative Link]
 * Bond, R. M., Fariss, C. J., Jones, J. J., Kramer, A. D. I., Marlow, C., Settle, J. E., & Fowler, J. H. (2012). A 61-million-person experiment in social influence and political mobilization. Nature, 489(7415), 295–298. [Available through UW Libraries]

Optional Readings:


 * Zhu, H., Zhang, A., He, J., Kraut, R., & Kittur, A. (2013). Effects of Peer Feedback on Contribution: A Field Experiment in Wikipedia. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Paris, France: ACM. [Available through UW Libraries]
 * Restivo, M., & van de Rijt, A. (2012). Experimental Study of Informal Rewards in Peer Production. PLoS ONE, 7(3), e34358. [Free Online]
 * This is really just a more in-depth version of the experiments in the Restivo and van de Rijt article described above.


 * Restivo, M., & van de Rijt, A. (0). No praise without effort: experimental evidence on how rewards affect Wikipedia’s contributor community. Information, Communication & Society, 0(0), 1–12. [Available through UW Libraries]
 * Note: This piece is, more or less, a continuation of the Restivo and van de Rijt piece included above but it is longer and goes into much more depth on at least one of the important theoretical issues.


 * Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. [Available through UW Libraries]
 * Note: We've already read this piece, but I'd like to discuss it again.


 * Cosley, D., Frankowski, D., Terveen, L., & Riedl, J. (2007). SuggestBot: Using Intelligent Task Routing to Help People Find Work in Wikipedia. In Proceedings of the 12th International Conference on Intelligent User Interfaces (pp. 32–41). New York, NY, USA: ACM. [Available through UW Libraries]
 * Reinecke, K., & Gajos, K. Z. (2015). LabintheWild: Conducting Large-Scale Online Experiments With Uncompensated Samples. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 1364–1378). New York, NY, USA: ACM. [Available through UW Libraries]

Week 5: Wednesday April 27: Surveys
Facilitator: Ben

Required Readings:


 * Van Selm, Martine & Nicholas Jankowski (2006), "Conducting Online Surveys," Quality and Quantity, 40: 435-456. [Available through UW Libraries]
 * Walejko, Gina, "Online survey: Instant publication, instant mistake, all of the above," from Research Confidential: Solutions to Problems Most Research Scientists Pretend They Never Have, University of Michigan Press, 2009, pp. 101-121. [Available in Canvas]
 * Joseph A. Konstan, B. R. Simon Rosser, Michael W. Ross, Jeffrey Stanton, & Weston M. Edwards, “The Story of Subject Naught: A Cautionary but Optimistic Tale of Internet Survey Research,” Journal of Computer-Mediated Communication, V.10, N. 2, January 2005. [Free Online]
 * Hill, B. M., & Shaw, A. (2013). The Wikipedia Gender Gap Revisited: Characterizing Survey Response Bias with Propensity Score Estimation. PLoS ONE, 8(6), e65782. [Free Online]
 * Salganik, M. J., & Levy, K. E. C. (2015). Wiki Surveys: Open and Quantifiable Social Data Collection. PLOS ONE, 10(5), e0123483. [Free Online]
 * Note: This journalistic account of the research may also be useful.

Optional Readings:

If you don't have a background in survey design, these two have been recommended by our guest speaker as good basic things to read:


 * Krosnick, J. A. (1999). Survey Research. Annual Review of Psychology, 50(1), 537–567. [Available through UW Libraries]
 * Krosnick, J. A. (1999). Maximizing measurement quality: Principles of good questionnaire design. In J. P. Robinson, P. R. Shaver, & L. S. Wrightsman (Eds.), Measures of Political Attitudes. New York: Academic Press.

These are other texts on the subject that you might find useful:


 * Dal, Michael, "Online data collection and data analysis using emergent technologies," Ch. 12 in HET. [Available in Canvas]
 * Smith, Tom W. and John Sokolowski, "The use of audiovisuals in surveys," Ch. 19 in HET. [Available in Canvas]
 * Kellock, Anne, et. al. "Using technology and the experience sampling method to understand real life," Ch. 24 from HET. [Available in Canvas]
 * Yun, Gi Woong and Craig Trumbo, "Comparative Response to a Survey Executed by Post, E-mail and Web Form," Journal of Computer-Mediated Communication, V.6, N.1, September, 2000. [Free Online]
 * Hargittai, Eszter, and Chris Karr, "WAT R U DOIN? Studying the Thumb Generation Using Text Messaging," from Research Confidential: Solutions to Problems Most Research Scientists Pretend They Never Have, University of Michigan Press, 2009, pp. 192-216. [Available in Canvas]
 * Wright, Kevin, "Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services," Journal of Computer-Mediated Communication, V. 10, N. 3, April 2005. [Free Online]

Week 6: Monday May 2: Narrative, Discourse and Visual Analysis
Facilitator: Liang

Required Readings:

Narrative Analysis:


 * Mitra, A. (1999). Characteristics of the WWW Text: Tracing Discursive Strategies. Journal of Computer-Mediated Communication, 5(1), 0–0. [Free Online]
 * Kaun, Anne (2010), "Open-Ended Online Diaries: Capturing Life as it is Narrated," International Journal of Qualitative Methods, Vol. 9 Issue 2, p133-148. [Free Online]

Visual Analysis:


 * Hochman, N., & Schwartz, R. (2012). Visualizing Instagram: Tracing Cultural Visual Rhythms. In Sixth International AAAI Conference on Weblogs and Social Media. [Available through UW Libraries]
 * Hochman, N., & Manovich, L. (2013). Zooming into an Instagram City: Reading the local through social media. First Monday, 18(7). [Free Online]

Optional Readings:

Narrative Analysis:


 * Gubrium, Aline and K.C. Nat Turner, "Digital storytelling as an emergent method for social research and practice," Ch. 21 in HET.

Visual Analysis:


 * Newbold, Curtis, 2013, "How to Do a Visual Analysis (A 5-Step Process)". [Free Online]
 * Note: Although I'm not a fan of infographics as a genre, I suppose it makes sense that visual communication people would put together a pretty good one! If you're already familiar with visual analysis from the rhetorical tradition, there's not going to be a lot new here. If this is new for you, this will help you frame and understand the other readings.


 * Torralba, A. (2009). Understanding Visual Scenes. Tutorial presented at the NIPS, Vancouver, BC, Canada. Part I. [Free Online]
 * Note: This is a two-part lecture and tutorial (each part is one hour) by an expert in computer vision. I strongly recommend watching Part I. I think it gives you a good sense of the kinds of challenges that were (and still are) facing the field of computer vision and anybody trying to have a computer look at images.

These five papers all take technical approaches to image classification using Internet-based image datasets such as Flickr, Google Image Search, Google Street View, and Instagram. Each describes interesting and challenging technical issues. If you're interested, reading these is a great way to get a sense of the state of the art and of what is and isn't possible:


 * Jaffe, A., Naaman, M., Tassa, T., & Davis, M. (2006). Generating Summaries and Visualization for Large Collections of Geo-referenced Photographs. In Proceedings of the 8th ACM International Workshop on Multimedia Information Retrieval (pp. 89–98). New York, NY, USA: ACM. [Available through UW Libraries]
 * Simon, I., Snavely, N., & Seitz, S. M. (2007). Scene Summarization for Online Image Collections. In Computer Vision, IEEE International Conference on (Vol. 0, pp. 1–8). Los Alamitos, CA, USA: IEEE Computer Society. [Free Online]
 * Crandall, D. J., Backstrom, L., Huttenlocher, D., & Kleinberg, J. (2009). Mapping the World’s Photos. In Proceedings of the 18th International Conference on World Wide Web (pp. 761–770). New York, NY, USA: ACM. [Available through UW Libraries]
 * San Pedro, J., & Siersdorfer, S. (2009). Ranking and Classifying Attractiveness of Photos in Folksonomies. In Proceedings of the 18th International Conference on World Wide Web (pp. 771–780). New York, NY, USA: ACM. [Available through UW Libraries]
 * Doersch, C., Singh, S., Gupta, A., Sivic, J., & Efros, A. A. (2012). What Makes Paris Look Like Paris? ACM Trans. Graph., 31(4), 101:1–101:9. [Available through UW Libraries]

Discourse Analysis:


 * Honeycutt, Courtenay (2005), “Hazing as a process of boundary maintenance in an online community”, Journal of Computer-Mediated Communication, 10(2). [Available through UW Libraries]
 * Note: Combines quantitative and qualitative computer-mediated discourse analysis methods.

Week 6: Wednesday May 4: Crowdsourced Data Analysis and Experimentation
Assignment:


 * Find and complete at least 2 "hits" as a worker on Amazon Mechanical Turk. Note that to do this you will need to create a worker account on Mturk.
 * Record (write down) details and notes about your tasks: What did you do? Who was the requester? What was the purpose of the task (as best you could tell)? What was the experience like? What research applications can you (not) imagine for this kind of system?
 * Design and deploy a small-scale research task on Mturk. Note that to do this, you will need to create a requester account on Mturk. Be sure to allow some time to get the task designed the way you want it! Some ideas for study designs you might try:
 * A small survey.
 * Classification of texts or images (e.g., label tweets, pictures, or comments from a discussion thread).
 * A small experiment (e.g., a survey where you insert different images but ask the same set of questions). Check out the Mturk requester getting started guide.
 * Prepare to share details of your small-scale research task in class, including results (they will come fast).

Note: In terms of running your task, it will cost real money and you have to put money on your Amazon account yourself. You've each got a $3 budget. Please use your credit card to put $3 on your account right away. I will pay each of you $3 in cash next week to reimburse you for the cost of running the experiment.
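If you would rather script your task than build it in the requester web UI, here is a minimal, illustrative sketch of deploying a classification task programmatically. It assumes Python and the boto3 MTurk client; every specific in it (the helper name `build_question_xml`, the question text, the reward amount) is a placeholder for your own design, not part of the assignment:

```python
def build_question_xml(prompt, options):
    """Wrap a one-question radio-button form in the HTMLQuestion XML
    schema that MTurk expects for HITs defined outside the web UI."""
    radios = "\n        ".join(
        f'<input type="radio" name="answer" value="{o}"/> {o}' for o in options
    )
    html = f"""<!DOCTYPE html>
<html><body>
  <form action="https://www.mturk.com/mturk/externalSubmit">
    <p>{prompt}</p>
        {radios}
    <p><input type="submit"/></p>
  </form>
</body></html>"""
    return (
        '<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">'
        f"<HTMLContent><![CDATA[{html}]]></HTMLContent>"
        "<FrameHeight>450</FrameHeight></HTMLQuestion>"
    )

DEPLOY = False  # flip to True once your AWS requester credentials are set up

if DEPLOY:
    import boto3  # pip install boto3

    # The sandbox endpoint lets you test the task without spending real money.
    mturk = boto3.client(
        "mturk",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )
    hit = mturk.create_hit(
        Title="Label the sentiment of one short tweet",
        Description="Read a short tweet and choose positive or negative.",
        Reward="0.05",                    # dollars, passed as a string
        MaxAssignments=5,                 # workers who will label each item
        LifetimeInSeconds=3600,           # HIT stays listed for one hour
        AssignmentDurationInSeconds=300,  # each worker gets five minutes
        Question=build_question_xml(
            "Is the sentiment of this tweet positive or negative?",
            ["positive", "negative"],
        ),
    )
    print(hit["HIT"]["HITId"])
```

Only switch to the production endpoint once the task looks right in the sandbox; the `create_hit` parameters mirror the options you would set in the requester UI.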

Required Readings:


 * Amazon Mechanical Turk Requester UI Guide [Free Online]
 * Amazon Mechanical Turk Best Practices Guide. [Free Online]
 * Weinberg, J., Freese, J., & McElhattan, D. (2014). Comparing Data Characteristics and Results of an Online Factorial Survey between a Population-Based and a Crowdsource-Recruited Sample. Sociological Science, 1, 292–310. [Free Online]
 * Shaw, A. (2015). Hired Hands and Dubious Guesses: Adventures in Crowdsourced Data Collection. In E. Hargittai & C. Sandvig (Eds.), Digital Research Confidential: The Secrets of Studying Behavior Online. The MIT Press. [Available in Canvas]

Optional Readings:


 * Gray, M. L., Suri, S., Ali, S. S., & Kulkarni, D. (2016). The Crowd is a Collaborative Network. Proceedings of Computer-Supported Cooperative Work. [Free Online]
 * Kittur et al. (2013). The Future of Crowd Work. Proceedings of Computer-Supported Cooperative Work. [Free Online]

Resources:
 * Mturk Tracker

Week 6: Saturday May 7: CDSW Session 3
As described in the section on technical skills above, I expect everybody who is not comfortable with at least basic programming and data collection to attend the Community Data Science Workshops (Spring 2016), which I am running concurrently with this class.

This session will run from 9am-3pm. Details on the CDSW Spring 2016 page.

Week 7: Monday May 9: Consulting Week (i.e., no group meeting)
During this week, we will not meet together. Instead, I will schedule a one-hour, one-on-one, in-person meeting with each student individually to catch up with you about your project and to work directly with you to resolve any technical issues you have run into with data collection, etc.

Week 8: Monday May 16: Consulting Week (i.e., no group meeting)
During this week, we will not meet together. Instead, I will schedule a one-hour, one-on-one, in-person meeting with each student individually to catch up with you about your project and to work directly with you to resolve any technical issues you have run into with data collection, etc.

Week 9: Monday May 23: Design Research
Today we'll have a guest visitor — Andrés Monroy-Hernández from Microsoft Research's FUSE Labs and affiliate faculty in the Department of Communication and Department of Human-Centered Design and Engineering at UW. Monroy-Hernández's research involves studying people by designing and building systems. He's built a number of very large and successful socio-technical systems as part of his research. In his graduate work, he built the Scratch Online Community, which is now used by more than 10 million people.

I've asked him to come and talk to us about design research as a process. As a result, it will be helpful to read about two projects he has worked on recently that he will talk to us about. Those projects are called NewsPad and Eventful.

Required Readings:


 * Olsen, D. R., Jr. (2007). Evaluating User Interface Systems Research. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology (pp. 251–258). New York, NY, USA: ACM. [Available through UW Libraries]
 * J. Nathan Matias and Andres Monroy-Hernandez, NewsPad: Designing for Collaborative Storytelling in Neighborhoods. CHI Work in Progress Paper. ACM, March 2014.
 * Elena Agapie, Jaime Teevan, and Andrés Monroy-Hernández, Crowdsourcing in the Field: A Case Study Using Local Crowds for Event Reporting, in Human Computation (HCOMP), AAAI - Association for the Advancement of Artificial Intelligence, August 2015.
 * Two very short videos describing the systems: NewsPad by FUSE Labs and Eventful by FUSE Labs

Week 9: Wednesday May 25: Digital Trace and Sensor Data
Required Readings:

Read any 2 of these 4 chapters from the Handbook of Emerging Technology in Social Research:


 * Eagle, Nathan, "Mobile phones as sensors for social research," Ch. 22 in HET.
 * Visser, Albertine and Ingrid Mulder, "Emergent technologies for assessing social feelings and experiences," Ch. 16 in HET.
 * de Haan, Geert, et. al., "Bringing the research lab into everyday life: Exploiting sensitive environments to acquire data for social research," Ch. 23 in HET.
 * Fowler, Chris, et. al., "Living laboratories: Social research applications and evaluations," Ch. 27 in HET.
 * Holohan, Anne, et. al., "The digital home: A new locus of social science research," Ch. 28 in HET.

Not Covered: Hyperlink Networks

 * Jackson, Michele, (1997), "Assessing the Structure of the Communication on the World Wide Web," Journal of Computer-Mediated Communication, V. 3, N. 1, June, 1997.
 * Jackson, Michele, (2011), "What Should Researchers Infer From Links on an Organization's Site?", blog post at http://assett.colorado.edu/jackson/what-should-researchers-infer-from-links-on-an-organizations-site/
 * Olesen, Thomas (2004), "The Transnational Zapatista Solidarity Network: An Infrastructure Analysis," Global Networks, 4(1):89-107 [Although this article uses the term infrastructure analysis, the method employed is best described as a hyperlink network analysis.]

Attendance
As detailed in my page on assessment, attendance in class is expected of all participants. If you need to miss class for any reason, please contact me ahead of time (email is best). Multiple unexplained absences will likely result in a lower grade or (in extreme circumstances) a failing grade. In the event of an absence, you are responsible for obtaining class notes, handouts, assignments, etc.

Office Hours
I will not hold regular office hours. In general, I will be available to meet after class. Please contact me on email to arrange a meeting then or at another time.

Accommodations
In general, if you have an issue, such as needing an accommodation for a religious obligation or learning disability, speak with me before it affects your performance; afterward it is too late. Do not ask for favors; instead, offer proposals that show initiative and a willingness to work.

To request academic accommodations due to a disability, please contact Disability Resources for Students, 448 Schmitz, 206-543-8924/V, 206-543-8925/TTY. If you have a letter from Disability Resources for Students indicating that you have a disability that requires academic accommodations, please present the letter to me so we can discuss the accommodations that you might need for the class. I am happy to work with you to maximize your learning experience.

Academic Misconduct
I am committed to upholding the academic standards of the University of Washington’s Student Conduct Code. If I suspect a student violation of that code, I will first engage in a conversation with that student about my concerns.

If we cannot successfully resolve a suspected case of academic misconduct through our conversations, I will refer the situation to the department of communication advising office who can then work with the COM Chair to seek further input and if necessary, move the case up through the College.

While evidence of academic misconduct may result in a lower grade, I will not unilaterally lower a grade without addressing the issue with you first through the process outlined above.

Credit and Notes
This syllabus was inspired by, and borrows with permission from, a syllabus from an earlier version of this class taught by Kirsten Foot in Spring 2014.