HCDS (Fall 2017)
- Human Centered Data Science
- DATA 512 - UW Interdisciplinary Data Science Master's Program - Thursdays 5:00-9:50pm in Denny Hall 112.
- Instructor
- Jonathan T. Morgan
- TA
- Oliver Keyes
- Course Website
- We will use Canvas for announcements and turning in reading reflections, PAWS for turning in code, and Slack for Q&A and general discussion. All other course-related information will be linked on this page.
- Course Description
- Fundamental principles of data science and its human implications. Data ethics, data privacy, differential privacy, algorithmic bias, legal frameworks and intellectual property, provenance and reproducibility, data curation and preservation, user experience design and usability testing for big data, ethics of crowdwork, data communication and societal impacts of data science.
Goals and expectations
The format of the class will be a mix of lecture, discussion, analyzing data, in-class activities, short essay assignments, and programming exercises. Students will work in small groups. Instructors will provide guidance in completing the exercises each week.
Learning objectives
By the end of this course, students will be able to:
- Analyze large and complex data effectively and ethically with an understanding of human, societal, and socio-technical contexts.
- Develop algorithms that take into account the ethical, social, and legal considerations of large-scale data analysis.
- Discuss and evaluate ethical, social, and legal trade-offs of different data analysis, testing, curation, and sharing methods.
Grading
Grades will be determined as follows:
- 20% in-class work
- 20% readings/reading groups
- 60% assignments
After the first week of class, late assignments will not be accepted. In-class work and class participation cannot be made up: if you miss a class, you will receive a zero for the work done in class that day. Please do not ask the professor or TA what you missed during class; check the website or ask a classmate. Required posts to the class discussion board must be made before the due date, or you will receive a zero for that work.
Final projects cannot be turned in late.
Schedule
Week 1: September 28
- Course overview
- What is data science? What is human centered? What is human centered data science?
- Assignments due
- fill out the pre-course survey
- Agenda
- Course overview & orientation
- What do we mean by "data science?"
- What do we mean by "human centered?"
- How does human centered design relate to data science?
- Readings assigned
- Watch: Why Humans Should Care About Data Science (Cecilia Aragon, 2016 HCDE Seminar Series)
- Read: Aragon, C. et al. (2016). Developing a Research Agenda for Human-Centered Data Science. Human Centered Data Science workshop, CSCW 2016.
- Read: Provost, Foster, and Tom Fawcett. Data science and its relationship to big data and data-driven decision making. Big Data 1.1 (2013): 51-59.
- Read: Kling, Rob and Star, Susan Leigh. Human Centered Systems in the Perspective of Organizational and Social Informatics. 1997.
- Homework assigned
- Reading reflection
- Resources
- IDEO.org. The Field Guide to Human-Centered Design. 2015.
- Faraway, Julian. The Decline and Fall of Statistics. Faraway Statistics, 2015.
- Press, Gil. Data Science: What's The Half-Life Of A Buzzword? Forbes, 2013.
- Bloor, Robin. A Data Science Rant. Inside Analysis, 2013.
- Various authors. Position papers from 2016 CSCW Human Centered Data Science Workshop. 2016.
Week 2: October 5
- Ethical considerations in Data Science
- privacy, informed consent and user treatment
- Assignments due
- Week 1 reading reflection
- Agenda
- Informed consent in the age of Data Science
- Privacy
- User expectations
- Inferred information
- Correlation
- Anonymization strategies
- Readings assigned
- Read: Markham, Annette and Buchanan, Elizabeth. Ethical Decision-Making and Internet Research. Association of Internet Researchers, 2012.
- Read: Barocas, Solon and Nissenbaum, Helen. Big Data's End Run around Anonymity and Consent. In Privacy, Big Data, and the Public Good. 2014. (PDF on Canvas)
- Homework assigned
- Reading reflection
- Resources
- Wittkower, D.E. Lurkers, creepers, and virtuous interactivity: From property rights to consent and care as a conceptual basis for privacy concerns and information ethics
- National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report. U.S. Department of Health and Human Services, 1979.
- Hill, Kashmir. Facebook Manipulated 689,003 Users' Emotions For Science. Forbes, 2014.
- Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock. Experimental evidence of massive-scale emotional contagion through social networks. PNAS 2014 111 (24) 8788-8790; published ahead of print June 2, 2014.
- Barbaro, Michael and Zeller, Tom. A Face Is Exposed for AOL Searcher No. 4417749. New York Times, 2006.
- Zetter, Kim. Arvind Narayanan Isn’t Anonymous, and Neither Are You. WIRED, 2012.
- Gray, Mary. When Science, Customer Service, and Human Subjects Research Collide. Now What? Culture Digitally, 2014.
- Tene, Omer and Polonetsky, Jules. Privacy in the Age of Big Data. Stanford Law Review, 2012.
- Dwork, Cynthia. Differential Privacy: A survey of results. Theory and Applications of Models of Computation, 2008.
- Green, Matthew. What is Differential Privacy? A Few Thoughts on Cryptographic Engineering, 2016.
- Hsu, Danny. Techniques to Anonymize Human Data. Data Sift, 2015.
- Metcalf, Jacob. Twelve principles of data ethics. Ethical Resolve, 2016.
- Poor, Nathaniel and Davidson, Roei. When The Data You Want Comes From Hackers, Or, Looking A Gift Horse In The Mouth. CSCW Human Centered Data Science Workshop, 2016.
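For readers who want to connect the differential privacy resources above (Dwork; Green) to code, here is a minimal, illustrative sketch of the Laplace mechanism for a counting query. It is not course material or an assignment; the dataset, predicate, and epsilon value are arbitrary placeholders.

```python
import numpy as np

def dp_count(values, predicate, epsilon):
    """epsilon-differentially private count of items satisfying `predicate`.

    A counting query has sensitivity 1 (adding or removing one person changes
    the true count by at most 1), so Laplace noise with scale 1/epsilon
    suffices for epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy data and parameters, chosen only for illustration.
ages = [34, 29, 41, 52, 23, 37, 45]
print(dp_count(ages, lambda a: a >= 40, epsilon=0.5))
```

Smaller epsilon values add more noise (stronger privacy, less accuracy); the anonymization strategies discussed in class trade off along the same axis.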
Week 3: October 12
- Data provenance, preparation, and reproducibility
- data curation, preservation, documentation, and archiving; best practices for open scientific research
- Assignments due
- Week 2 reading reflection
- Agenda
- Final project overview
- Introduction to open research
- Understanding data licensing and attribution
- Supporting replicability and reproducibility
- Making your research and data accessible
- Working with Wikipedia datasets (see the example API sketch following this week's resources)
- Assignment 1 description
- Readings assigned
- Read: Chapter 2 "Assessing Reproducibility" and Chapter 3 "The Basic Reproducible Workflow Template" from The Practice of Reproducible Research. University of California Press, 2018.
- Read: Hickey, Walt. The Dollar-And-Cents Case Against Hollywood's Exclusion of Women. FiveThirtyEight, 2014. AND Keegan, Brian. The Need for Openness in Data Journalism. 2014.
- Homework assigned
- Reading reflection
- A1: Data curation
- Examples of well-documented open research projects
- Keegan, Brian. WeatherCrime. GitHub, 2014.
- Geiger, Stuart R. and Halfaker, Aaron. Operationalizing conflict and cooperation between automated software agents in Wikipedia: A replication and expansion of "Even Good Bots Fight". GitHub, 2017.
- Thain, Nithum; Dixon, Lucas; and Wulczyn, Ellery. Wikipedia Talk Labels: Toxicity. Figshare, 2017.
- Narayan, Sneha et al. Replication Data for: The Wikipedia Adventure: Field Evaluation of an Interactive Tutorial for New Users. Harvard Dataverse, 2017.
- Examples of not-so-well documented open research projects
- Eclarke. SWGA paper. GitHub, 2016.
- David Lefevre. Lefevre and Cox: Delayed instructional feedback may be more effective, but is this contrary to learners’ preferences? Figshare, 2016.
- Alneberg. CONCOCT Paper Data. GitHub, 2014.
- Other resources
- Press, Gil. Cleaning Big Data: Most Time-Consuming, Least Enjoyable Data Science Task, Survey Says. Forbes, 2016.
- Christensen, Garret. Manual of Best Practices in Transparent Social Science Research. 2016.
- Hickey, Walt. The Bechdel Test: Checking Our Work. FiveThirtyEight, 2014.
- Chapman et al. Cross Industry Standard Process for Data Mining. IBM, 2000.
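To make the "working with Wikipedia datasets" agenda item above concrete, here is a hedged sketch that pulls monthly pageview counts from the public Wikimedia Pageviews REST API, the kind of data gathering A1 may involve. The article title, date range, and contact string are placeholders, not assignment specifications.

```python
import requests

# Placeholder values throughout: article, dates, and contact info are examples only.
ENDPOINT = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "{project}/{access}/{agent}/{article}/{granularity}/{start}/{end}"
)

url = ENDPOINT.format(
    project="en.wikipedia.org",
    access="desktop",        # or mobile-app, mobile-web, all-access
    agent="user",
    article="Data_science",
    granularity="monthly",
    start="2017010100",      # YYYYMMDDHH
    end="2017100100",
)

# Wikimedia asks API clients to identify themselves in the User-Agent header.
headers = {"User-Agent": "DATA512-example (your_email@example.com)"}
response = requests.get(url, headers=headers)
response.raise_for_status()

for item in response.json()["items"]:
    print(item["timestamp"], item["views"])
```

Recording the exact query, the retrieval date, and the data's license terms alongside the raw JSON is the kind of provenance documentation this week's readings argue for.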
Week 4: October 19
- Study design
- understanding your data; framing research questions; planning your study
- Assignments due
- Reading reflection
- A1: Data curation
- Agenda
- How Wikipedia works (and how it doesn't)
- guest speaker: Morten Warncke-Wang, Wikimedia Foundation
- Sources of bias in data science research
- Sources of bias in Wikipedia data
- Readings assigned
- Shyong (Tony) K. Lam, Anuradha Uduwage, Zhenhua Dong, Shilad Sen, David R. Musicant, Loren Terveen, and John Riedl. 2011. WP:clubhouse?: an exploration of Wikipedia's gender imbalance. In Proceedings of the 7th International Symposium on Wikis and Open Collaboration (WikiSym '11). ACM, New York, NY, USA, 1-10. DOI=http://dx.doi.org/10.1145/2038558.2038560
- Homework assigned
- Reading reflection
- A2: Bias in data
- Resources
- Aschwanden, Christie. Science Isn't Broken. FiveThirtyEight, 2015.
- Halfaker, Aaron et al. The Rise and Decline of an Open Collaboration Community: How Wikipedia's reaction to sudden popularity is causing its decline. American Behavioral Scientist, 2012.
- Warncke-Wang, Morten. Autoconfirmed article creation trial. Wikimedia, 2017.
- Wikipedia Or Encyclopædia Britannica: Which Has More Bias? Forbes, 2015. Based on Greenstein, Shane, and Feng Zhu. Do Experts or Collective Intelligence Write with More Bias? Evidence from Encyclopædia Britannica and Wikipedia. Harvard Business School working paper.
Week 5: October 26
- Machine learning
- ethical AI, algorithmic transparency, societal implications of machine learning
- Assignments due
- Reading reflection
- Agenda
- Social implications of machine learning
- Consequences of algorithmic bias
- Sources of algorithmic bias
- Addressing algorithmic bias
- Auditing algorithms (see the toy audit sketch following this week's resources)
- Readings assigned
- Christian Sandvig, Kevin Hamilton, Karrie Karahalios, and Cedric Langbort. Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms. Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
- Homework assigned
- Reading reflection
- A3: Final project plan
- Resources
- Bamman, David. Interpretability in Human-Centered Data Science. 2016 CSCW workshop on Human-Centered Data Science.
- Anderson, Carl. The role of model interpretability in data science. Medium, 2016.
- Hill, Kashmir. Facebook figured out my family secrets, and it won't tell me how. Engadget, 2017.
- Blue, Violet. Google’s comment-ranking system will be a hit with the alt-right. Engadget, 2017.
- Ingold, David and Soper, Spencer. Amazon Doesn’t Consider the Race of Its Customers. Should It?. Bloomberg, 2016.
- Mars, Roman. The Age of the Algorithm. 99% Invisible Podcast, 2017.
- Google's Perspective API
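As a toy illustration of the auditing-algorithms discussion above, the sketch below compares a classifier's false positive rates across two groups. The data, group labels, and 0.5 decision threshold are all synthetic and hypothetical; a real audit (as in Sandvig et al.) involves much more than this.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, size=n)    # two hypothetical groups (synthetic)
label = rng.integers(0, 2, size=n)    # ground-truth outcome (synthetic)

# Simulated model scores, shifted slightly for group 1 to mimic a biased scorer.
score = rng.normal(loc=0.5 * label + 0.05 * group, scale=0.3)
pred = (score > 0.5).astype(int)      # arbitrary decision threshold

for g in (0, 1):
    negatives = (group == g) & (label == 0)
    fpr = pred[negatives].mean()      # how often group g's true negatives get flagged
    print(f"group {g}: false positive rate = {fpr:.3f}")
```

Even this simple comparison surfaces the kind of disparity that motivates the transparency and accountability questions in this week's readings.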
Week 6: November 2
- Mixed-methods research
- Big data vs thick data; qualitative research in data science
- Assignments due
- Reading reflection
- A2: Bias in data
- Agenda
- Guest speakers: Aaron Halfaker, Caroline Sinders (Wikimedia Foundation)
- Mixed methods research
- Ethnographic methods in data science
- Project plan brainstorm/Q&A session
- Readings assigned
- R. Stuart Geiger and Aaron Halfaker. 2017. Operationalizing conflict and cooperation between automated software agents in Wikipedia: A replication and expansion of Even Good Bots Fight. Proceedings of the ACM on Human-Computer Interaction (Nov 2017 issue, CSCW 2018 Online First) 1, 2, Article 49. DOI: https://doi.org/10.1145/3134684
- Homework assigned
- Reading reflection
- Resources
- Maximillian Klein. Gender by Wikipedia Language. Wikidata Human Gender Indicators (WHGI), 2017.
- Benjamin Collier and Julia Bear. Conflict, criticism, or confidence: an empirical examination of the gender gap in wikipedia contributions. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work (CSCW '12). DOI: https://doi.org/10.1145/2145204.2145265
- Christina Shane-Simpson, Kristen Gillespie-Lynch, Examining potential mechanisms underlying the Wikipedia gender gap through a collaborative editing task, In Computers in Human Behavior, Volume 66, 2017, https://doi.org/10.1016/j.chb.2016.09.043. (PDF on Canvas)
- Amanda Menking and Ingrid Erickson. 2015. The Heart Work of Wikipedia: Gendered, Emotional Labor in the World's Largest Online Encyclopedia. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). https://doi.org/10.1145/2702123.2702514
- Andrea Forte, Nazanin Andalibi, and Rachel Greenstadt. Privacy, Anonymity, and Perceived Risk in Open Collaboration: A Study of Tor Users and Wikipedians. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17). DOI: https://doi.org/10.1145/2998181.2998273
Week 7: November 9
- Human computation
- ethics of crowdwork, crowdsourcing methodologies for analysis, design, and evaluation
- Assignments due
- Reading reflection
- A3: Final project plan
- Agenda
- the role of qualitative research in human centered data science
- scaling qualitative research through crowdsourcing
- types of crowdwork
- ethical and practical considerations for crowdwork
- Introduction to assignment 4: Mechanical Turk ethnography
- Readings assigned (read both, reflect on one)
- Lilly C. Irani and M. Six Silberman. 2013. Turkopticon: interrupting worker invisibility in amazon mechanical turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). DOI: https://doi.org/10.1145/2470654.2470742
- Shilad Sen, Margaret E. Giesel, Rebecca Gold, Benjamin Hillmann, Matt Lesicko, Samuel Naden, Jesse Russell, Zixiao (Ken) Wang, and Brent Hecht. 2015. Turkers, Scholars, "Arafat" and "Peace": Cultural Communities and Algorithmic Gold Standards. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '15). DOI: http://dx.doi.org/10.1145/2675133.2675285
- Homework assigned
- Reading reflection
- A4: Crowdwork ethnography
- Resources
- WeAreDynamo contributors. How to be a good requester and Guidelines for Academic Requesters. Wearedynamo.org
- Wang, Tricia. Why Big Data Needs Thick Data. Ethnography Matters, 2016.
Week 8: November 16
- User experience and big data
- user-centered design and evaluation of recommender systems; UI design for data science, collaborative visual analytics
- Assignments due
- Reading reflection
- Agenda
- HCD process in the design of data-driven applications
- understanding user needs, user intent, and context of use in recommender system design
- trust, empowerment, and seamful design
- HCD in data analysis and visualization
- final project lightning feedback sessions
- Readings assigned
- Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. User perception of differences in recommender algorithms. In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14). ACM, New York, NY, USA, 161-168. DOI: https://doi.org/10.1145/2645710.2645737
- Chen, N., Brooks, M., Kocielnik, R., Hong, R., Smith, J., Lin, S., Qu, Z., Aragon, C. Lariat: A visual analytics tool for social media researchers to explore Twitter datasets. Proceedings of the 50th Hawaii International Conference on System Sciences (HICSS), Data Analytics and Data Mining for Social Media Minitrack (2017)
- Homework assigned
- Reading reflection
- Resources
- Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. Making recommendations better: an analytic model for human-recommender interaction. In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06). ACM, New York, NY, USA, 1103-1108. DOI=http://dx.doi.org/10.1145/1125451.1125660
- Kevin Crowston and the Gravity Spy Team. 2017. Gravity Spy: Humans, Machines and The Future of Citizen Science. In Companion of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17 Companion). ACM, New York, NY, USA, 163-166. DOI: https://doi.org/10.1145/3022198.3026329
- Michael D. Ekstrand and Martijn C. Willemsen. 2016. Behaviorism is Not Enough: Better Recommendations through Listening to Users. In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16). ACM, New York, NY, USA, 221-224. DOI: https://doi.org/10.1145/2959100.2959179
- Jess Holbrook. Human Centered Machine Learning. Google Design Blog. 2017.
- Xavier Amatriain and Justin Basilico. Netflix Recommendations: Beyond the 5 stars. Netflix Tech Blog, 2012.
- Fabien Girardin. Experience design in the machine learning era. Medium, 2016.
- Brian Whitman. How music recommendation works - and doesn't work. Variogram, 2012.
- Paul Lamere. How good is Google's Instant Mix?. Music Machinery, 2011.
- Snyder, Jaime. Values in the Design of Visualizations. 2016 CSCW workshop on Human-Centered Data Science.
Week 9: November 23
- Human-centered data science in the wild
- community data science; data science for social good
- Assignments due
- Reading reflection
- A4: Crowdwork ethnography
- Agenda
- NO CLASS - work on your own
- Readings assigned
- Hill, B. M., Dailey, D., Guy, R. T., Lewis, B., Matsuzaki, M., & Morgan, J. T. (2017). Democratizing Data Science: The Community Data Science Workshops and Classes. In N. Jullien, S. A. Matei, & S. P. Goggins (Eds.), Big Data Factories: Scientific Collaborative approaches for virtual community data collection, repurposing, recombining, and dissemination. New York, New York: Springer Nature. [Preprint/Draft PDF]
- Bivens, R. and Haimson, O.L. 2016. Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers. Social Media + Society. 2, 4 (2016), 205630511667248. DOI:https://doi.org/10.1177/2056305116672486.
- Schlesinger, A. et al. 2017. Intersectional HCI: Engaging Identity through Gender, Race, and Class. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI ’17. (2017), 5412–5427. DOI:https://doi.org/10.1145/3025453.3025766.
- Homework assigned
- Reading reflection
- Resources
- Berney, Rachel, Bernease Herman, Gundula Proksch, Hillary Dawkins, Jacob Kovacs, Yahui Ma, Jacob Rich, and Amanda Tan. Visualizing Equity: A Data Science for Social Good Tool and Model for Seattle. Data Science for Social Good Conference, September 2017, Chicago, Illinois USA (2017).
- Sayamindu Dasgupta and Benjamin Mako Hill. Learning With Data: Designing for Community Introspection and Exploration. Position paper for Developing a Research Agenda for Human-Centered Data Science (a CSCW 2016 workshop).
Week 10: November 30
- Communicating methods, results, and implications
- translating for non-data scientists
- Assignments due
- Reading reflection
- Agenda
- communicating about your research effectively and honestly to different audiences
- publishing your research openly
- disseminating your research
- final project workshop
- Readings assigned
- Megan Risdal, Communicating data science: a guide to presenting your work. Kaggle blog, 2016.
- Marilynn Larkin, How to give a dynamic scientific presentation. Elsevier Connect, 2015.
- Homework assigned
- Reading reflection
- A5: Final presentation
- Resources
- Bart P. Knijnenburg, Martijn C. Willemsen, Zeno Gantner, Hakan Soncu, and Chris Newell. 2012. Explaining the user experience of recommender systems. User Modeling and User-Adapted Interaction 22, 4-5 (October 2012), 441-504. DOI=http://dx.doi.org/10.1007/s11257-011-9118-4
- Sean M. McNee, Nishikant Kapoor, and Joseph A. Konstan. 2006. Don't look stupid: avoiding pitfalls when recommending research papers. In Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work (CSCW '06). ACM, New York, NY, USA, 171-180. DOI=http://dx.doi.org/10.1145/1180875.1180903
- Megan Risdal, Communicating data science: Why and how to visualize information. Kaggle blog, 2016.
- Megan Risdal, Communicating data science: an interview with a storytelling expert. Kaggle blog, 2016.
- Richard Garber, Power of brief speeches: World War I and the Four Minute Men. Joyful Public Speaking, 2010.
- Brent Dykes, Data Storytelling: The Essential Data Science Skill Everyone Needs. Forbes, 2016.
Week 11: December 7
- Future of human centered data science
- course wrap up, final presentations
- Assignments due
- Reading reflection
- A5: Final presentation
- Agenda
- future directions of human centered data science
- final presentations
- Readings assigned
- none!
- Homework assigned
- none!
- Resources
- none!
Week 12: Finals Week
- NO CLASS
- A6: FINAL PROJECT REPORT DUE BY 11:59PM on Sunday, December 10
- LATE PROJECT SUBMISSIONS NOT ACCEPTED.
Assignments
Assignments consist of weekly in-class activities, weekly reading reflections, written assignments, and programming/data analysis assignments. Weekly in-class reading groups will discuss the assigned readings, and students are expected to have read the material in advance. In-class activities are posted to Canvas each week and may require time outside of class to complete.
Unless otherwise noted, all assignments are due before 5:00pm on the day of the following week's class.
Unless otherwise noted, all assignments are individual assignments.
Assignment timeline
- Assignments due every week
- In-class activities - 2 points (weekly): In-class activity output posted to Canvas (group or individual)
- Reading reflections - 2 points (weekly): Reading reflections posted to Canvas (individual)
- Scheduled assignments
- A1 - 5 points (due Week 4): Data curation (programming/analysis)
- A2 - 10 points (due Week 6): Sources of bias in data (programming/analysis)
- A3 - 10 points (due Week 7): Final project plan (written)
- A4 - 10 points (due Week 9): Crowdwork self-ethnography (written)
- A5 - 10 points (due Week 11): Final project presentation (oral, written)
- A6 - 15 points (due by 11:59pm on Sunday, December 10): Final project report (programming/analysis, written)
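As a quick sanity check, the point values above line up with the percentage breakdown in the Grading section, assuming ten graded weeks of in-class activities and reading reflections (that week count is an assumption for illustration, not a stated policy):

```python
# Assumes 10 graded weeks of in-class activities and reading reflections;
# the syllabus does not state the exact number of graded weeks.
graded_weeks = 10

in_class    = 2 * graded_weeks              # 20 points -> 20% in-class work
reflections = 2 * graded_weeks              # 20 points -> 20% readings/reading groups
assignments = 5 + 10 + 10 + 10 + 10 + 15    # A1-A6 = 60 points -> 60% assignments

print(in_class, reflections, assignments, in_class + reflections + assignments)
# -> 20 20 60 100
```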
Readings
coming soon
Policies
The following general policies apply to this course:
- Respect
- If there were only one policy allowed in a course syllabus, I would choose the word respect to represent our goals for a healthy and engaging educational environment. Treating each other respectfully, in the broadest sense and in all ways, is a necessary and probably sufficient condition for a successful experience together.
- Attendance
- Students are expected to attend class regularly.
- Late Assignments
- Late assignments will not be accepted. If your assignment is late, you will receive a zero score.
- Participation
- Active participation in class activities is one of the requirements of the course. You are expected to engage in group activities, class discussions, interactions with your peers, and constructive critiques as part of the course work. This will help you hone your communication and other professional skills.
- Collaboration
- Working in groups or on teams is an essential part of all data science disciplines. As part of this course, you will be asked to provide feedback on your peers' work.
- Academic Integrity
- Simply stated, academic integrity means that you are to do your own work in all of your classes, unless collaboration is part of an assignment as defined in the course. In any case, you must be responsible for citing and acknowledging outside sources of ideas in work you submit. Please be aware of the HCDE Department's and the UW's policies on this: HCDE Academic Conduct. These will be strictly enforced.
- Assignment Quality
- You are expected to produce work in all of the assignments that reflects the highest standards of professionalism. For written documents, this means proper spelling, grammar, and formatting.
- Privacy
- Students have the right to keep private any aspects of their personal lives that they do not wish to share with others. Please respect that right.
- Accommodations
- To request academic accommodations due to a disability, please contact Disabled Student Services: 448 Schmitz, 206-543-8924 (V/TTY). If you have a letter from DSS indicating that you have a disability which requires academic accommodations, please present the letter to me so you can discuss the accommodations you might need in the class.
- Permissions
- Unless you notify me otherwise, I will assume you will allow me to use samples from your work in this course in future instructional settings.
- Disclaimer
- This syllabus and all associated assignments, requirements, deadlines and procedures are subject to change.