HCDS (Fall 2017)
- Human Centered Data Science: DATA 512 - UW Interdisciplinary Data Science Masters Program - Thursdays 5:00-9:50pm in Denny Hall 112.
- Principal instructor: Jonathan T. Morgan
- Co-instructor: Oliver Keyes
- Course Website: This page is the canonical information resource for DATA512. We will use the Canvas site for announcements, file hosting, and submitting reading reflections and graded in-class assignments. We will use Jupyter Hub (see Canvas for link) for turning in other programming and writing assignments, and Slack for Q&A and general discussion. All other course-related information will be linked on this page.
- Course Description: Fundamental principles of data science and its human implications. Data ethics, data privacy, algorithmic bias, legal frameworks, provenance and reproducibility, data curation and preservation, user experience design and research for big data, ethics of crowdwork, data communication, and societal impacts of data science.[1]
Overview and learning objectives
The format of the class will be a mix of lectures, discussions, data analysis, in-class activities, short essay assignments, and programming exercises. Students will work in small groups, and the instructors will provide guidance in completing the exercises each week.
By the end of this course, students will be able to:
- Analyze large and complex data effectively and ethically with an understanding of human, societal, and socio-technical contexts.
- Develop algorithms that take into account the ethical, social, and legal considerations of large-scale data analysis.
- Discuss and evaluate ethical, social, and legal trade-offs of different data analysis, testing, curation, and sharing methods.
Course resources
All pages and files on this wiki that are related to the Fall 2017 edition of DATA 512: Human-Centered Data Science are listed in Category:HCDS (Fall 2017).
Office hours
- Oliver: Mondays 4-6pm and Tuesdays 4-7pm, Sieg 431, and by request.
- Jonathan: Google Hangout, by request
Jupyter Hub
The course will use a Jupyter Hub provided by West Big Data Hub and administered by Yuvi Panda at the Berkeley Institute for Data Science. Students use Jupyter notebooks for in-class and homework assignments that involve a combination of programming, analysis, documentation, and reflection. Working in a shared online environment reinforces best practices of open research such as transparency, iteration, and reproducibility. It also teaches students how to tell the story of their research using multiple media (code, data, prose, and visualizations), making their work more accessible and impactful for a wider variety of audiences.
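As a concrete illustration of these practices (not course material), a notebook cell written in this spirit might record where its raw data came from and regenerate its output directly from that raw file, so the whole analysis can be re-run from top to bottom. The URL, file path, and column names below are hypothetical placeholders:

```python
# A minimal, hypothetical example of a reproducible notebook cell.
# The data source, file path, and column names are placeholders, not course data.
import pandas as pd
import matplotlib.pyplot as plt

# Document provenance alongside the code so readers can retrace the analysis.
SOURCE_URL = "https://example.org/pageviews.csv"    # placeholder source
RETRIEVED = "2017-10-01"                            # date the raw data was downloaded
RAW_FILE = "data/pageviews_2017-10-01.csv"          # local copy kept under version control

df = pd.read_csv(RAW_FILE, parse_dates=["month"])

# Derive the figure directly from the raw file, so anyone who re-runs the
# notebook from top to bottom reproduces the same result.
monthly = df.groupby("month")["views"].sum()
ax = monthly.plot(title="Monthly page views (placeholder data)")
ax.set_xlabel("Month")
ax.set_ylabel("Views")
plt.savefig("monthly_views.png", dpi=150)
```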
Datasets
For some examples of datasets you could use for your final project, see HCDS_(Fall_2017)/Datasets.
Lecture slides
Slides for most weekly lectures are available in PDF form.
- Week 1 slides
- Week 2 slides
- Week 3 slides
- Week 4 slides
- Week 5 slides
- Week 6 slides
- Week 8 slides
- Week 10 slides
Schedule
Course schedule
Week 1: September 28
- Course overview
- What is data science? What is human centered? What is human centered data science?
- Assignments due
- fill out the pre-course survey
- Agenda
- Course overview & orientation
- What do we mean by "data science?"
- What do we mean by "human centered?"
- How does human centered design relate to data science?
- Readings assigned
- Watch: Why Humans Should Care About Data Science (Cecilia Aragon, 2016 HCDE Seminar Series)
- Read: Aragon, C. et al. (2016). Developing a Research Agenda for Human-Centered Data Science. Human Centered Data Science workshop, CSCW 2016.
- Read: Provost, Foster, and Tom Fawcett. Data science and its relationship to big data and data-driven decision making. Big Data 1.1 (2013): 51-59.
- Read: Kling, Rob and Star, Susan Leigh. Human Centered Systems in the Perspective of Organizational and Social Informatics. 1997.
- Homework assigned
- Reading reflection
- Resources
- Ideo.org The Field Guide to Human-Centered Design. 2015.
- Faraway, Julian. The Decline and Fall of Statistics. Faraway Statistics, 2015.
- Press, Gil. Data Science: What's The Half-Life Of A Buzzword? Forbes, 2013.
- Bloor, Robin. A Data Science Rant. Inside Analysis, 2013.
- Various authors. Position papers from 2016 CSCW Human Centered Data Science Workshop. 2016.
Week 2: October 5
- Ethical considerations in Data Science
- privacy, informed consent and user treatment
- Assignments due
- Week 1 reading reflection
- Agenda
- Informed consent in the age of Data Science
- Privacy
- User expectations
- Inferred information
- Correlation
- Anonymisation strategies
- Readings assigned
- Read: Markham, Annette and Buchanan, Elizabeth. Ethical Decision-Making and Internet Researchers. Association for Internet Research, 2012.
- Read: Barocas, Solon and Nissenbaum, Helen. Big Data's End Run around Anonymity and Consent. In Privacy, Big Data, and the Public Good. 2014. (PDF on Canvas)
- Homework assigned
- Reading reflection
- Resources
- Wittkower, D.E. Lurkers, creepers, and virtuous interactivity: From property rights to consent and care as a conceptual basis for privacy concerns and information ethics
- National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report. U.S. Department of Health and Human Services, 1979.
- Hill, Kashmir. Facebook Manipulated 689,003 Users' Emotions For Science. Forbes, 2014.
- Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock. Experimental evidence of massive-scale emotional contagion through social networks. PNAS 2014 111 (24) 8788-8790; published ahead of print June 2, 2014.
- Barbaro, Michael and Zeller, Tom. A Face Is Exposed for AOL Searcher No. 4417749. New York Times, 2008.
- Zetter, Kim. Arvind Narayanan Isn’t Anonymous, and Neither Are You. WIRED, 2012.
- Gray, Mary. When Science, Customer Service, and Human Subjects Research Collide. Now What? Culture Digitally, 2014.
- Tene, Omer and Polonetsky, Jules. Privacy in the Age of Big Data. Stanford Law Review, 2012.
- Dwork, Cynthia. Differential Privacy: A survey of results. Theory and Applications of Models of Computation, 2008.
- Green, Matthew. What is Differential Privacy? A Few Thoughts on Cryptographic Engineering, 2016.
- Hsu, Danny. Techniques to Anonymize Human Data. Data Sift, 2015.
- Metcalf, Jacob. Twelve principles of data ethics. Ethical Resolve, 2016.
- Poor, Nathaniel and Davidson, Roei. When The Data You Want Comes From Hackers, Or, Looking A Gift Horse In The Mouth. CSCW Human Centered Data Science Workshop, 2016.
Week 3: October 12
- Data provenance, preparation, and reproducibility
- data curation, preservation, documentation, and archiving; best practices for open scientific research
- Assignments due
- Week 2 reading reflection
- Agenda
- Final project overview
- Introduction to open research
- Understanding data licensing and attribution
- Supporting replicability and reproducibility
- Making your research and data accessible
- Working with Wikipedia datasets
- Assignment 1 description
- Readings assigned
- Read: Chapter 2 "Assessing Reproducibility" and Chapter 3 "The Basic Reproducible Workflow Template" from The Practice of Reproducible Research. University of California Press, 2018.
- Read: Hickey, Walt. The Dollars and Cents Case Against Hollywood's Exclusion of Women. FiveThirtyEight, 2014. AND Keegan, Brian. The Need for Openness in Data Journalism. 2014.
- Homework assigned
- Reading reflection
- A1: Data curation
- Examples of well-documented open research projects
- Keegan, Brian. WeatherCrime. GitHub, 2014.
- Geiger, Stuart R. and Halfaker, Aaron. Operationalizing conflict and cooperation between automated software agents in Wikipedia: A replication and expansion of "Even Good Bots Fight". GitHub, 2017.
- Thain, Nithum; Dixon, Lucas; and Wulczyn, Ellery. Wikipedia Talk Labels: Toxicity. Figshare, 2017.
- Narayan, Sneha et al. Replication Data for: The Wikipedia Adventure: Field Evaluation of an Interactive Tutorial for New Users. Harvard Dataverse, 2017.
- Examples of not-so-well documented open research projects
- Eclarke. SWGA paper. GitHub, 2016.
- David Lefevre. Lefevre and Cox: Delayed instructional feedback may be more effective, but is this contrary to learners’ preferences? Figshare, 2016.
- Alneberg. CONCOCT Paper Data. GitHub, 2014.
- Other resources
- Press, Gil. Cleaning Big Data: Most Time-Consuming, Least Enjoyable Data Science Task, Survey Says. Forbes, 2016.
- Christensen, Garret. Manual of Best Practices in Transparent Social Science Research. 2016.
- Hickey, Walt. The Bechdel Test: Checking Our Work. FiveThirtyEight, 2014.
- Chapman et al. Cross Industry Standard Process for Data Mining. IBM, 2000.
Week 4: October 19
- Study design
- understanding your data; framing research questions; planning your study
- Assignments due
- Reading reflection
- A1: Data curation
- Agenda
- How Wikipedia works (and how it doesn't)
- guest speaker: Morten Warnke-Wang, Wikimedia Foundation
- Sources of bias in data science research
- Sources of bias in Wikipedia data
- Readings assigned
- Shyong (Tony) K. Lam, Anuradha Uduwage, Zhenhua Dong, Shilad Sen, David R. Musicant, Loren Terveen, and John Riedl. 2011. WP:clubhouse?: an exploration of Wikipedia's gender imbalance. In Proceedings of the 7th International Symposium on Wikis and Open Collaboration (WikiSym '11). ACM, New York, NY, USA, 1-10. DOI=http://dx.doi.org/10.1145/2038558.2038560
- Homework assigned
- Reading reflection
- A2: Bias in data
- Resources
- Aschwanden, Christie. Science Isn't Broken FiveThirtyEight, 2015.
- Halfaker, Aaron et al. The Rise and Decline of an Open Collaboration Community: How Wikipedia's reaction to sudden popularity is causing its decline. American Behavioral Scientist, 2012.
- Warnke-Wang, Morten. Autoconfirmed article creation trial. Wikimedia, 2017.
- Wikipedia Or Encyclopædia Britannica: Which Has More Bias? Forbes, 2015. Based on Greenstein, Shane, and Feng Zhu. Do Experts or Collective Intelligence Write with More Bias? Evidence from Encyclopædia Britannica and Wikipedia. Harvard Business School working paper.
Week 5: October 26
- Machine learning
- ethical AI, algorithmic transparency, societal implications of machine learning
- Assignments due
- Reading reflection
- Agenda
- Social implications of machine learning
- Consequences of algorithmic bias
- Sources of algorithmic bias
- Addressing algorithmic bias
- Auditing algorithms
- Readings assigned
- Christian Sandvig, Kevin Hamilton, Karrie Karahalios, Cedric Langbort (2014/05/22) Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms. Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
- Homework assigned
- Reading reflection
- A3: Final project plan
- Resources
- Bamman, David Interpretability in Human-Centered Data Science. 2016 CSCW workshop on Human-Centered Data Science.
- Anderson, Carl. The role of model interpretability in data science. Medium, 2016.
- Hill, Kashmir. Facebook figured out my family secrets, and it won't tell me how. Engadget, 2017.
- Blue, Violet. Google’s comment-ranking system will be a hit with the alt-right. Engadget, 2017.
- Ingold, David and Soper, Spencer. Amazon Doesn’t Consider the Race of Its Customers. Should It? Bloomberg, 2016.
- Mars, Roman. The Age of the Algorithm. 99% Invisible Podcast, 2017.
- Google's Perspective API
Week 6: November 2
- Mixed-methods research
- Big data vs thick data; qualitative research in data science
- Assignments due
- Reading reflection
- A2: Bias in data
- Agenda
- Guest speakers: Aaron Halfaker, Caroline Sinders (Wikimedia Foundation)
- Mixed methods research
- Ethnographic methods in data science
- Project plan brainstorm/Q&A session
- Readings assigned
- R. Stuart Geiger and Aaron Halfaker. 2017. Operationalizing conflict and cooperation between automated software agents in Wikipedia: A replication and expansion of Even Good Bots Fight. Proceedings of the ACM on Human-Computer Interaction (Nov 2017 issue, CSCW 2018 Online First) 1, 2, Article 49. DOI: https://doi.org/10.1145/3134684
- Homework assigned
- Reading reflection
- Resources
- Maximillian Klein. Gender by Wikipedia Language. Wikidata Human Gender Indicators (WHGI), 2017.
- Benjamin Collier and Julia Bear. Conflict, criticism, or confidence: an empirical examination of the gender gap in wikipedia contributions. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work (CSCW '12). DOI: https://doi.org/10.1145/2145204.2145265
- Christina Shane-Simpson, Kristen Gillespie-Lynch, Examining potential mechanisms underlying the Wikipedia gender gap through a collaborative editing task, In Computers in Human Behavior, Volume 66, 2017, https://doi.org/10.1016/j.chb.2016.09.043. (PDF on Canvas)
- Amanda Menking and Ingrid Erickson. 2015. The Heart Work of Wikipedia: Gendered, Emotional Labor in the World's Largest Online Encyclopedia. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). https://doi.org/10.1145/2702123.2702514
- Andrea Forte, Nazanin Andalibi, and Rachel Greenstadt. Privacy, Anonymity, and Perceived Risk in Open Collaboration: A Study of Tor Users and Wikipedians. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17). DOI: https://doi.org/10.1145/2998181.2998273
Week 7: November 9
- Human computation
- ethics of crowdwork, crowdsourcing methodologies for analysis, design, and evaluation
- Assignments due
- Reading reflection
- A3: Final project plan
- Agenda
- the role of qualitative research in human centered data science
- scaling qualitative research through crowdsourcing
- types of crowdwork
- ethical and practical considerations for crowdwork
- Introduction to assignment 4: Mechanical Turk ethnography
- Readings assigned (read both, reflect on one)
- Lilly C. Irani and M. Six Silberman. 2013. Turkopticon: interrupting worker invisibility in amazon mechanical turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). DOI: https://doi.org/10.1145/2470654.2470742
- Shilad Sen, Margaret E. Giesel, Rebecca Gold, Benjamin Hillmann, Matt Lesicko, Samuel Naden, Jesse Russell, Zixiao (Ken) Wang, and Brent Hecht. 2015. Turkers, Scholars, "Arafat" and "Peace": Cultural Communities and Algorithmic Gold Standards. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '15). DOI: http://dx.doi.org/10.1145/2675133.2675285
- Homework assigned
- Reading reflection
- A4: Crowdwork ethnography
- Resources
- WeAreDynamo contributors. How to be a good requester and Guidelines for Academic Requesters. Wearedynamo.org
- Wang, Tricia. Why Big Data Needs Thick Data. Ethnography Matters, 2016.
Week 8: November 16
- User experience and big data
- user-centered design and evaluation of recommender systems; UI design for data science, collaborative visual analytics
- Assignments due
- Reading reflection
- Agenda
- HCD process in the design of data-driven applications
- understanding user needs, user intent, and context of use in recommender system design
- trust, empowerment, and seamful design
- HCD in data analysis and visualization
- final project lightning feedback sessions
- Readings assigned
- Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. User perception of differences in recommender algorithms. In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14). ACM, New York, NY, USA, 161-168. DOI: https://doi.org/10.1145/2645710.2645737
- Chen, N., Brooks, M., Kocielnik, R., Hong, R., Smith, J., Lin, S., Qu, Z., Aragon, C. Lariat: A visual analytics tool for social media researchers to explore Twitter datasets. Proceedings of the 50th Hawaii International Conference on System Sciences (HICSS), Data Analytics and Data Mining for Social Media Minitrack (2017)
- Homework assigned
- Reading reflection
- Resources
- Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. Making recommendations better: an analytic model for human-recommender interaction. In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06). ACM, New York, NY, USA, 1103-1108. DOI=http://dx.doi.org/10.1145/1125451.1125660
- Kevin Crowston and the Gravity Spy Team. 2017. Gravity Spy: Humans, Machines and The Future of Citizen Science. In Companion of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17 Companion). ACM, New York, NY, USA, 163-166. DOI: https://doi.org/10.1145/3022198.3026329
- Michael D. Ekstrand and Martijn C. Willemsen. 2016. Behaviorism is Not Enough: Better Recommendations through Listening to Users. In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16). ACM, New York, NY, USA, 221-224. DOI: https://doi.org/10.1145/2959100.2959179
- Jess Holbrook. Human Centered Machine Learning. Google Design Blog. 2017.
- Xavier Amatriain and Justin Basilico. Netflix Recommendations: Beyond the 5 stars. Netflix Tech Blog, 2012.
- Fabien Girardin. Experience design in the machine learning era. Medium, 2016.
- Brian Whitman. How music recommendation works - and doesn't work. Variogram, 2012.
- Paul Lamere. How good is Google's Instant Mix?. Music Machinery, 2011.
- Snyder, Jaime. Values in the Design of Visualizations. 2016 CSCW workshop on Human-Centered Data Science.
Week 9: November 23
- Human-centered data science in the wild
- community data science; data science for social good
- Assignments due
- Reading reflection
- A4: Crowdwork ethnography
- Agenda
- NO CLASS - work on your own
- Readings assigned
- Hill, B. M., Dailey, D., Guy, R. T., Lewis, B., Matsuzaki, M., & Morgan, J. T. (2017). Democratizing Data Science: The Community Data Science Workshops and Classes. In N. Jullien, S. A. Matei, & S. P. Goggins (Eds.), Big Data Factories: Scientific Collaborative approaches for virtual community data collection, repurposing, recombining, and dissemination. New York, New York: Springer Nature. [Preprint/Draft PDF]
- Bivens, R. and Haimson, O.L. 2016. Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers. Social Media + Society. 2, 4 (2016), 205630511667248. DOI:https://doi.org/10.1177/2056305116672486.
- Schlesinger, A. et al. 2017. Intersectional HCI: Engaging Identity through Gender, Race, and Class. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI ’17. (2017), 5412–5427. DOI:https://doi.org/10.1145/3025453.3025766.
- Homework assigned
- Reading reflection
- Resources
- Berney, Rachel, Bernease Herman, Gundula Proksch, Hillary Dawkins, Jacob Kovacs, Yahui Ma, Jacob Rich, and Amanda Tan. Visualizing Equity: A Data Science for Social Good Tool and Model for Seattle. Data Science for Social Good Conference, September 2017, Chicago, Illinois USA (2017).
- Sayamindu Dasgupta and Benjamin Mako Hill. Learning With Data: Designing for Community Introspection and Exploration. Position paper for Developing a Research Agenda for Human-Centered Data Science (a CSCW 2016 workshop).
Week 10: November 30
- Communicating methods, results, and implications
- translating for non-data scientists
- Assignments due
- Reading reflection
- Agenda
- communicating about your research effectively and honestly to different audiences
- publishing your research openly
- disseminating your research
- final project workshop
- Readings assigned
- Megan Risdal, Communicating data science: a guide to presenting your work. Kaggle blog, 2016.
- Marilynn Larkin, How to give a dynamic scientific presentation. Elsevier Connect, 2015.
- Homework assigned
- Reading reflection
- A5: Final presentation
- Resources
- Bart P. Knijnenburg, Martijn C. Willemsen, Zeno Gantner, Hakan Soncu, and Chris Newell. 2012. Explaining the user experience of recommender systems. User Modeling and User-Adapted Interaction 22, 4-5 (October 2012), 441-504. DOI=http://dx.doi.org/10.1007/s11257-011-9118-4
- Sean M. McNee, Nishikant Kapoor, and Joseph A. Konstan. 2006. Don't look stupid: avoiding pitfalls when recommending research papers. In Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work (CSCW '06). ACM, New York, NY, USA, 171-180. DOI=http://dx.doi.org/10.1145/1180875.1180903
- Megan Risdal, Communicating data science: Why and how to visualize information. Kaggle blog, 2016.
- Megan Risdal, Communicating data science: an interview with a storytelling expert. Kaggle blog, 2016.
- Richard Garber, Power of brief speeches: World War I and the Four Minute Men. Joyful Public Speaking, 2010.
- Brent Dykes, Data Storytelling: The Essential Data Science Skill Everyone Needs. Forbes, 2016.
Week 11: December 7
- Future of human centered data science
- course wrap up, final presentations
- Assignments due
- Reading reflection
- A5: Final presentation
- Agenda
- future directions of human centered data science
- final presentations
- Readings assigned
- none!
- Homework assigned
- none!
- Resources
- none!
Week 12: Finals Week
- NO CLASS
- A6: FINAL PROJECT REPORT DUE BY 11:59PM on Sunday, December 10
- LATE PROJECT SUBMISSIONS NOT ACCEPTED.
Assignments
For details on individual assignments, see HCDS (Fall 2017)/Assignments
Assignments consist of weekly in-class activities, weekly reading reflections, written assignments, and programming/data analysis assignments. Weekly in-class reading groups will discuss the assigned readings, and students are expected to have read the material in advance. In-class activities are posted to Canvas each week and may require time outside of class to complete.
Unless otherwise noted, all assignments are due before 5:00pm on the day of the following week's class.
Unless otherwise noted, all assignments are individual assignments.
Assignment timeline
- Assignments due every week
- In-class activities - 2 points (weekly): In-class activity output posted to Canvas (group or individual)
- Reading reflections - 2 points (weekly): Reading reflections posted to Canvas (individual)
- Scheduled assignments
- A1 - 5 points (due Week 4): Data curation (programming/analysis)
- A2 - 10 points (due Week 6): Sources of bias in data (programming/analysis)
- A3 - 10 points (due Week 7): Final project plan (written)
- A4 - 10 points (due Week 9): Crowdwork self-ethnography (written)
- A5 - 10 points (due Week 11): Final project presentation (oral, written)
- A6 - 15 points (due by 11:59pm on Sunday, December 10): Final project report (programming/analysis, written)
Policies
The following general policies apply to this course.
Respect
Students are expected to treat each other, and the instructors, with respect. Students are prohibited from engaging in any kind of harassment or derogatory behaviour, which includes offensive verbal comments or imagery related to gender, gender identity and expression, age, sexual orientation, disability, physical appearance, body size, race, ethnicity, or religion. In addition, students should not engage in any form of inappropriate physical contact or unwelcome sexual attention, and should respect each other's right to privacy with regard to their personal lives. If you feel that you (or another student) have been subject to a violation of this policy, please reach out to the instructors in whichever form you prefer.
The instructors are committed to providing a safe and healthy learning environment for students. As part of this, students are asked not to wear any clothing, jewelry, or any related medium for symbolic expression which depicts an indigenous person or cultural expression reappropriated as a mascot, logo, or caricature. These include, but are not limited to, iconography associated with the following sports teams:
- Chicago Blackhawks
- Washington Redskins
- Cleveland Indians
- Atlanta Braves
Attendance and participation
Students are expected to attend class regularly. If you run into a conflict that requires you to be absent (for example, medical issues), feel free to reach out to the instructors. We will do our best to ensure that you don't miss out, and we will treat your information as confidential.
If you miss a class session, please do not ask the professor or TA what you missed during class; check the website or ask a classmate (best bet: use Slack). Graded in-class activities cannot be made up if you miss a class session.
Grading
Active participation in class activities is one of the requirements of the course. You are expected to engage in group activities, class discussions, interactions with your peers, and constructive critiques as part of the course work. This will help you hone your communication and other professional skills. Working in groups or on teams is also an essential part of all data science disciplines, and as part of this course you will be asked to provide feedback on your peers' work.
The following grading scheme will be used to evaluate each of the 6 individual assignments (not reading reflections or graded in-class activities).
- 81-100% - Exceptional: The student demonstrated novelty or insight beyond the specific requirements of the assignment.
- 61-80% - Competent: The student competently and confidently addressed requirements to a good standard.
- 41-60% - Acceptable: The student met the absolute minimum requirements for the assignment.
- 21-40% - Partial: The student submitted something, but only addressed some of the assignment requirements, or the submitted work was of poor quality overall.
- 1-20% - Submitted: The student submitted something.
Individual assignments will have specific requirements listed on the assignment sheet, which the instructor will make available on the day the homework is assigned. If you have questions about how your assignment was graded, please see the TA or instructor.
Assignments and coursework
Grades will be determined as follows (a worked example of how these weights combine appears after the list):
- 20% in-class work
- 20% reading reflections
- 60% assignments
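As a hypothetical illustration only: the scores below are invented, and the direct mapping of the component point totals listed in the assignment timeline to these percentage weights is an assumption rather than an official formula.

```python
# Hypothetical worked example of the grade weighting above.
# All scores are invented; the points-to-percentage mapping is an assumption.

# (points earned, points possible) for each graded component
in_class    = (16, 20)   # weekly in-class activities, 2 points each
reflections = (18, 20)   # weekly reading reflections, 2 points each
assignments = (52, 60)   # A1-A6, worth 5 + 10 + 10 + 10 + 10 + 15 = 60 points

final = (
    0.20 * (in_class[0] / in_class[1])          # 20% in-class work
    + 0.20 * (reflections[0] / reflections[1])  # 20% reading reflections
    + 0.60 * (assignments[0] / assignments[1])  # 60% assignments
)
print(f"Final grade: {final:.1%}")  # Final grade: 86.0%
```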
You are expected to produce work in all of the assignments that reflects the highest standards of professionalism. For written documents, this means proper spelling, grammar, and formatting.
Late assignments will not be accepted; if your assignment is late, you will receive a zero score. Again, if you run into an issue that necessitates an extension, please reach out. Final projects cannot be turned in late and are not eligible for any extension whatsoever.
Students are expected to adhere to rules around academic integrity. Simply stated, academic integrity means that you are to do your own work in all of your classes, unless collaboration is part of an assignment as defined in the course. In any case, you must be responsible for citing and acknowledging outside sources of ideas in work you submit. Please be aware of the HCDE Department's and the UW's policies on this: HCDE Academic Conduct. These will be strictly enforced.
Disability and accommodations
As part of ensuring that the class is as accessible as possible, the instructors are entirely comfortable with you using whatever note-taking or recording method is most comfortable for you, including laptops and audio recording devices. The instructors will do their best to ensure that all slides and scripts/notes are available online immediately after a lecture has concluded. In addition, we will try to record the audio of lectures for students who are more comfortable with audiovisual notes than written ones.
If you require additional accommodations, please contact Disabled Student Services: 448 Schmitz, 206-543-8924 (V/TTY). If you have a letter from DSS indicating that you have a disability which requires academic accommodations, please present the letter to the instructors so we can discuss the accommodations you might need in the class. If you have any questions about this policy, reach out to the instructors directly.
Disclaimer
This syllabus and all associated assignments, requirements, deadlines and procedures are subject to change.