Human Centered Data Science (Fall 2018)/Schedule: Difference between revisions

;Assignments due
* Reading reflection
* A3: Crowdwork ethnography


;Agenda
;Resources
* Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. ''[https://md.ekstrandom.net/research/pubs/listcmp/listcmp.pdf User perception of differences in recommender algorithms].'' In Proceedings of the 8th ACM Conference on Recommender Systems (RecSys '14). ACM, New York, NY, USA, 161-168. DOI: https://doi.org/10.1145/2645710.2645737
* Chen, N., Brooks, M., Kocielnik, R., Hong, R., Smith, J., Lin, S., Qu, Z., and Aragon, C. 2017. ''[https://aisel.aisnet.org/cgi/viewcontent.cgi?article=1254&context=hicss-50 Lariat: A visual analytics tool for social media researchers to explore Twitter datasets].'' In Proceedings of the 50th Hawaii International Conference on System Sciences (HICSS), Data Analytics and Data Mining for Social Media Minitrack.
* Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. ''[http://files.grouplens.org/papers/mcnee-chi06-hri.pdf Making recommendations better: an analytic model for human-recommender interaction].'' In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06). ACM, New York, NY, USA, 1103-1108. DOI: https://doi.org/10.1145/1125451.1125660
* Kevin Crowston and the Gravity Spy Team. 2017. ''[https://crowston.syr.edu/sites/crowston.syr.edu/files/cpa137-crowstonA.pdf Gravity Spy: Humans, Machines and The Future of Citizen Science].'' In Companion of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17 Companion). ACM, New York, NY, USA, 163-166. DOI: https://doi.org/10.1145/3022198.3026329
* Michael D. Ekstrand and Martijn C. Willemsen. 2016. ''[https://md.ekstrandom.net/research/pubs/behaviorism/BehaviorismIsNotEnough.pdf Behaviorism is Not Enough: Better Recommendations through Listening to Users].'' In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16). ACM, New York, NY, USA, 221-224. DOI: https://doi.org/10.1145/2959100.2959179
* Jess Holbrook. 2017. ''[https://medium.com/google-design/human-centered-machine-learning-a770d10562cd Human Centered Machine Learning].'' Google Design Blog.
* Xavier Amatriain and Justin Basilico. 2012. ''[https://medium.com/netflix-techblog/netflix-recommendations-beyond-the-5-stars-part-1-55838468f429 Netflix Recommendations: Beyond the 5 stars].'' Netflix Tech Blog.
* Fabien Girardin. 2016. ''[https://medium.com/@girardin/experience-design-in-the-machine-learning-era-e16c87f4f2e2 Experience design in the machine learning era].'' Medium.
* Brian Whitman. 2012. ''[https://notes.variogr.am/2012/12/11/how-music-recommendation-works-and-doesnt-work/ How music recommendation works - and doesn't work].'' Variogram.
* Paul Lamere. 2011. ''[https://musicmachinery.com/2011/05/14/how-good-is-googles-instant-mix/ How good is Google's Instant Mix?].'' Music Machinery.
* Jaime Snyder. 2016. ''[https://cscw2016hcds.files.wordpress.com/2015/10/snyder_hcds20162.pdf Values in the Design of Visualizations].'' CSCW 2016 Workshop on Human-Centered Data Science.



Revision as of 01:10, 17 September 2018

This page is a work in progress.


Week 1: September 27

Day 1 plan


Introduction to Human Centered Data Science
What is data science? What is human centered? What is human centered data science?
Assignments due
Agenda
  • Syllabus review
  • Pre-course survey results
  • What do we mean by data science?
  • What do we mean by human centered?
  • How does human centered design relate to data science?
  • Looking ahead: Week 2 assignments and topics


Readings assigned
  • Read: Barocas, Solon and Nissenbaum, Helen. Big Data's End Run around Anonymity and Consent. In Privacy, Big Data, and the Public Good. 2014. (PDF on Canvas)
Homework assigned
  • Reading reflection
Resources




Week 2: October 4

Day 2 plan


Ethical considerations
privacy, informed consent, and user treatment


Assignments due
  • Week 1 reading reflection
Agenda
  • Intro to assignment 1: Data Curation
  • A brief history of research ethics
  • Guest lecture: Javier Salido and Mark van Hollebeke, "A Practitioner's View of Privacy & Data Protection"
  • Guest lecture: Javier Salido, "Differential Privacy"
  • Contextual Integrity in data science
  • Week 2 reading reflection


Readings assigned
Homework assigned
  • Reading reflection
Resources




Week 3: October 11

Day 3 plan


Reproducibility and Accountability
data curation, preservation, documentation, and archiving; best practices for open scientific research
Assignments due
  • Week 2 reading reflection
Agenda
  • Six Provocations for Big Data: Review & Reflections
  • A primer on copyright, licensing, and hosting for code and data
  • Introduction to replicability, reproducibility, and open research
  • Reproducibility case study: fivethirtyeight.com
  • Group activity: assessing reproducibility in data journalism
  • Overview of Assignment 1: Data curation


Readings assigned
Homework assigned
Examples of well-documented open research projects
Examples of not-so-well-documented open research projects
Other resources





Week 4: October 18

Day 4 plan


Interrogating datasets
bias in data; best practices for selecting, describing, and implementing training data


Assignments due
  • Reading reflection
  • A1: Data curation
Agenda
  • Final project: goals, timeline, and deliverables
  • Overview of assignment 2: Bias in data
  • Reading reflections review
  • Sources of bias in data collection and processing
  • In-class exercise: assessing bias in training data


Readings assigned
  • Read: Duarte, N., Llanso, E., & Loup, A. (2018). Mixed Messages? The Limits of Automated Social Media Content Analysis. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 81, 106. PDF: http://proceedings.mlr.press/v81/duarte18a.html
  • Read: Bender, E. M., & Friedman, B. (2018). Data Statements for NLP: Toward Mitigating System Bias and Enabling Better Science. To appear in Transactions of the ACL. PDF: https://openreview.net/forum?id=By4oPeX9f


Homework assigned
  • Reading reflection
  • A2: Bias in data


Resources




Week 5: October 25

Day 5 plan


Introduction to mixed-methods research
Big data vs thick data; qualitative research in data science


Assignments due
  • Reading reflection


Agenda
  • Assignment 1 review & reflection
  • Week 4 reading reflection discussion
  • Survey of qualitative research methods
  • Mixed-methods case study #1: The Wikipedia Gender Gap: causes & consequences
  • In-class activity: Automated Gender Recognition scenarios
  • Introduction to ethnography
  • Ethnographic research case study: Structured data on Wikimedia Commons
  • Introduction to crowdwork
  • Overview of Assignment 3: Crowdwork ethnography


Readings assigned
Homework assigned


Resources




Week 6: November 1

Day 6 plan


Interrogating algorithms
algorithmic transparency and accountability; methods and contexts for algorithmic audits
Assignments due
  • Reading reflection
  • A2: Bias in data
Agenda


Readings assigned
Homework assigned
  • Reading reflection


Resources





Week 7: November 8

Day 7 plan

Critical approaches to data science
power, data & society, ethics of crowdwork


Assignments due
  • Reading reflection


Agenda
  • Guest lecture: Rochelle LaPlante


Readings assigned (read both, reflect on one)
Homework assigned


Resources




Week 8: November 15

Day 8 plan


Human-centered algorithm design
human-centered methods for designing and evaluating algorithmic systems


Assignments due
  • Reading reflection


Agenda
  • Final project overview & examples
  • Guest Lecture: Kelly Franznick, Blink UX
  • Reading reflections
  • Human-centered algorithm design
      • design process
      • user-driven evaluation
      • design patterns & anti-patterns


Readings assigned
Homework assigned
  • Reading reflection


Resources




Week 9: November 22 (No Class Session)

Day 9 plan

Data science for social good
TBD
Assignments due
  • Reading reflection
  • A4: Crowdwork ethnography
Agenda


Readings assigned
Homework assigned
  • Reading reflection
Resources




Week 10: November 29

Day 10 plan

Day 10 slides

User experience and big data


Assignments due
  • Reading reflection


Agenda
  • Reading reflections discussion
  • Feedback on Final Project Plans
  • Guest lecture: Steven Drucker (Microsoft Research)
  • UI patterns & UX considerations for ML/data-driven applications
  • Final project presentation: what to expect
  • In-class activity: final project peer review


Readings assigned


Homework assigned
  • Reading reflection
  • A5: Final presentation
Resources




Week 11: December 6

Day 11 plan

Final presentations
course wrap-up, presentation of student projects


Assignments due
  • Reading reflection
  • A5: Final presentation


Agenda
  • Student final presentations
  • Course wrap-up


Readings assigned
  • none!
Homework assigned
  • none!
Resources
  • none!




Week 12: Finals Week (No Class Session)

  • NO CLASS
  • A6: FINAL PROJECT REPORT DUE BY 11:59PM on Sunday, December 9
  • LATE PROJECT SUBMISSIONS NOT ACCEPTED.