Difference between revisions of "Human Centered Data Science (Fall 2019)/Schedule"

From CommunityData
[[:File:HCDS 2019 week 6 slides.pdf|Day 6 slides]]

-->
;Critical approaches to data science: ''power, data, and society; ethics of crowdwork''

;Assignments due
* Reading reflection
* A3: Crowdwork ethnography

;Agenda
{{:HCDS (Fall 2019)/Day 7 plan}}

;Readings assigned (read both, reflect on one)
* Read: Baumer, E. P. S. (2017). ''[http://journals.sagepub.com/doi/pdf/10.1177/2053951717718854 Toward human-centered algorithm design].'' Big Data & Society.
* Read: Amershi, S., Cakmak, M., Knox, W. B., & Kulesza, T. (2014). ''[http://www.aaai.org/ojs/index.php/aimagazine/article/download/2513/2456 Power to the People: The Role of Humans in Interactive Machine Learning].'' AI Magazine, 35(4), 105.

;Homework assigned
* Reading reflection
* [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A4:_Final_project_plan|A4: Final project plan]]

;Resources
* Neff, G., Tanweer, A., Fiore-Gartland, B., & Osburn, L. (2017). Critique and Contribute: A Practice-Based Framework for Improving Critical Data Studies and Data Science. Big Data, 5(2), 85–97. https://doi.org/10.1089/big.2016.0050
* Lilly C. Irani and M. Six Silberman. 2013. ''[https://escholarship.org/content/qt10c125z3/qt10c125z3.pdf Turkopticon: interrupting worker invisibility in Amazon Mechanical Turk]''. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). DOI: https://doi.org/10.1145/2470654.2470742
* Bivens, R. and Haimson, O.L. 2016. ''[http://journals.sagepub.com/doi/pdf/10.1177/2056305116672486 Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers]''. Social Media + Society, 2(4). DOI: https://doi.org/10.1177/2056305116672486
* Schlesinger, A. et al. 2017. ''[http://arischlesinger.com/wp-content/uploads/2017/03/chi2017-schlesinger-intersectionality.pdf Intersectional HCI: Engaging Identity through Gender, Race, and Class].'' Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), 5412–5427. DOI: https://doi.org/10.1145/3025453.3025766

<br/>
<hr/>
[[:File:HCDS 2019 week 7 slides.pdf|Day 7 slides]]

-->
;Human centered algorithm design: ''algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits''

;Assignments due
* Week 5 reading reflection
* A3: Crowdwork ethnography

;Agenda
{{:HCDS (Fall 2019)/Day 6 plan}}

;Readings assigned
* Astrid Mager. 2012. ''[https://computingeverywhere.soc.northwestern.edu/wp-content/uploads/2017/07/Mager-Algorithmic-Ideology-Required.pdf Algorithmic ideology: How capitalist society shapes search engines]''. Information, Communication & Society, 15(5), 769–787. http://doi.org/10.1080/1369118X.2012.676056

;Homework assigned
* Reading reflection

;Resources
* Christian Sandvig, Kevin Hamilton, Karrie Karahalios, and Cedric Langbort (2014). ''[http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms].'' Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
* Shahriari, K., & Shahriari, M. (2017). ''[https://ethicsinaction.ieee.org/ IEEE standard review – Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems].'' Institute of Electrical and Electronics Engineers.
* ACM U.S. Public Policy Council. ''[https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf Statement on Algorithmic Transparency and Accountability].'' January 2017.
* ''[https://futureoflife.org/ai-principles/ Asilomar AI Principles].'' Future of Life Institute, 2017.
* Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). ''[http://www.fatml.org/resources/principles-for-accountable-algorithms Principles for Accountable Algorithms and a Social Impact Statement for Algorithms].'' FATML.org, 2018.
* Friedman, B., & Nissenbaum, H. (1996). ''[https://www.vsdesign.org/publications/pdf/64_friedman.pdf Bias in Computer Systems]''. ACM Transactions on Information Systems, 14(3), 330–347.
* Diakopoulos, N. (2014). ''Algorithmic Accountability Reporting: On the Investigation of Black Boxes.'' Tow Center for Digital Journalism.
* Nate Matias. 2017. ''[https://medium.com/@natematias/how-anyone-can-audit-facebooks-newsfeed-b879c3e29015 How Anyone Can Audit Facebook's Newsfeed].'' Medium.com.
* Hill, Kashmir. ''[https://gizmodo.com/facebook-figured-out-my-family-secrets-and-it-wont-tel-1797696163 Facebook figured out my family secrets, and it won't tell me how].'' Gizmodo, 2017.
* Blue, Violet. ''[https://www.engadget.com/2017/09/01/google-perspective-comment-ranking-system/ Google's comment-ranking system will be a hit with the alt-right].'' Engadget, 2017.
* Ingold, David and Soper, Spencer. ''[https://www.bloomberg.com/graphics/2016-amazon-same-day/ Amazon Doesn't Consider the Race of Its Customers. Should It?]'' Bloomberg, 2016.
* Paul Lamere. ''[https://musicmachinery.com/2011/05/14/how-good-is-googles-instant-mix/ How good is Google's Instant Mix?]'' Music Machinery, 2011.
* Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. ''[https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Machine Bias: Risk Assessments in Criminal Sentencing].'' ProPublica, May 2016.
* [https://www.perspectiveapi.com/#/ Google's Perspective API]

Revision as of 22:32, 4 September 2019

This page is a work in progress.


Week 1: September 26

Day 1 plan

Introduction to Human Centered Data Science
What is data science? What is human centered? What is human centered data science?
Assignments due
Agenda

HCDS (Fall 2019)/Day 1 plan

Readings assigned
Homework assigned
Resources




Week 2: October 3

Day 2 plan


Reproducibility and Accountability
data curation, preservation, documentation, and archiving; best practices for open scientific research
Assignments due
  • Week 1 reading reflection
  • A1: Data curation
Agenda

HCDS (Fall 2019)/Day 2 plan

Readings assigned
Homework assigned
Resources


Assignment 1 Data curation resources





Week 3: October 10

Day 3 plan

Interrogating datasets
causes and consequences of bias in data; best practices for selecting, describing, and implementing training data
Assignments due
  • Week 2 reading reflection
Agenda

HCDS (Fall 2019)/Day 3 plan

Readings assigned (Read both, reflect on one)
Homework assigned
  • Reading reflection
Resources




Week 4: October 17

Day 4 plan


Introduction to mixed-methods research
Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing


Assignments due
  • Week 3 reading reflection
  • A2: Bias in data


Agenda

HCDS (Fall 2019)/Day 4 plan


Readings assigned (Read both, reflect on one)


Homework assigned


Qualitative research methods resources
Wikipedia gender gap research resources
Crowdwork research resources




Week 5: October 24

Day 5 plan

Ethical considerations
privacy, informed consent and user treatment


Assignments due
  • Week 4 reading reflection
Agenda

HCDS (Fall 2019)/Day 5 plan


Readings assigned


Homework assigned
  • Reading reflection


Resources




Week 6: October 31

Day 6 plan

Critical approaches to data science
power, data, and society; ethics of crowdwork


Assignments due
  • Reading reflection
  • A3: Crowdwork ethnography


Agenda

HCDS (Fall 2019)/Day 7 plan

Readings assigned (read both, reflect on one)
Homework assigned


Resources




Week 7: November 7

Day 7 plan

Human centered algorithm design
algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits
Assignments due
  • Week 5 reading reflection
  • A3: Crowdwork ethnography
Agenda

HCDS (Fall 2019)/Day 6 plan

Readings assigned


Homework assigned
  • Reading reflection


Resources





Week 8: November 14

Day 8 plan

Human-centered algorithm design
algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems


Assignments due
  • Reading reflection


Agenda

HCDS (Fall 2019)/Day 8 plan

Readings assigned
Homework assigned
  • Reading reflection
Resources





Week 9: November 21

Day 9 plan

Data science for social good
Community-based and participatory approaches to data science; Using data science for society's benefit
Assignments due
  • Reading reflection
  • A4: Final project plan
Agenda

HCDS (Fall 2019)/Day 9 plan

Readings assigned
Homework assigned
  • Reading reflection
Resources





Week 10: November 28 (No Class Session)

Readings assigned
  • NONE
Homework assigned
  • A5: Final presentation
Resources





Week 11: December 5

Day 11 plan

Final presentations
course wrap-up and presentation of student projects


Assignments due
  • A5: Final presentation


Agenda

HCDS (Fall 2019)/Day 11 plan

Readings assigned
  • none!
Homework assigned
  • A6: Final project report (due 12/9 by 11:59pm)
Resources
  • none




Week 12: Finals Week (No Class Session)

  • NO CLASS
  • A6: FINAL PROJECT REPORT DUE BY 5:00PM on Tuesday, December 10
  • LATE PROJECT SUBMISSIONS NOT ACCEPTED.