User:Groceryheist/drafts/Data Science Syllabus
Revision as of 01:02, 9 February 2019
- Data Science and Organizational Communication
- Principal instructor
- Nate TeBlunthuis
- Course Catalog Description
- Fundamental principles of data science and its implications, including research ethics; data privacy; legal frameworks; algorithmic bias, transparency, fairness and accountability; data provenance, curation, preservation, and reproducibility; human computation; data communication and visualization; the role of data science in organizational context and the societal impacts of data science.
Course Description
The rise of "data science" reflects a broad and ongoing shift in how many teams, organizational leaders, communities of practice, and entire industries create and use knowledge. This class teaches "data science" as practiced by data-intensive knowledge workers, but also as it is positioned in historical, organizational, institutional, and societal contexts. Students will gain an appreciation for the technical and intellectual aspects of data science, consider critical questions about how data science is often practiced, and envision ethical and effective data science practice in their current and future organizational roles. The format of the class will be a mix of lecture, discussion, in-class activities, and qualitative and quantitative research assignments.
The course is designed around two high-stakes projects. In the first stage of the course, students will attend the Community Data Science Workshop (CDSW), a three-week intensive workshop on basic programming and data analysis skills that I help organize and teach. The first course project is to apply these skills, together with the conceptual material covered in the course so far, to conduct an original data analysis on a topic of the student's interest. The second high-stakes project is a critical analysis of an organization or work team. For this project, students will serve as consultants to an organizational unit involved in data science. Through interviews and workplace observations, they will gain an understanding of the socio-technical and organizational context of their team. They will then synthesize this understanding with the knowledge they gained from the course material to compose a report offering actionable insights to their team.
Learning Objectives
By the end of this course, students will be able to:
- Understand what it means to analyze large and complex data effectively and ethically, with attention to human, societal, organizational, and socio-technical contexts.
- Take into account the ethical, social, organizational, and legal considerations of data science in organizational and institutional contexts.
- Combine quantitative and qualitative data to generate critical insights into human behavior.
- Discuss and evaluate ethical, social, organizational and legal trade-offs of different data analysis, testing, curation, and sharing methods.
Schedule
Course schedule
Week 1:
- Introduction to Human Centered Data Science
- What is data science? What is human centered? What is human centered data science?
- Assignments due
- Fill out the pre-course survey
- Attend week 1 of CDSW
- Read: Provost, Foster, and Tom Fawcett. Data science and its relationship to big data and data-driven decision making. Big Data 1.1 (2013): 51-59.
- Readings assigned
- Read: Barocas, Solon and Nissenbaum, Helen. Big Data's End Run around Anonymity and Consent. In Privacy, Big Data, and the Public Good. 2014.
- Homework assigned
- Reading reflection
- Attend week 2 of CDSW
- Kling, Rob and Star, Susan Leigh. Human Centered Systems in the Perspective of Organizational and Social Informatics. 1997
Week 2:
- Ethical considerations
- privacy, informed consent and user treatment
- Assignments due
- Week 1 reading reflection
- Readings assigned
- Read: boyd, danah and Crawford, Kate, Six Provocations for Big Data (September 21, 2011). A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, September 2011. Available at SSRN: https://ssrn.com/abstract=1926431 or http://dx.doi.org/10.2139/ssrn.1926431
- Homework assigned
- Reading reflection
- Attend week 2 of CDSW
- Assignment 1: Data curation
Week 3
- Reproducibility and Accountability
- data curation, preservation, documentation, and archiving; best practices for open scientific research
- Assignments due
- Week 2 reading reflection
- Attend week 2 of CDSW
- Readings assigned
- Read: Duarte, N., Llanso, E., & Loup, A. (2018). Mixed Messages? The Limits of Automated Social Media Content Analysis. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 81, 106.
- Homework assigned
- Reading reflection
- Attend week 3 of CDSW
- Chapter 2 "Assessing Reproducibility" and Chapter 3 "The Basic Reproducible Workflow Template" from The Practice of Reproducible Research. University of California Press, 2018.
- sample code for API calls (view the notebook, download the notebook).
- See the datasets page for examples of well-documented and not-so-well documented open datasets.
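The sample notebook linked above is not reproduced here; as a stand-in, the sketch below shows what a documented, reproducible API call might look like. It assumes the public Wikimedia Pageviews REST API; the course's actual notebook may use a different API or a client library.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical example (not the course notebook): query the Wikimedia
# Pageviews REST API for per-article view counts. URL layout follows the
# public REST API's per-article endpoint.
ENDPOINT = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
            "{project}/{access}/{agent}/{article}/{granularity}/{start}/{end}")

def build_url(article, start, end, project="en.wikipedia.org",
              access="desktop", agent="user", granularity="monthly"):
    """Fill in the endpoint template; article titles use underscores."""
    return ENDPOINT.format(project=project, access=access, agent=agent,
                           article=article.replace(" ", "_"),
                           granularity=granularity, start=start, end=end)

def fetch_pageviews(article, start, end):
    """Return the list of per-period view records from the JSON response."""
    req = Request(build_url(article, start, end),
                  headers={"User-Agent": "hcds-student-example/0.1"})
    with urlopen(req) as resp:
        return json.load(resp)["items"]
```

Recording the exact request URL and parameters alongside the downloaded data is one of the documentation practices the reproducibility readings recommend.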
Week 4: October 18
- Interrogating datasets
- causes and consequences of bias in data; best practices for selecting, describing, and implementing training data
- Assignments due
- Reading reflection
- Readings assigned (Read both, reflect on one)
- Barley, S. R. (1986). Technology as an occasion for structuring: evidence from observations of ct scanners and the social order of radiology departments. Administrative Science Quarterly, 31(1), 78–108.
- Orlikowski, W. J., & Barley, S. R. (2001). Technology and institutions: what can research on information technology and research on organizations learn from each other? MIS Q., 25(2), 145–165. https://doi.org/10.2307/3250927
- Homework assigned
- Reading reflection
- A1: Data curation
- Resources
- Olteanu, A., Castillo, C., Diaz, F., & Kiciman, E. (2016). Social data: Biases, methodological pitfalls, and ethical boundaries.
- Brian N Larson. 2017. Gender as a Variable in Natural-Language Processing: Ethical Considerations. EthNLP, 3: 30–40.
- Bender, E. M., & Friedman, B. (2018). Data Statements for NLP: Toward Mitigating System Bias and Enabling Better Science. To appear in Transactions of the ACL.
- Isaac L. Johnson, Yilun Lin, Toby Jia-Jun Li, Andrew Hall, Aaron Halfaker, Johannes Schöning, and Brent Hecht. 2016. Not at Home on the Range: Peer Production and the Urban/Rural Divide. CHI '16. DOI: https://doi.org/10.1145/2858036.2858123
- Leo Graiden Stewart, Ahmer Arif, A. Conrad Nied, Emma S. Spiro, and Kate Starbird. 2017. Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 96 (December 2017), 23 pages. DOI: https://doi.org/10.1145/3134920
- Cristian Danescu-Niculescu-Mizil, Robert West, Dan Jurafsky, Jure Leskovec, and Christopher Potts. 2013. No country for old members: user lifecycle and linguistic change in online communities. In Proceedings of the 22nd international conference on World Wide Web (WWW '13). ACM, New York, NY, USA, 307-318. DOI: https://doi.org/10.1145/2488388.2488416
Week 5:
- Technology and Organizing
- Assignments due
- Week 4 reading reflection
- A1: Data curation
- Readings assigned
- Passi, S., & Jackson, S. J. (2018). Trust in Data Science: Collaboration, Translation, and Accountability in Corporate Data Science Projects. Proc. ACM Hum.-Comput. Interact., 2(CSCW), 136:1–136:28. https://doi.org/10.1145/3274405
- Homework Assigned
- Reading reflection
- A2: Bias in data
Week 6:
- Data science in Organizational Contexts
- Assignments due
- Week 5 reading reflection
- A2: Bias in data
- Readings assigned (Read both, reflect on one)
- Wang, Tricia. Why Big Data Needs Thick Data. Ethnography Matters, 2016.
- Shilad Sen, Margaret E. Giesel, Rebecca Gold, Benjamin Hillmann, Matt Lesicko, Samuel Naden, Jesse Russell, Zixiao (Ken) Wang, and Brent Hecht. 2015. Turkers, Scholars, "Arafat" and "Peace": Cultural Communities and Algorithmic Gold Standards. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '15)
Week 7: October 25
- Introduction to mixed-methods research
- Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing
- Assignments due
- Reading reflection
- Readings assigned (Read both, reflect on one)
- Donovan, J., Caplan, R., Matthews, J., & Hanson, L. (2018). Algorithmic accountability: A primer. Data & Society, 501(c).
- Homework assigned
- Reading reflection
- A3: Crowdwork ethnography
Week 8: November 1
- Interrogating algorithms
- algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits
- Assignments due
- Reading reflection
- Readings assigned
- Astrid Mager. 2012. Algorithmic ideology: How capitalist society shapes search engines. Information, Communication & Society 15, 5: 769–787. http://doi.org/10.1080/1369118X.2012.676056
- Homework assigned
- Reading reflection
- Resources
- Christian Sandvig, Kevin Hamilton, Karrie Karahalios, Cedric Langbort (2014/05/22) Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms. Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
- Shahriari, K., & Shahriari, M. (2017). IEEE standard review - Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems. Institute of Electrical and Electronics Engineers
- ACM US Policy Council Statement on Algorithmic Transparency and Accountability. January 2017.
- Asilomar AI Principles. Future of Life Institute, 2017.
- Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). Principles for Accountable Algorithms and a Social Impact Statement for Algorithms. Fatml.Org 2018.
- Friedman, B., & Nissenbaum, H. (1996). Bias in Computer Systems. ACM Trans. Inf. Syst., 14(3), 330–347.
- Diakopoulos, N. (2014). Algorithmic accountability reporting: On the investigation of black boxes. Tow Center for Digital Journalism, 1–33. https://doi.org/10.1002/ejoc.201200111
- Nate Matias, 2017. How Anyone Can Audit Facebook's Newsfeed. Medium.com
- Hill, Kashmir. Facebook figured out my family secrets, and it won't tell me how. Engadget, 2017.
- Blue, Violet. Google’s comment-ranking system will be a hit with the alt-right. Engadget, 2017.
- Ingold, David and Soper, Spencer. Amazon Doesn’t Consider the Race of Its Customers. Should It?. Bloomberg, 2016.
- Paul Lamere. How good is Google's Instant Mix?. Music Machinery, 2011.
- Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner. Machine Bias: Risk Assessment in Criminal Sentencing. Propublica, May 2018.
- Google's Perspective API
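As a toy illustration of one quantity an algorithmic audit might report (an assumed example, not drawn from the readings above), the demographic parity difference is the gap in positive-outcome rates between two groups; real audits, as Sandvig et al. discuss, involve far more than a single number.

```python
# Toy audit metric: demographic parity difference — the absolute gap in
# positive-prediction rates between two groups. Illustrative sketch only.

def positive_rate(predictions, groups, group):
    """Share of positive (1) predictions among members of `group`."""
    members = [p for p, g in zip(predictions, groups) if g == group]
    return sum(members) / len(members)

def demographic_parity_diff(predictions, groups):
    """|P(pred=1 | group a) - P(pred=1 | group b)|, assuming two groups."""
    a, b = sorted(set(groups))
    return abs(positive_rate(predictions, groups, a)
               - positive_rate(predictions, groups, b))

# Example: a classifier that favors group "A" (3/4 positives vs 1/4).
preds  = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
```

A value near zero means the two groups receive positive predictions at similar rates; larger values flag a disparity worth investigating.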
Week 9: November 8
- Critical approaches to data science
- power, data, and society; ethics of crowdwork
- Assignments due
- Reading reflection
- A3: Crowdwork ethnography
- Readings assigned (read both, reflect on one)
- Read: Baumer, E. P. S. (2017). Toward human-centered algorithm design. Big Data & Society.
- Read: Amershi, S., Cakmak, M., Knox, W. B., & Kulesza, T. (2014). Power to the People: The Role of Humans in Interactive Machine Learning. AI Magazine, 35(4), 105.
- Homework assigned
- Reading reflection
- A4: Final project plan
Week 10: November 15
- Human-centered algorithm design
- algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems
- Assignments due
- Reading reflection
- Readings assigned
- Hill, B. M., Dailey, D., Guy, R. T., Lewis, B., Matsuzaki, M., & Morgan, J. T. (2017). Democratizing Data Science: The Community Data Science Workshops and Classes. In N. Jullien, S. A. Matei, & S. P. Goggins (Eds.), Big Data Factories: Scientific Collaborative approaches for virtual community data collection, repurposing, recombining, and dissemination.
- Homework assigned
- Reading reflection
- Resources
- Ethical OS Toolkit and Risk Mitigation Checklist. EthicalOS.org.
- Morgan, J. 2016. Evaluating Related Articles recommendations. Wikimedia Research.
- Morgan, J. 2017. Comparing most read and trending edits for the top articles feature. Wikimedia Research.
- Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. User perception of differences in recommender algorithms. In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14).
- Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. Making recommendations better: an analytic model for human-recommender interaction. In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06).
- Sean M. McNee, Nishikant Kapoor, and Joseph A. Konstan. 2006. Don't look stupid: avoiding pitfalls when recommending research papers. In Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work (CSCW '06).
- Michael D. Ekstrand and Martijn C. Willemsen. 2016. Behaviorism is Not Enough: Better Recommendations through Listening to Users. In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16).
- Jess Holbrook. Human Centered Machine Learning. Google Design Blog. 2017.
- Anderson, Carl. The role of model interpretability in data science. Medium, 2016.
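Several of the resources above concern making model behavior legible to people. One simple model-agnostic interpretability technique is permutation feature importance: a feature matters if shuffling its values degrades the model's predictions. The sketch below is an assumed toy example, not taken from the course materials.

```python
import random

def mse(model, X, y):
    """Mean squared error of `model` (a callable row -> prediction)."""
    return sum((model(row) - t) ** 2 for row, t in zip(X, y)) / len(y)

def permutation_importance(model, X, y, feature_idx, seed=0):
    """Increase in MSE after shuffling one feature column.

    A fixed seed keeps the shuffle reproducible across runs.
    """
    rng = random.Random(seed)
    baseline = mse(model, X, y)
    column = [row[feature_idx] for row in X]
    rng.shuffle(column)
    X_shuffled = [row[:feature_idx] + [v] + row[feature_idx + 1:]
                  for row, v in zip(X, column)]
    return mse(model, X_shuffled, y) - baseline

# Tiny linear "model" that uses only feature 0; feature 1 is irrelevant.
model = lambda row: 2.0 * row[0]
X = [[float(i), float(i % 3)] for i in range(20)]
y = [2.0 * row[0] for row in X]
```

Shuffling the feature the model actually uses raises the error, while shuffling the ignored feature leaves it unchanged; this contrast is what makes the importance scores interpretable to a human reviewer.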
Week 11: November 22 (No Class Session)
- Data science for social good
- Community-based and participatory approaches to data science; Using data science for society's benefit
- Assignments due
- Reading reflection
- A4: Final project plan
- Readings assigned
- Berney, Rachel, Bernease Herman, Gundula Proksch, Hillary Dawkins, Jacob Kovacs, Yahui Ma, Jacob Rich, and Amanda Tan. Visualizing Equity: A Data Science for Social Good Tool and Model for Seattle. Data Science for Social Good Conference, September 2017, Chicago, Illinois USA (2017).
- Homework assigned
- Reading reflection
- Resources
- Daniela Aiello, Lisa Bates, et al. Eviction Lab Misses the Mark, ShelterForce, August 2018.
Week 12: November 29
- User experience and big data
- Design considerations for machine learning applications; human centered data visualization; data storytelling
- Assignments due
- Reading reflection
- Agenda
- Reading reflections discussion
- Feedback on Final Project Plans
- Guest lecture: Steven Drucker (Microsoft Research)
- UI patterns & UX considerations for ML/data-driven applications
- Final project presentation: what to expect
- In-class activity: final project peer review
- Readings assigned
- NONE
- Homework assigned
- A5: Final presentation
- Resources
- Fabien Girardin. Experience design in the machine learning era. Medium, 2016.
- Xavier Amatriain and Justin Basilico. Netflix Recommendations: Beyond the 5 stars. Netflix Tech Blog, 2012.
- Jess Holbrook. Human Centered Machine Learning. Google Design Blog. 2017.
- Bart P. Knijnenburg, Martijn C. Willemsen, Zeno Gantner, Hakan Soncu, and Chris Newell. 2012. Explaining the user experience of recommender systems. User Modeling and User-Adapted Interaction 22, 4-5 (October 2012), 441-504. DOI=http://dx.doi.org/10.1007/s11257-011-9118-4
- Patrick Austin, Facebook, Google, and Microsoft Use Design to Trick You Into Handing Over Your Data, New Report Warns. Gizmodo, 6/18/2018
- Brown, A., Tuor, A., Hutchinson, B., & Nichols, N. (2018). Recurrent Neural Network Attention Mechanisms for Interpretable System Log Anomaly Detection. arXiv preprint arXiv:1803.04967.
- Cremonesi, P., Elahi, M., & Garzotto, F. (2017). User interface patterns in recommendation-empowered content intensive multimedia applications. Multimedia Tools and Applications, 76(4), 5275-5309.
- Marilynn Larkin, How to give a dynamic scientific presentation. Elsevier Connect, 2015.
- Megan Risdal, Communicating data science: a guide to presenting your work. Kaggle blog, 2016.
- Megan Risdal, Communicating data science: Why and how to visualize information. Kaggle blog, 2016.
- Megan Risdal, Communicating data science: an interview with a storytelling expert. Kaggle blog, 2016.
- Brent Dykes, Data Storytelling: The Essential Data Science Skill Everyone Needs. Forbes, 2016.
Week 13: December 6
- Final presentations
- course wrap up, presentation of student projects
- Assignments due
- A5: Final presentation
- Agenda
- Student final presentations
- Course wrap-up
- Readings assigned
- none!
- Homework assigned
- A6: Final project report (by 11:59pm)
- Resources
- none
Week 14: Finals Week (No Class Session)
- NO CLASS
- A6: FINAL PROJECT REPORT DUE BY 11:59PM