Human Centered Data Science (Fall 2019)/Schedule: Difference between revisions



;Resources
* Princeton Dialogues on AI & Ethics: ''[https://aiethics.princeton.edu/case-studies/ Case studies]''
* Aragon, C. et al. (2016). [https://cscw2016hcds.files.wordpress.com/2015/10/cscw_2016_human-centered-data-science_workshop.pdf ''Developing a Research Agenda for Human-Centered Data Science.''] Human Centered Data Science workshop, CSCW 2016.
* Kling, Rob and Star, Susan Leigh. [https://scholarworks.iu.edu/dspace/bitstream/handle/2022/1798/wp97-04B.html ''Human Centered Systems in the Perspective of Organizational and Social Informatics.''] 1997.
;Resources
* Hickey, Walt. [https://fivethirtyeight.com/features/the-bechdel-test-checking-our-work/ ''The Bechdel Test: Checking Our Work'']. FiveThirtyEight, 2014.
* GroupLens, ''[https://grouplens.org/datasets/movielens/ MovieLens datasets]''
* J. Priem, D. Taraborelli, P. Groth, C. Neylon (2010), ''[http://altmetrics.org/manifesto Altmetrics: A manifesto]'', 26 October 2010.
* Chapter 2 [https://www.practicereproducibleresearch.org/core-chapters/2-assessment.html "Assessing Reproducibility"] and Chapter 3 [https://www.practicereproducibleresearch.org/core-chapters/3-basic.html "The Basic Reproducible Workflow Template"] from ''The Practice of Reproducible Research''. University of California Press, 2018.
* Halfaker, A., Geiger, R. S., Morgan, J. T., & Riedl, J. (2013). ''[https://www-users.cs.umn.edu/~halfaker/publications/The_Rise_and_Decline/halfaker13rise-preprint.pdf The rise and decline of an open collaboration system: How Wikipedia’s reaction to popularity is causing its decline].'' American Behavioral Scientist, 57(5), 664-688
* TeBlunthuis, N., Shaw, A., and Hill, B.M. (2018). Revisiting "The rise and decline" in a population of peer production projects. In ''Proceedings of the 2018 ACM Conference on Human Factors in Computing Systems (CHI '18)''. https://doi.org/10.1145/3173574.3173929
* Press, Gil. [https://www.forbes.com/sites/gilpress/2016/03/23/data-preparation-most-time-consuming-least-enjoyable-data-science-task-survey-says/#2608257f6f63 ''Cleaning Big Data: Most Time-Consuming, Least Enjoyable Data Science Task, Survey Says.''] Forbes, 2016.
* Gebru, T., Morgenstern, J., Vecchione, B., Vaughan, J. W., Wallach, H., Daumé III, H., & Crawford, K. (2018). [https://www.fatml.org/media/documents/datasheets_for_datasets.pdf Datasheets for datasets]. arXiv preprint arXiv:1803.09010.
* Olteanu, A., Castillo, C., Diaz, F., & Kıcıman, E. (2019). ''[https://www.frontiersin.org/articles/10.3389/fdata.2019.00013/pdf Social Data: Biases, Methodological Pitfalls, and Ethical Boundaries].'' Frontiers in Big Data, 2, 13. https://doi.org/10.3389/fdata.2019.00013
* Rose Eveleth ''[https://www.vox.com/the-highlight/2019/10/1/20887003/tech-technology-evolution-natural-inevitable-ethics The biggest lie tech people tell themselves — and the rest of us].'' October 8, 2019, Vox.com.
* Rani Molla ''[https://www.vox.com/2019/2/8/18211794/government-data-internet The government is using the wrong data to make crucial decisions about the internet].'' February 8, 2019, Vox.com.
* Isaac L. Johnson, Yilun Lin, Toby Jia-Jun Li, Andrew Hall, Aaron Halfaker, Johannes Schöning, and Brent Hecht. 2016. Not at Home on the Range: Peer Production and the Urban/Rural Divide. DOI: https://doi.org/10.1145/2858036.2858123
* Leo Graiden Stewart, Ahmer Arif, A. Conrad Nied, Emma S. Spiro, and Kate Starbird. 2017. Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse. CSCW 2017. DOI: https://doi.org/10.1145/3134920
* Lada A. Adamic and Natalie Glance. 2005. The political blogosphere and the 2004 U.S. election: divided they blog. (LinkKDD '05). DOI=http://dx.doi.org/10.1145/1134271.1134277
* Cristian Danescu-Niculescu-Mizil, Robert West, Dan Jurafsky, Jure Leskovec, and Christopher Potts. 2013. No country for old members: user lifecycle and linguistic change in online communities. (WWW '13). DOI: https://doi.org/10.1145/2488388.2488416
<br/>
<hr/>


;Introduction to qualitative and mixed-methods research: ''Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing''


;Assignments due


;Agenda
* Reading reflection reflection
* Overview of qualitative research
* Introduction to ethnography
* In-class activity: explaining art to aliens
* Mixed methods research and data science
* An introduction to crowdwork
* Overview of assignment 3: Crowdwork ethnography


;Homework assigned
* [[Human_Centered_Data_Science_(Fall_2019)/Assignments#A3:_Crowdwork_ethnography|A3: Crowdwork ethnography]]


;Resources
* Ford, D., Smith, J., Guo, P. J., & Parnin, C. (2016). ''[http://denaeford.me/papers/stack-overflow-barriers-FSE-2016.pdf Paradise unplugged: Identifying barriers for female participation on Stack Overflow]''. Proceedings of the ACM SIGSOFT Symposium on the Foundations of Software Engineering (FSE 2016), 846–857. https://doi.org/10.1145/2950290.2950331
* Singer, P., Lemmerich, F., West, R., Zia, L., Wulczyn, E., Strohmaier, M., & Leskovec, J. (2017, April). ''[https://arxiv.org/pdf/1702.05379.pdf Why we read Wikipedia]''. In Proceedings of the 26th International Conference on World Wide Web.
* [https://meta.wikimedia.org/wiki/Research:The_role_of_citations_in_how_readers_evaluate_Wikipedia_articles/Trust_taxonomy Taxonomy of reasons why people trust/distrust Wikipedia], Jonathan Morgan, Wikimedia Research report, May 2019.
* Ladner, S. (2016). ''[http://www.practicalethnography.com/ Practical ethnography: A guide to doing ethnography in the private sector]''. Routledge.
* Spradley, J. P. (2016). ''[https://www.waveland.com/browse.php?t=688 The ethnographic interview]''. Waveland Press.
* Spradley, J. P. (2016). ''[https://www.waveland.com/browse.php?t=689 Participant Observation]''. Waveland Press.
* Eriksson, P., & Kovalainen, A. (2015). ''[http://study.sagepub.com/sites/default/files/Eriksson%20and%20Kovalainen.pdf Ch 12: Ethnographic Research]''. In Qualitative methods in business research: A practical guide to social research. Sage.
* ''[http://www.wou.edu/~girodm/library/zork.pdf Qualitative research activity: categorizing student responses].'' Mark Girod, Western Oregon University
* ''[https://cmci.colorado.edu/~palen/EmpiricalEpistemologiesforHCC-7.pdf Empirical Epistemologies Applied to Human-Centered Computing Research].'' Leysia Palen, University of Colorado Boulder, November 16, 2014.
<!--
* Usability.gov, ''[https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html System usability scale]''.
* Nielsen, Jakob (2000). ''[https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/ Why you only need to test with five users]''. nngroup.com.
-->


;Crowdwork research resources
* WeAreDynamo contributors. ''[https://web.archive.org/web/20190702152012/http://wiki.wearedynamo.org/index.php?title=Basics_of_how_to_be_a_good_requester How to be a good requester]'' and ''[https://web.archive.org/web/20181122143506/http://wiki.wearedynamo.org/index.php/Guidelines_for_Academic_Requesters Guidelines for Academic Requesters]''. Wearedynamo.org
<br/>
<hr/>

;Agenda
* Reading reflection review
* Guest lecture
* A2 retrospective
* Final project deliverables and timeline
* A brief history of research ethics in the United States


;Homework assigned
* Read and reflect: Gray, M. L., & Suri, S. (2019). ''Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass''. Eamon Dolan Books. (PDF available on Canvas)

;Resources
* Javier Salido (2012). ''[http://download.microsoft.com/download/D/1/F/D1F0DFF5-8BA9-4BDF-8924-7816932F6825/Differential_Privacy_for_Everyone.pdf Differential Privacy for Everyone].'' Microsoft Corporation Whitepaper.
* Markham, Annette and Buchanan, Elizabeth. [https://aoir.org/reports/ethics2.pdf ''Ethical Decision-Making and Internet Researchers.''] Association of Internet Researchers, 2012.
* Kelley, P. G., Bresee, J., Cranor, L. F., & Reeder, R. W. (2009). ''[http://cups.cs.cmu.edu/soups/2009/proceedings/a4-kelley.pdf A “nutrition label” for privacy.]'' Proceedings of the 5th Symposium on Usable Privacy and Security (SOUPS ’09). https://doi.org/10.1145/1572532.1572538
* Warncke-Wang, M., Cosley, D., & Riedl, J. (2013). ''[https://opensym.org/wsos2013/proceedings/p0202-warncke.pdf Tell me more: An actionable quality model for Wikipedia].'' Proceedings of the 9th International Symposium on Open Collaboration, WikiSym + OpenSym 2013. https://doi.org/10.1145/2491055.2491063
<!--
* Hill, Kashmir. [https://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science/#6a01653e197c ''Facebook Manipulated 689,003 Users' Emotions For Science.''] Forbes, 2014.
* Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock [http://www.pnas.org/content/111/24/8788.full ''Experimental evidence of massive-scale emotional contagion through social networks.''] PNAS 2014 111 (24) 8788-8790; published ahead of print June 2, 2014.
* Zetter, Kim. [https://www.wired.com/2012/06/wmw-arvind-narayanan/ ''Arvind Narayanan Isn’t Anonymous, and Neither Are You.''] WIRED, 2012.
* Gray, Mary. [http://culturedigitally.org/2014/07/when-science-customer-service-and-human-subjects-research-collide-now-what/ ''When Science, Customer Service, and Human Subjects Research Collide. Now What?''] Culture Digitally, 2014.
* Tene, Omer and Polonetsky, Jules. [https://www.stanfordlawreview.org/online/privacy-paradox-privacy-and-big-data/ ''Privacy in the Age of Big Data.''] Stanford Law Review, 2012.
* Dwork, Cynthia. [https://www.microsoft.com/en-us/research/wp-content/uploads/2008/04/dwork_tamc.pdf ''Differential Privacy: A survey of results'']. Theory and Applications of Models of Computation, 2008.
* Hsu, Danny. [http://blog.datasift.com/2015/04/09/techniques-to-anonymize-human-data/ ''Techniques to Anonymize Human Data.''] Data Sift, 2015.
-->


<br/>

;Agenda
* Reading reflections
* Assignment 3 review
* Guest lecture: Stefania Druga
* In-class activity
* Introduction to assignment 4: Final project proposal
;Resources
* Lilly C. Irani and M. Six Silberman. 2013. ''[https://escholarship.org/content/qt10c125z3/qt10c125z3.pdf Turkopticon: interrupting worker invisibility in amazon mechanical turk]''. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). DOI: https://doi.org/10.1145/2470654.2470742
* Salehi, Niloufar, Lilly C. Irani, Michael S. Bernstein, Ali Alkhatib, Eva Ogbe, and Kristy Milland. ''[https://hci.stanford.edu/publications/2015/dynamo/DynamoCHI2015.pdf We are dynamo: Overcoming stalling and friction in collective action for crowd workers]''. In Proceedings of the 33rd annual ACM conference on human factors in computing systems, pp. 1621-1630. ACM, 2015.
* Hill, B. M., Dailey, D., Guy, R. T., Lewis, B., Matsuzaki, M., & Morgan, J. T. (2017). ''[https://mako.cc/academic/hill_etal-cdsw_chapter-DRAFT.pdf Democratizing Data Science: The Community Data Science Workshops and Classes]''. In N. Jullien, S. A. Matei, & S. P. Goggins (Eds.), Big Data Factories: Scientific Collaborative approaches for virtual community data collection, repurposing, recombining, and dissemination. New York, New York: Springer Nature. https://doi.org/10.1007/978-3-319-59186-5_9
* Ingold, David and Soper, Spencer. ''[https://www.bloomberg.com/graphics/2016-amazon-same-day/ Amazon Doesn’t Consider the Race of Its Customers. Should It?].'' Bloomberg, 2016.
* Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner. ''[https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Machine Bias: Risk Assessment in Criminal Sentencing]''. ProPublica, May 2016.
;Resources
* Christian Sandvig, Kevin Hamilton, Karrie Karahalios, Cedric Langbort (2014/05/22) ''[http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms].'' Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
* Shahriari, K., & Shahriari, M. (2017). ''[https://ethicsinaction.ieee.org/ IEEE standard review - Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems].'' Institute of Electrical and Electronics Engineers
* ACM US Policy Council ''[https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf Statement on Algorithmic Transparency and Accountability].'' January 2017.
* ''[https://futureoflife.org/ai-principles/ Asilomar AI Principles].'' Future of Life Institute, 2017.
* Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). ''[http://www.fatml.org/resources/principles-for-accountable-algorithms Principles for Accountable Algorithms and a Social Impact Statement for Algorithms].'' Fatml.Org 2018.
* Friedman, B., & Nissenbaum, H. (1996). ''[https://www.vsdesign.org/publications/pdf/64_friedman.pdf Bias in Computer Systems]''. ACM Trans. Inf. Syst., 14(3), 330–347.
* Nate Matias, 2017. ''[https://medium.com/@natematias/how-anyone-can-audit-facebooks-newsfeed-b879c3e29015 How Anyone Can Audit Facebook's Newsfeed].'' Medium.com
* Hill, Kashmir. ''[https://gizmodo.com/facebook-figured-out-my-family-secrets-and-it-wont-tel-1797696163 Facebook figured out my family secrets, and it won't tell me how].'' Gizmodo, 2017.
* Blue, Violet. ''[https://www.engadget.com/2017/09/01/google-perspective-comment-ranking-system/ Google’s comment-ranking system will be a hit with the alt-right].'' Engadget, 2017.
* [https://www.perspectiveapi.com/#/ Google's Perspective API]
* Anderson, Carl. ''[https://medium.com/@leapingllamas/the-role-of-model-interpretability-in-data-science-703918f64330 The role of model interpretability in data science].'' Medium, 2016.
* Morgan, J. 2016. ''[https://meta.wikimedia.org/wiki/Research:Evaluating_RelatedArticles_recommendations Evaluating Related Articles recommendations]''. Wikimedia Research.
* Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner. ''[https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Machine Bias: Risk Assessment in Criminal Sentencing]''. ProPublica, May 2016.
* Morgan, J. 2017. ''[https://meta.wikimedia.org/wiki/Research:Comparing_most_read_and_trending_edits_for_Top_Articles_feature Comparing most read and trending edits for the top articles feature]''. Wikimedia Research.
* Mitchell, M., Wu, S., Zaldivar, A., Barnes, P., Vasserman, L., Hutchinson, B., … Gebru, T. (2019). Model Cards for Model Reporting. Proceedings of the Conference on Fairness, Accountability, and Transparency, 220–229. https://doi.org/10.1145/3287560.3287596
*Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. ''[https://md.ekstrandom.net/research/pubs/listcmp/listcmp.pdf User perception of differences in recommender algorithms].'' In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14).
* Hosseini, H., Kannan, S., Zhang, B., & Poovendran, R. (2017). Deceiving Google’s Perspective API Built for Detecting Toxic Comments. ArXiv:1702.08138 [Cs]. Retrieved from http://arxiv.org/abs/1702.08138
* Binns, R., Veale, M., Van Kleek, M., & Shadbolt, N. (2017). Like trainer, like bot? Inheritance of bias in algorithmic content moderation. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 10540 LNCS, 405–415. https://doi.org/10.1007/978-3-319-67256-4_32
* Borkan, D., Dixon, L., Sorensen, J., Thain, N., & Vasserman, L. (2019). Nuanced Metrics for Measuring Unintended Bias with Real Data for Text Classification. Companion Proceedings of The 2019 World Wide Web Conference, 491–500. https://doi.org/10.1145/3308560.3317593
* Zhang, J., Chang, J., Danescu-Niculescu-Mizil, C., Dixon, L., Hua, Y., Taraborelli, D., & Thain, N. (2018). Conversations Gone Awry: Detecting Early Signs of Conversational Failure. Proceedings of ACL 2018, 1350–1361. https://doi.org/10.18653/v1/p18-1125
* Miriam Redi, Besnik Fetahu, Jonathan T. Morgan, and Dario Taraborelli. 2019. ''[https://arxiv.org/pdf/1902.11116.pdf Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia’s Verifiability].'' The Web Conference.


<br/>


;Resources
* Ethical OS ''[https://ethicalos.org/wp-content/uploads/2018/08/Ethical-OS-Toolkit-2.pdf Toolkit]'' and ''[https://ethicalos.org/wp-content/uploads/2018/08/EthicalOS_Check-List_080618.pdf Risk Mitigation Checklist]''. EthicalOS.org.
* Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. ''[http://files.grouplens.org/papers/mcnee-chi06-hri.pdf Making recommendations better: an analytic model for human-recommender interaction].'' In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06).
* Sean M. McNee, Nishikant Kapoor, and Joseph A. Konstan. 2006. ''[http://files.grouplens.org/papers/p171-mcnee.pdf Don't look stupid: avoiding pitfalls when recommending research papers].'' In Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work (CSCW '06).
* Michael D. Ekstrand and Martijn C. Willemsen. 2016. ''[https://md.ekstrandom.net/research/pubs/behaviorism/BehaviorismIsNotEnough.pdf Behaviorism is Not Enough: Better Recommendations through Listening to Users].'' In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16).
* Shahriari, K., & Shahriari, M. (2017). ''[https://ethicsinaction.ieee.org/ IEEE standard review - Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems].'' Institute of Electrical and Electronics Engineers
* Jess Holbrook. ''[https://medium.com/google-design/human-centered-machine-learning-a770d10562cd Human Centered Machine Learning].'' Google Design Blog. 2017.
* ACM US Policy Council ''[https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf Statement on Algorithmic Transparency and Accountability].'' January 2017.
* Anderson, Carl. ''[https://medium.com/@leapingllamas/the-role-of-model-interpretability-in-data-science-703918f64330 The role of model interpretability in data science].'' Medium, 2016.
* Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). ''[http://www.fatml.org/resources/principles-for-accountable-algorithms Principles for Accountable Algorithms and a Social Impact Statement for Algorithms].'' Fatml.Org 2018.  
*Fabien Girardin. ''[https://medium.com/@girardin/experience-design-in-the-machine-learning-era-e16c87f4f2e2 Experience design in the machine learning era].'' Medium, 2016.
* Morgan, J. 2016. ''[https://meta.wikimedia.org/wiki/Research:Evaluating_RelatedArticles_recommendations Evaluating Related Articles recommendations]''. Wikimedia Research.
* Xavier Amatriain and Justin Basilico. ''[https://medium.com/netflix-techblog/netflix-recommendations-beyond-the-5-stars-part-1-55838468f429 Netflix Recommendations: Beyond the 5 stars].'' Netflix Tech Blog, 2012.
* Morgan, J. 2017. ''[https://meta.wikimedia.org/wiki/Research:Comparing_most_read_and_trending_edits_for_Top_Articles_feature Comparing most read and trending edits for the top articles feature]''. Wikimedia Research.
*Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. ''[https://md.ekstrandom.net/research/pubs/listcmp/listcmp.pdf User perception of differences in recommender algorithms].'' In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14).
* Bart P. Knijnenburg, Martijn C. Willemsen, Zeno Gantner, Hakan Soncu, and Chris Newell. 2012. ''[https://pure.tue.nl/ws/files/3484177/724656348730405.pdf Explaining the user experience of recommender systems].'' User Modeling and User-Adapted Interaction 22, 4-5 (October 2012), 441-504. DOI=http://dx.doi.org/10.1007/s11257-011-9118-4
* Patrick Austin, ''[https://gizmodo.com/facebook-google-and-microsoft-use-design-to-trick-you-1827168534 Facebook, Google, and Microsoft Use Design to Trick You Into Handing Over Your Data, New Report Warns].'' Gizmodo, 6/18/2018
* Cremonesi, P., Elahi, M., & Garzotto, F. (2017). ''[https://core.ac.uk/download/pdf/74313597.pdf User interface patterns in recommendation-empowered content intensive multimedia applications].'' Multimedia Tools and Applications, 76(4), 5275-5309.
<br/>
<hr/>


;Agenda
* Filling out course evaluation
* Week 8 in-class activity report out
* End of quarter logistics
* Final project presentations and reports
* Guest lecture: Rich Caruana, Microsoft Research
* In-class activity (InterpretML): Harsha Nori, Microsoft


;Homework assigned
* Read and reflect: Alkhatib, A., & Bernstein, M. (2019). ''[https://hci.stanford.edu/publications/2019/streetlevelalgorithms/streetlevelalgorithms-chi2019.pdf Street-Level Algorithms: A Theory at the Gaps Between Policy and Decisions]''. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3290605.3300760
* Read and reflect: Passi, S., & Jackson, S. J. (2018). ''[https://dl.acm.org/citation.cfm?doid=3290265.3274405 Trust in Data Science: Collaboration, Translation, and Accountability in Corporate Data Science Projects].'' Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1–28. https://doi.org/10.1145/3274405 ([https://sjackson.infosci.cornell.edu/Passi&Jackson_TrustinDataScience(CSCW2018).pdf ACCESS PDF HERE])
* [[Human_Centered_Data_Science_(Fall_2019)/Assignments#A7:_Final_project_report|A7: Final project report]]


;Resources
<!--
* Rich Caruana, Harsha Nori, Samuel Jenkins, Paul Koch, Ester de Nicolas. 2019. ''InterpretML software toolkit'' ([https://github.com/interpretml/interpret github repo], [https://www.microsoft.com/en-us/research/blog/creating-ai-glass-boxes-open-sourcing-a-library-to-enable-intelligibility-in-machine-learning/ blog post])
* Daniela Aiello, Lisa Bates, et al. [https://shelterforce.org/2018/08/22/eviction-lab-misses-the-mark/ Eviction Lab Misses the Mark], ShelterForce, August 2018.
* Partnership on AI, 2019 ''[https://www.partnershiponai.org/report-on-machine-learning-in-risk-assessment-tools-in-the-u-s-criminal-justice-system/ Report on Algorithmic Risk Assessment Tools in the U.S. Criminal Justice System].''
-->
* Morgan, J. T., 2019. ''[https://figshare.com/articles/Ethical_Human_Centered_AI/8044553 Ethical and Human-centered AI at Wikimedia]''. Wikimedia Research 2030​.


<br/>

Latest revision as of 19:22, 27 November 2019

This page is a work in progress.


Week 1: September 26

Introduction to Human Centered Data Science
What is data science? What is human centered? What is human centered data science?
Assignments due
Agenda
  • Syllabus review
  • Pre-course survey results
  • What do we mean by data science?
  • What do we mean by human centered?
  • How does human centered design relate to data science?
  • In-class activity
  • Intro to assignment 1: Data Curation
Homework assigned
  • Read and reflect on both:
Resources




Week 2: October 3

Reproducibility and Accountability
data curation, preservation, documentation, and archiving; best practices for open scientific research
Assignments due
  • Week 1 reading reflection
  • A1: Data curation
Agenda
  • Reading reflection discussion
  • Assignment 1 review & reflection
  • A primer on copyright, licensing, and hosting for code and data
  • Introduction to replicability, reproducibility, and open research
  • In-class activity
  • Intro to assignment 2: Bias in data
Homework assigned
Resources




Week 3: October 10

Interrogating datasets
causes and consequences of bias in data; best practices for selecting, describing, and implementing training data
Assignments due
  • Week 2 reading reflection
Agenda
  • Reading reflection review
  • Sources and consequences of bias in data collection, processing, and re-use
  • In-class activity
Homework assigned
  • Read both, reflect on one:
Resources




Week 4: October 17

Introduction to qualitative and mixed-methods research
Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing
Assignments due
  • Reading reflection
  • A2: Bias in data
Agenda
  • Reading reflection reflection
  • Overview of qualitative research
  • Introduction to ethnography
  • In-class activity: explaining art to aliens
  • Mixed methods research and data science
  • An introduction to crowdwork
  • Overview of assignment 3: Crowdwork ethnography
Homework assigned
Resources





Week 5: October 24

Research ethics for big data
privacy, informed consent and user treatment
Assignments due
  • Reading reflection
Agenda
  • Reading reflection review
  • Guest lecture
  • A2 retrospective
  • Final project deliverables and timeline
  • A brief history of research ethics in the United States


Homework assigned
  • Read and reflect: Gray, M. L., & Suri, S. (2019). Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Eamon Dolan Books. (PDF available on Canvas)
Resources




Week 6: October 31

Data science and society
power, data, and society; ethics of crowdwork
Assignments due
  • Reading reflection
  • A3: Crowdwork ethnography
Agenda
  • Reading reflections
  • Assignment 3 review
  • Guest lecture: Stefania Druga
  • In-class activity
  • Introduction to assignment 4: Final project proposal
Homework assigned
  • Read both, reflect on one:
Resources




Week 7: November 7

Human centered machine learning
algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits
Assignments due
  • Reading reflection
  • A4: Project proposal
Agenda
  • Reading reflection review
  • Algorithmic transparency, interpretability, and accountability
  • Auditing algorithms
  • In-class activity
  • Introduction to assignment 5: Final project plan
Homework assigned
Resources




Week 8: November 14

User experience and data science
algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems
Assignments due
  • Reading reflection
  • A5: Final project plan
Agenda
  • coming soon
Homework assigned
Resources




Week 9: November 21

Data science in context
Doing human centered data science in product organizations; communicating and collaborating across roles and disciplines; HCDS industry trends and trajectories
Assignments due
  • Reading reflection
Agenda
  • Filling out course evaluation
  • Week 8 in-class activity report out
  • End of quarter logistics
  • Final project presentations and reports
  • Guest lecture: Rich Caruana, Microsoft Research
  • In-class activity (InterpretML): Harsha Nori, Microsoft


Homework assigned
Resources




Week 10: November 28 (No Class Session)

Assignments due
  • Reading reflection
Homework assigned
Resources




Week 11: December 5

Final presentations
presentation of student projects, course wrap up
Assignments due
  • Reading reflection
  • A5: Final presentation
Readings assigned
  • NONE
Homework assigned
  • NONE
Resources
  • NONE




Week 12: Finals Week (No Class Session)

  • NO CLASS
  • A7: FINAL PROJECT REPORT DUE BY 5:00PM on Tuesday, December 10 via Canvas
  • LATE PROJECT SUBMISSIONS NOT ACCEPTED.