Human Centered Data Science (Fall 2018)/Schedule

[[HCDS_(Fall_2018)/Day_3_plan|Day 3 plan]]


[[:File:HCDS_2018_week_3_slides.pdf|Day 3 slides]]


;Reproducibility and Accountability: ''data curation, preservation, documentation, and archiving; best practices for open scientific research''
[[HCDS_(Fall_2018)/Day_4_plan|Day 4 plan]]


[[:File:HCDS 2018 week 4 slides.pdf|Day 4 slides]]


;Interrogating datasets: ''causes and consequences of bias in data; best practices for selecting, describing, and implementing training data''
;Homework assigned
* Reading reflection
* [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A2:_Bias_in_data|A2: Bias in data]]




;Resources
* Olteanu, A., Castillo, C., Diaz, F., & Kiciman, E. (2016). ''[http://kiciman.org/wp-content/uploads/2017/08/SSRN-id2886526.pdf Social data: Biases, methodological pitfalls, and ethical boundaries].''
* Brian N. Larson. 2017. ''[http://www.ethicsinnlp.org/workshop/pdf/EthNLP04.pdf Gender as a Variable in Natural-Language Processing: Ethical Considerations].'' EthNLP, 3: 30–40.
* Bender, E. M., & Friedman, B. (2018). ''[https://openreview.net/forum?id=By4oPeX9f Data Statements for NLP: Toward Mitigating System Bias and Enabling Better Science].'' To appear in Transactions of the ACL.
* Isaac L. Johnson, Yilun Lin, Toby Jia-Jun Li, Andrew Hall, Aaron Halfaker, Johannes Schöning, and Brent Hecht. 2016. ''[http://delivery.acm.org/10.1145/2860000/2858123/p13-johnson.pdf?ip=209.166.92.236&id=2858123&acc=CHORUS&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E6D218144511F3437&__acm__=1539880715_eb477907771cea4ecaabc953094c3080 Not at Home on the Range: Peer Production and the Urban/Rural Divide].'' CHI '16. DOI: https://doi.org/10.1145/2858036.2858123
* Leo Graiden Stewart, Ahmer Arif, A. Conrad Nied, Emma S. Spiro, and Kate Starbird. 2017. ''[https://faculty.washington.edu/kstarbi/Stewart_Starbird_Drawing_the_Lines_of_Contention-final.pdf Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse].'' Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 96 (December 2017), 23 pages. DOI: https://doi.org/10.1145/3134920
* Cristian Danescu-Niculescu-Mizil, Robert West, Dan Jurafsky, Jure Leskovec, and Christopher Potts. 2013. ''[https://web.stanford.edu/~jurafsky/pubs/linguistic_change_lifecycle.pdf No country for old members: user lifecycle and linguistic change in online communities].'' In Proceedings of the 22nd international conference on World Wide Web (WWW '13). ACM, New York, NY, USA, 307-318. DOI: https://doi.org/10.1145/2488388.2488416
<br/>
<hr/>
[[HCDS_(Fall_2018)/Day_5_plan|Day 5 plan]]


[[:File:HCDS 2018 week 5 slides.pdf|Day 5 slides]]


;Introduction to mixed-methods research: ''Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing''




;Readings assigned (Read both, reflect on one)
* Donovan, J., Caplan, R., Matthews, J., & Hanson, L. (2018). ''[https://datasociety.net/wp-content/uploads/2018/04/Data_Society_Algorithmic_Accountability_Primer_FINAL.pdf Algorithmic accountability: A primer]''. Data & Society, 501(c).


;Homework assigned




;Qualitative research methods resources
* Ladner, S. (2016). ''[http://www.practicalethnography.com/ Practical ethnography: A guide to doing ethnography in the private sector]''. Routledge.
* Spradley, J. P. (2016). ''[https://www.waveland.com/browse.php?t=688 The ethnographic interview]''. Waveland Press.
* Eriksson, P., & Kovalainen, A. (2015). ''[http://study.sagepub.com/sites/default/files/Eriksson%20and%20Kovalainen.pdf Ch 12: Ethnographic Research]''. In Qualitative methods in business research: A practical guide to social research. Sage.
* Usability.gov, ''[https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html System usability scale]''.
* Nielsen, Jakob (2000). ''[https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/ Why you only need to test with five users]''. nngroup.com.
 
;Wikipedia gender gap research resources
* Hill, B. M., & Shaw, A. (2013). ''[https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0065782 The Wikipedia gender gap revisited: Characterizing survey response bias with propensity score estimation]''. PloS one, 8(6), e65782.
* Shyong (Tony) K. Lam, Anuradha Uduwage, Zhenhua Dong, Shilad Sen, David R. Musicant, Loren Terveen, and John Riedl. 2011. ''[http://files.grouplens.org/papers/wp-gender-wikisym2011.pdf WP:clubhouse?: an exploration of Wikipedia's gender imbalance.]'' In Proceedings of the 7th International Symposium on Wikis and Open Collaboration (WikiSym '11). ACM, New York, NY, USA, 1-10. DOI=http://dx.doi.org/10.1145/2038558.2038560
* Maximillian Klein. ''[http://whgi.wmflabs.org/gender-by-language.html Gender by Wikipedia Language]''. Wikidata Human Gender Indicators (WHGI), 2017.
* Wagner, C., Garcia, D., Jadidi, M., & Strohmaier, M. (2015, April). ''[https://www.aaai.org/ocs/index.php/ICWSM/ICWSM15/paper/viewFile/10585/10528 It's a Man's Wikipedia? Assessing Gender Inequality in an Online Encyclopedia]''. In ICWSM (pp. 454-463).
* Benjamin Collier and Julia Bear. ''[https://static1.squarespace.com/static/521c8817e4b0dca2590b4591/t/523745abe4b05150ff027a6e/1379354027662/2012+-+Collier%2C+Bear+-+Conflict%2C+confidence%2C+or+criticism+an+empirical+examination+of+the+gender+gap+in+Wikipedia.pdf Conflict, criticism, or confidence: an empirical examination of the gender gap in wikipedia contributions]''. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work (CSCW '12). DOI: https://doi.org/10.1145/2145204.2145265
* Christina Shane-Simpson, Kristen Gillespie-Lynch, Examining potential mechanisms underlying the Wikipedia gender gap through a collaborative editing task, In Computers in Human Behavior, Volume 66, 2017, https://doi.org/10.1016/j.chb.2016.09.043. (PDF on Canvas)
* Amanda Menking and Ingrid Erickson. 2015. ''[https://upload.wikimedia.org/wikipedia/commons/7/77/The_Heart_Work_of_Wikipedia_Gendered,_Emotional_Labor_in_the_World%27s_Largest_Online_Encyclopedia.pdf The Heart Work of Wikipedia: Gendered, Emotional Labor in the World's Largest Online Encyclopedia]''. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). https://doi.org/10.1145/2702123.2702514  
;Crowdwork research resources
* WeAreDynamo contributors. ''[http://wiki.wearedynamo.org/index.php?title=Basics_of_how_to_be_a_good_requester How to be a good requester]'' and ''[http://wiki.wearedynamo.org/index.php?title=Guidelines_for_Academic_Requesters Guidelines for Academic Requesters]''. Wearedynamo.org.




[[HCDS_(Fall_2018)/Day_6_plan|Day 6 plan]]


[[:File:HCDS 2018 week 6 slides.pdf|Day 6 slides]]


;Interrogating algorithms: ''algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits''
;Assignments due
* Reading reflection
* [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A2:_Bias_in_data|A2: Bias in data]]


;Agenda


;Readings assigned
* Astrid Mager. 2012. ''[https://computingeverywhere.soc.northwestern.edu/wp-content/uploads/2017/07/Mager-Algorithmic-Ideology-Required.pdf Algorithmic ideology: How capitalist society shapes search engines]''. Information, Communication & Society 15, 5: 769–787. http://doi.org/10.1080/1369118X.2012.676056
 






;Resources
* Christian Sandvig, Kevin Hamilton, Karrie Karahalios, Cedric Langbort (2014/05/22) ''[http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms].'' Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
* Shahriari, K., & Shahriari, M. (2017). ''[https://ethicsinaction.ieee.org/ IEEE standard review - Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems].'' Institute of Electrical and Electronics Engineers
* ACM US Policy Council ''[https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf Statement on Algorithmic Transparency and Accountability].'' January 2017.
* ''[https://futureoflife.org/ai-principles/ Asilomar AI Principles].'' Future of Life Institute, 2017.
* Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). ''[http://www.fatml.org/resources/principles-for-accountable-algorithms Principles for Accountable Algorithms and a Social Impact Statement for Algorithms].'' Fatml.Org 2018.
* Friedman, B., & Nissenbaum, H. (1996). ''[https://www.vsdesign.org/publications/pdf/64_friedman.pdf Bias in Computer Systems]''. ACM Trans. Inf. Syst., 14(3), 330–347.
* Diakopoulos, N. (2014). Algorithmic accountability reporting: On the investigation of black boxes. Tow Center for Digital Journalism, 1–33. https://doi.org/10.1002/ejoc.201200111
* Nate Matias, 2017. ''[https://medium.com/@natematias/how-anyone-can-audit-facebooks-newsfeed-b879c3e29015 How Anyone Can Audit Facebook's Newsfeed].'' Medium.com
* Hill, Kashmir. ''[https://gizmodo.com/facebook-figured-out-my-family-secrets-and-it-wont-tel-1797696163 Facebook figured out my family secrets, and it won't tell me how].'' Gizmodo, 2017.
* Blue, Violet. ''[https://www.engadget.com/2017/09/01/google-perspective-comment-ranking-system/ Google’s comment-ranking system will be a hit with the alt-right].'' Engadget, 2017.
=== Week 7: November 8 ===
[[HCDS_(Fall_2018)/Day_7_plan|Day 7 plan]]
[[:File:HCDS 2018 week 7 slides.pdf|Day 7 slides]]


;Critical approaches to data science: ''power, data, and society; ethics of crowdwork''
[[HCDS_(Fall_2018)/Day_8_plan|Day 8 plan]]


[[:File:HCDS 2018 week 8 slides.pdf|Day 8 slides]]


;Human-centered algorithm design: ''algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems''


;Resources
* Ethical OS ''[https://ethicalos.org/wp-content/uploads/2018/08/Ethical-OS-Toolkit-2.pdf Toolkit]'' and ''[https://ethicalos.org/wp-content/uploads/2018/08/EthicalOS_Check-List_080618.pdf Risk Mitigation Checklist]''. EthicalOS.org.
* Morgan, J. 2016. ''[https://meta.wikimedia.org/wiki/Research:Evaluating_RelatedArticles_recommendations Evaluating Related Articles recommendations]''. Wikimedia Research.
* Morgan, J. 2017. ''[https://meta.wikimedia.org/wiki/Research:Comparing_most_read_and_trending_edits_for_Top_Articles_feature Comparing most read and trending edits for the top articles feature]''. Wikimedia Research.
* Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. ''[https://md.ekstrandom.net/research/pubs/listcmp/listcmp.pdf User perception of differences in recommender algorithms].'' In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14).
* Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. ''[http://files.grouplens.org/papers/mcnee-chi06-hri.pdf Making recommendations better: an analytic model for human-recommender interaction].'' In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06).
[[HCDS_(Fall_2018)/Day_10_plan|Day 10 plan]]


[[:File:HCDS 2018 week 10 slides.pdf|Day 10 slides]]


;User experience and big data: ''Design considerations for machine learning applications; human centered data visualization; data storytelling''


;Readings assigned
* NONE
 


;Homework assigned
* A5: Final presentation


;Resources
* Jess Holbrook. ''[https://medium.com/google-design/human-centered-machine-learning-a770d10562cd Human Centered Machine Learning].'' Google Design Blog. 2017.
* Bart P. Knijnenburg, Martijn C. Willemsen, Zeno Gantner, Hakan Soncu, and Chris Newell. 2012. ''[https://pure.tue.nl/ws/files/3484177/724656348730405.pdf Explaining the user experience of recommender systems].'' User Modeling and User-Adapted Interaction 22, 4-5 (October 2012), 441-504. DOI=http://dx.doi.org/10.1007/s11257-011-9118-4
* Patrick Austin, ''[https://gizmodo.com/facebook-google-and-microsoft-use-design-to-trick-you-1827168534 Facebook, Google, and Microsoft Use Design to Trick You Into Handing Over Your Data, New Report Warns].'' Gizmodo, 6/18/2018
* Brown, A., Tuor, A., Hutchinson, B., & Nichols, N. (2018). ''[https://arxiv.org/abs/1803.04967 Recurrent Neural Network Attention Mechanisms for Interpretable System Log Anomaly Detection].'' arXiv preprint arXiv:1803.04967.
* Cremonesi, P., Elahi, M., & Garzotto, F. (2017). ''[https://core.ac.uk/download/pdf/74313597.pdf User interface patterns in recommendation-empowered content intensive multimedia applications].'' Multimedia Tools and Applications, 76(4), 5275-5309.
* Marilynn Larkin, ''[https://www.elsevier.com/connect/how-to-give-a-dynamic-scientific-presentation How to give a dynamic scientific presentation].'' Elsevier Connect, 2015.
* Megan Risdal, ''[http://blog.kaggle.com/2016/06/29/communicating-data-science-a-guide-to-presenting-your-work/ Communicating data science: a guide to presenting your work].'' Kaggle blog, 2016.
* Megan Risdal, ''[http://blog.kaggle.com/2016/08/10/communicating-data-science-why-and-some-of-the-how-to-visualize-information/ Communicating data science: Why and how to visualize information].'' Kaggle blog, 2016.
* Megan Risdal, ''[http://blog.kaggle.com/2016/06/13/communicating-data-science-an-interview-with-a-storytelling-expert-tyler-byers/ Communicating data science: an interview with a storytelling expert].'' Kaggle blog, 2016.
* Richard Garber, ''[https://joyfulpublicspeaking.blogspot.com/2010/08/power-of-brief-speeches-world-war-i-and.html Power of brief speeches: World War I and the Four Minute Men].'' Joyful Public Speaking, 2010.
* Brent Dykes, ''[https://www.forbes.com/sites/brentdykes/2016/03/31/data-storytelling-the-essential-data-science-skill-everyone-needs/ Data Storytelling: The Essential Data Science Skill Everyone Needs].'' Forbes, 2016.
* Xavier Amatriain and Justin Basilico. ''[https://medium.com/netflix-techblog/netflix-recommendations-beyond-the-5-stars-part-1-55838468f429 Netflix Recommendations: Beyond the 5 stars].'' Netflix Tech Blog, 2012.
* Fabien Girardin. ''[https://medium.com/@girardin/experience-design-in-the-machine-learning-era-e16c87f4f2e2 Experience design in the machine learning era].'' Medium, 2016.
* Chen, N., Brooks, M., Kocielnik, R., Hong, R., Smith, J., Lin, S., Qu, Z., Aragon, C. ''[https://aisel.aisnet.org/cgi/viewcontent.cgi?article=1254&context=hicss-50 Lariat: A visual analytics tool for social media researchers to explore Twitter datasets].'' Proceedings of the 50th Hawaii International Conference on System Sciences (HICSS), Data Analytics and Data Mining for Social Media Minitrack (2017)
<br/>
<hr/>


;Assignments due
* A5: Final presentation




;Homework assigned
* A6: Final project report (due 12/9 by 11:59pm)


;Resources

Latest revision as of 21:27, 19 December 2018

This page is a work in progress.


Week 1: September 27

Day 1 plan

Day 1 slides

Introduction to Human Centered Data Science
What is data science? What is human centered? What is human centered data science?
Assignments due
Agenda
  • Syllabus review
  • Pre-course survey results
  • What do we mean by data science?
  • What do we mean by human centered?
  • How does human centered design relate to data science?
  • Looking ahead: Week 2 assignments and topics


Readings assigned
Homework assigned
  • Reading reflection
Resources




Week 2: October 4

Day 2 plan


Ethical considerations
privacy, informed consent and user treatment


Assignments due
  • Week 1 reading reflection
Agenda
  • Intro to assignment 1: Data Curation
  • A brief history of research ethics
  • Guest lecture: Javier Salido and Mark van Hollebeke, "A Practitioners View of Privacy & Data Protection"
  • Guest lecture: Javier Salido, "Differential Privacy"
  • Contextual Integrity in data science
  • Week 2 reading reflection


Readings assigned


Homework assigned
Resources




Week 3: October 11

Day 3 plan

Day 3 slides

Reproducibility and Accountability
data curation, preservation, documentation, and archiving; best practices for open scientific research
Assignments due
  • Week 2 reading reflection
Agenda
  • Six Provocations for Big Data: Review & Reflections
  • A primer on copyright, licensing, and hosting for code and data
  • Introduction to replicability, reproducibility, and open research
  • Reproducibility case study: fivethirtyeight.com
  • Group activity: assessing reproducibility in data journalism
  • Overview of Assignment 1: Data curation


Readings assigned
Homework assigned
  • Reading reflection
Resources


Assignment 1 Data curation resources





Week 4: October 18

Day 4 plan

Day 4 slides

Interrogating datasets
causes and consequences of bias in data; best practices for selecting, describing, and implementing training data


Assignments due
Agenda
  • Final project: Goal, timeline, and deliverables.
  • Overview of assignment 2: Bias in data
  • Reading reflections review
  • Sources of bias in datasets
  • Introduction to assignment 2: Bias in data
  • Sources of bias in data collection and processing
  • In-class exercise: assessing bias in training data


Readings assigned (Read both, reflect on one)
Homework assigned


Resources




Week 5: October 25

Day 5 plan

Day 5 slides

Introduction to mixed-methods research
Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing


Assignments due
  • Reading reflection


Agenda
  • Assignment 1 review & reflection
  • Week 4 reading reflection discussion
  • Survey of qualitative research methods
  • Mixed-methods case study #1: The Wikipedia Gender Gap: causes & consequences
  • In-class activity: Automated Gender Recognition scenarios
  • Introduction to ethnography
  • Ethnographic research case study: Structured data on Wikimedia Commons
  • Introduction to crowdwork
  • Overview of Assignment 3: Crowdwork ethnography


Readings assigned (Read both, reflect on one)


Homework assigned


Qualitative research methods resources
Wikipedia gender gap research resources
Crowdwork research resources





Week 6: November 1

Day 6 plan

Day 6 slides

Interrogating algorithms
algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits
Assignments due
Agenda
  • Reading reflections
  • Ethical implications of crowdwork
  • Algorithmic transparency, interpretability, and accountability
  • Auditing algorithms
  • In-class activity: auditing the Perspective API


Readings assigned


Homework assigned
  • Reading reflection


Resources





Week 7: November 8

Day 7 plan

Day 7 slides

Critical approaches to data science
power, data, and society; ethics of crowdwork


Assignments due
  • Reading reflection
  • A3: Crowdwork ethnography


Agenda
  • Guest lecture: Rochelle LaPlante


Readings assigned (read both, reflect on one)
Homework assigned


Resources





Week 8: November 15

Day 8 plan

Day 8 slides

Human-centered algorithm design
algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems


Assignments due
  • Reading reflection


Agenda
  • Final project overview & examples
  • Guest Lecture: Kelly Franznick, Blink UX
  • Reading reflections
  • Human-centered algorithm design
  • design process
  • user-driven evaluation
  • design patterns & anti-patterns


Readings assigned
Homework assigned
  • Reading reflection
Resources





Week 9: November 22 (No Class Session)

Day 9 plan

Data science for social good
Community-based and participatory approaches to data science; Using data science for society's benefit
Assignments due
  • Reading reflection
  • A4: Final project plan
Agenda


Readings assigned
Homework assigned
  • Reading reflection
Resources





Week 10: November 29

Day 10 plan

Day 10 slides

User experience and big data
Design considerations for machine learning applications; human centered data visualization; data storytelling


Assignments due
  • Reading reflection


Agenda
  • Reading reflections discussion
  • Feedback on Final Project Plans
  • Guest lecture: Steven Drucker (Microsoft Research)
  • UI patterns & UX considerations for ML/data-driven applications
  • Final project presentation: what to expect
  • In-class activity: final project peer review


Readings assigned
  • NONE
Homework assigned
  • A5: Final presentation
Resources





Week 11: December 6

Day 11 plan

Final presentations
course wrap up, presentation of student projects


Assignments due
  • A5: Final presentation


Agenda
  • Student final presentations
  • Course wrap-up


Readings assigned
  • none!
Homework assigned
  • A6: Final project report (due 12/9 by 11:59pm)
Resources
  • none




Week 12: Finals Week (No Class Session)

  • NO CLASS
  • A6: FINAL PROJECT REPORT DUE BY 11:59PM on Sunday, December 9
  • LATE PROJECT SUBMISSIONS NOT ACCEPTED.