User:Groceryheist/drafts/Data Science Syllabus: Difference between revisions

From CommunityData
;Homework assigned
* Reading reflection
<!-- * [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A3:_Crowdwork_ethnography|A3: Crowdwork ethnography]] -->




<hr/>
<br/>


=== Week 6: November 1 ===
[[:File:HCDS 2018 week 6 slides.pdf|Day 6 slides]]


;Algorithms: ''algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits''


;Assignments due
* Reading reflection


;Resources
* Christian Sandvig, Kevin Hamilton, Karrie Karahalios, Cedric Langbort (2014). ''[http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms].'' Paper presented to "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
* Shahriari, K., & Shahriari, M. (2017). ''[https://ethicsinaction.ieee.org/ IEEE standard review - Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems].'' Institute of Electrical and Electronics Engineers.
* ACM US Policy Council. ''[https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf Statement on Algorithmic Transparency and Accountability].'' January 2017.
* ''[https://futureoflife.org/ai-principles/ Asilomar AI Principles].'' Future of Life Institute, 2017.
* Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). ''[http://www.fatml.org/resources/principles-for-accountable-algorithms Principles for Accountable Algorithms and a Social Impact Statement for Algorithms].'' FATML.org, 2018.
* Friedman, B., & Nissenbaum, H. (1996). ''[https://www.vsdesign.org/publications/pdf/64_friedman.pdf Bias in Computer Systems].'' ACM Trans. Inf. Syst., 14(3), 330–347.
* Diakopoulos, N. (2014). ''Algorithmic Accountability Reporting: On the Investigation of Black Boxes.'' Tow Center for Digital Journalism, 1–33.
* Nate Matias, 2017. ''[https://medium.com/@natematias/how-anyone-can-audit-facebooks-newsfeed-b879c3e29015 How Anyone Can Audit Facebook's Newsfeed].'' Medium.com.
* Hill, Kashmir. ''[https://gizmodo.com/facebook-figured-out-my-family-secrets-and-it-wont-tel-1797696163 Facebook figured out my family secrets, and it won't tell me how].'' Gizmodo, 2017.
* Blue, Violet. ''[https://www.engadget.com/2017/09/01/google-perspective-comment-ranking-system/ Google's comment-ranking system will be a hit with the alt-right].'' Engadget, 2017.
* Ingold, David and Soper, Spencer. ''[https://www.bloomberg.com/graphics/2016-amazon-same-day/ Amazon Doesn't Consider the Race of Its Customers. Should It?].'' Bloomberg, 2016.
* Paul Lamere. ''[https://musicmachinery.com/2011/05/14/how-good-is-googles-instant-mix/ How good is Google's Instant Mix?].'' Music Machinery, 2011.
* Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner. ''[https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Machine Bias: Risk Assessments in Criminal Sentencing].'' ProPublica, May 2016.
* [https://www.perspectiveapi.com/#/ Google's Perspective API]




<br/>


=== Week 7: November 8 ===
[[HCDS_(Fall_2018)/Day_7_plan|Day 7 plan]]
[[:File:HCDS 2018 week 7 slides.pdf|Day 7 slides]]

;Critical approaches to data science: ''power, data, and society; ethics of crowdwork''

;Assignments due
* Reading reflection
* [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A3:_Crowdwork_ethnography|A3: Crowdwork ethnography]]

<!-- ;Agenda -->
<!-- {{:HCDS (Fall 2018)/Day 7 plan}} -->

;Readings assigned (read both, reflect on one)
* Read: Baumer, E. P. S. (2017). ''[http://journals.sagepub.com/doi/pdf/10.1177/2053951717718854 Toward human-centered algorithm design].'' Big Data & Society.
* Read: Amershi, S., Cakmak, M., Knox, W. B., & Kulesza, T. (2014). ''[http://www.aaai.org/ojs/index.php/aimagazine/article/download/2513/2456 Power to the People: The Role of Humans in Interactive Machine Learning].'' AI Magazine, 35(4), 105.

;Homework assigned
* Reading reflection
* [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A4:_Final_project_plan|A4: Final project plan]]

<!-- ;Resources -->
<!-- * Bivens, R. and Haimson, O.L. 2016. ''[http://journals.sagepub.com/doi/pdf/10.1177/2056305116672486 Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers]''. Social Media + Society. 2, 4 (2016), 205630511667248. DOI:https://doi.org/10.1177/2056305116672486. -->
<!-- * Schlesinger, A. et al. 2017. ''[http://arischlesinger.com/wp-content/uploads/2017/03/chi2017-schlesinger-intersectionality.pdf Intersectional HCI: Engaging Identity through Gender, Race, and Class].'' Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI ’17. (2017), 5412–5427. DOI:https://doi.org/10.1145/3025453.3025766. -->
<br/>
<hr/>
<br/>
=== Week 8: November 15 ===
[[HCDS_(Fall_2018)/Day_8_plan|Day 8 plan]]
[[:File:HCDS 2018 week 8 slides.pdf|Day 8 slides]]
;Human-centered algorithm design: ''algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems''
;Assignments due
* Reading reflection
<!-- ;Agenda -->
<!-- {{:HCDS (Fall 2018)/Day 8 plan}} -->
;Readings assigned
* Hill, B. M., Dailey, D., Guy, R. T., Lewis, B., Matsuzaki, M., & Morgan, J. T. (2017). ''[https://mako.cc/academic/hill_etal-cdsw_chapter-DRAFT.pdf Democratizing Data Science: The Community Data Science Workshops and Classes].'' In N. Jullien, S. A. Matei, & S. P. Goggins (Eds.), Big Data Factories: Scientific Collaborative approaches for virtual community data collection, repurposing, recombining, and dissemination.
;Homework assigned
* Reading reflection
;Resources
* Ethical OS ''[https://ethicalos.org/wp-content/uploads/2018/08/Ethical-OS-Toolkit-2.pdf Toolkit]'' and ''[https://ethicalos.org/wp-content/uploads/2018/08/EthicalOS_Check-List_080618.pdf Risk Mitigation Checklist]''. EthicalOS.org.
* Morgan, J. 2016. ''[https://meta.wikimedia.org/wiki/Research:Evaluating_RelatedArticles_recommendations Evaluating Related Articles recommendations]''. Wikimedia Research.
* Morgan, J. 2017. ''[https://meta.wikimedia.org/wiki/Research:Comparing_most_read_and_trending_edits_for_Top_Articles_feature Comparing most read and trending edits for the top articles feature]''. Wikimedia Research.
* Michael D. Ekstrand, F. Maxwell Harper, Martijn C. Willemsen, and Joseph A. Konstan. 2014. ''[https://md.ekstrandom.net/research/pubs/listcmp/listcmp.pdf User perception of differences in recommender algorithms].'' In Proceedings of the 8th ACM Conference on Recommender systems (RecSys '14).
* Sean M. McNee, John Riedl, and Joseph A. Konstan. 2006. ''[http://files.grouplens.org/papers/mcnee-chi06-hri.pdf Making recommendations better: an analytic model for human-recommender interaction].'' In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06).
* Sean M. McNee, Nishikant Kapoor, and Joseph A. Konstan. 2006. ''[http://files.grouplens.org/papers/p171-mcnee.pdf Don't look stupid: avoiding pitfalls when recommending research papers].'' In Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work (CSCW '06).
* Michael D. Ekstrand and Martijn C. Willemsen. 2016. ''[https://md.ekstrandom.net/research/pubs/behaviorism/BehaviorismIsNotEnough.pdf Behaviorism is Not Enough: Better Recommendations through Listening to Users].'' In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys '16).
* Jess Holbrook. ''[https://medium.com/google-design/human-centered-machine-learning-a770d10562cd Human Centered Machine Learning].'' Google Design Blog. 2017.
* Anderson, Carl. ''[https://medium.com/@leapingllamas/the-role-of-model-interpretability-in-data-science-703918f64330 The role of model interpretability in data science].'' Medium, 2016.





Revision as of 03:15, 9 February 2019

Data Science and Organizational Communication
Principal instructor
Nate TeBlunthuis
Course Catalog Description
Fundamental principles of data science and its implications, including research ethics; data privacy; legal frameworks; algorithmic bias, transparency, fairness and accountability; data provenance, curation, preservation, and reproducibility; human computation; data communication and visualization; the role of data science in organizational context and the societal impacts of data science.

Course Description

The rise of "data science" reflects a broad and ongoing shift in how many teams, organizational leaders, communities of practice, and entire industries create and use knowledge. This class teaches "data science" as practiced by data-intensive knowledge workers, but also as it is positioned in historical, organizational, institutional, and societal contexts. Students will gain an appreciation for the technical and intellectual aspects of data science, consider critical questions about how data science is often practiced, and envision ethical and effective data science practice in their current and future organizational roles. The format of the class will be a mix of lecture, discussion, in-class activities, and qualitative and quantitative research assignments.

The course is designed around two high-stakes projects. In the first stage of the course, students will attend the Community Data Science Workshop (CDSW), a three-week intensive workshop on basic programming and data analysis skills that I help organize and teach. The first course project is to apply these skills, together with the conceptual material covered in the course so far, to conduct an original data analysis on a topic of the student's interest. The second high-stakes project is a critical analysis of an organization or work team. For this project, students will serve as consultants to an organizational unit involved in data science. Through interviews and workplace observations, they will gain an understanding of the socio-technical and organizational context of their team. They will then synthesize this understanding with the knowledge gained from the course material to compose a report offering actionable insights to their team.

Learning Objectives

By the end of this course, students will be able to:

  • Analyze large and complex data effectively and ethically, with an understanding of human, societal, organizational, and socio-technical contexts.
  • Take into account the ethical, social, organizational, and legal considerations of data science in organizational and institutional contexts.
  • Combine quantitative and qualitative data to generate critical insights into human behavior.
  • Discuss and evaluate the ethical, social, organizational, and legal trade-offs of different data analysis, testing, curation, and sharing methods.


Schedule

Course schedule (click to expand)

This page is a work in progress.





Week 1:

Introduction to Human Centered Data Science
What is data science? What is human centered? What is human centered data science?
Assignments due


Readings assigned
Homework assigned
  • Reading reflection
  • Attend week 2 of CDSW





Week 2:

Ethical considerations
privacy, informed consent and user treatment
Assignments due
  • Week 1 reading reflection


Readings assigned
Homework assigned





Week 3:

Reproducibility and Accountability
data curation, preservation, documentation, and archiving; best practices for open scientific research
Assignments due
  • Week 2 reading reflection
  • Attend week 2 of CDSW


Readings assigned
Homework assigned
  • Reading reflection
  • Attend week 3 of CDSW







Week 4: October 18

Interrogating datasets
causes and consequences of bias in data; best practices for selecting, describing, and implementing training data


Assignments due
  • Reading reflection


Readings assigned (Read both, reflect on one)
  • Barley, S. R. (1986). Technology as an occasion for structuring: evidence from observations of ct scanners and the social order of radiology departments. Administrative Science Quarterly, 31(1), 78–108.
  • Orlikowski, W. J., & Barley, S. R. (2001). Technology and institutions: what can research on information technology and research on organizations learn from each other? MIS Q., 25(2), 145–165. https://doi.org/10.2307/3250927
Homework assigned


Resources





Week 5:

Technology and Organizing
Assignments due


Readings assigned
  • Passi, S., & Jackson, S. J. (2018). Trust in Data Science: Collaboration, Translation, and Accountability in Corporate Data Science Projects. Proc. ACM Hum.-Comput. Interact., 2(CSCW), 136:1–136:28. https://doi.org/10.1145/3274405
Homework Assigned




Week 6:

Data science in Organizational Contexts
Assignments due
Readings assigned (Read both, reflect on one)




Week 7: October 25

Day 5 plan

Day 5 slides

Introduction to mixed-methods research
Big data vs thick data; integrating qualitative research methods into data science practice; crowdsourcing


Assignments due
  • Reading reflection


Readings assigned (Read both, reflect on one)


Homework assigned
  • Reading reflection








Week 8: November 15

Day 8 plan

Day 8 slides

Human-centered algorithm design
algorithmic interpretability; human-centered methods for designing and evaluating algorithmic systems
Assignments due
  • Reading reflection



Week 6: November 1

Day 6 plan

Day 6 slides

Algorithms
algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits
Assignments due
  • Reading reflection


Readings assigned
Homework assigned
  • Reading reflection




Week 9: November 22 (No Class Session)

Day 9 plan

Data science for social good
Community-based and participatory approaches to data science; Using data science for society's benefit
Assignments due
  • Reading reflection
  • A4: Final project plan


Readings assigned
Homework assigned
  • Reading reflection
Resources






Week 10: November 29

Day 10 plan

Day 10 slides

User experience and big data
Design considerations for machine learning applications; human centered data visualization; data storytelling
Assignments due
  • Reading reflection
Agenda
  • Reading reflections discussion
  • Feedback on Final Project Plans
  • Guest lecture: Steven Drucker (Microsoft Research)
  • UI patterns & UX considerations for ML/data-driven applications
  • Final project presentation: what to expect
  • In-class activity: final project peer review


Readings assigned
  • NONE
Homework assigned
  • A5: Final presentation
Resources





Week 11: December 6

Day 11 plan

Final presentations
course wrap up, presentation of student projects


Assignments due
  • A5: Final presentation


Agenda
  • Student final presentations
  • Course wrap-up


Readings assigned
  • none!
Homework assigned
  • A6: Final project report (by 11:59pm)
Resources
  • none




Week 12: Finals Week (No Class Session)

  • NO CLASS
  • A6: FINAL PROJECT REPORT DUE BY 11:59PM