=== Week 6: November 1 ===
[[HCDS_(Fall_2018)/Day_6_plan|Day 6 plan]]

[[:File:HCDS 2018 week 6 slides.pdf|Day 6 slides]]

;Interrogating algorithms: ''algorithmic fairness, transparency, and accountability; methods and contexts for algorithmic audits''

;Assignments due
* Reading reflection
* [[Human_Centered_Data_Science_(Fall_2018)/Assignments#A2:_Bias_in_data|A2: Bias in data]]

;Agenda
{{:HCDS (Fall 2018)/Day 6 plan}}

;Readings assigned
* Astrid Mager. 2012. ''[https://computingeverywhere.soc.northwestern.edu/wp-content/uploads/2017/07/Mager-Algorithmic-Ideology-Required.pdf Algorithmic ideology: How capitalist society shapes search engines]''. Information, Communication & Society 15, 5: 769–787. http://doi.org/10.1080/1369118X.2012.676056

;Homework assigned
* Reading reflection

;Resources
* Christian Sandvig, Kevin Hamilton, Karrie Karahalios, and Cedric Langbort. 2014. ''[http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20--%20Sandvig%20--%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms]''. Paper presented at "Data and Discrimination: Converting Critical Concerns into Productive Inquiry," a preconference at the 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA, USA.
* Shahriari, K., & Shahriari, M. (2017). ''[https://ethicsinaction.ieee.org/ IEEE standard review - Ethically aligned design: A vision for prioritizing human wellbeing with artificial intelligence and autonomous systems]''. Institute of Electrical and Electronics Engineers.
* ACM US Public Policy Council. ''[https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf Statement on Algorithmic Transparency and Accountability]''. January 2017.
* ''[https://futureoflife.org/ai-principles/ Asilomar AI Principles]''. Future of Life Institute, 2017.
* Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B., … Zevenbergen, B. (2018). ''[http://www.fatml.org/resources/principles-for-accountable-algorithms Principles for Accountable Algorithms and a Social Impact Statement for Algorithms]''. fatml.org, 2018.
* Friedman, B., & Nissenbaum, H. (1996). ''[https://www.vsdesign.org/publications/pdf/64_friedman.pdf Bias in Computer Systems]''. ACM Transactions on Information Systems 14(3): 330–347.
* Diakopoulos, N. (2014). ''Algorithmic Accountability Reporting: On the Investigation of Black Boxes''. Tow Center for Digital Journalism, 1–33.
* Nate Matias. 2017. ''[https://medium.com/@natematias/how-anyone-can-audit-facebooks-newsfeed-b879c3e29015 How Anyone Can Audit Facebook's Newsfeed]''. Medium.com.
* Hill, Kashmir. ''[https://gizmodo.com/facebook-figured-out-my-family-secrets-and-it-wont-tel-1797696163 Facebook figured out my family secrets, and it won't tell me how]''. Gizmodo, 2017.
* Blue, Violet. ''[https://www.engadget.com/2017/09/01/google-perspective-comment-ranking-system/ Google’s comment-ranking system will be a hit with the alt-right]''. Engadget, 2017.
* Ingold, David and Soper, Spencer. ''[https://www.bloomberg.com/graphics/2016-amazon-same-day/ Amazon Doesn’t Consider the Race of Its Customers. Should It?]''. Bloomberg, 2016.
* Paul Lamere. ''[https://musicmachinery.com/2011/05/14/how-good-is-googles-instant-mix/ How good is Google's Instant Mix?]''. Music Machinery, 2011.
* Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. ''[https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Machine Bias: Risk Assessments in Criminal Sentencing]''. ProPublica, May 2016.
* [https://www.perspectiveapi.com/#/ Google's Perspective API]

<br/>
<hr/>
<br/>