Community Data Science Workshops (Core)/Day 3 Lecture

Welcome to the Saturday lecture section of the Community Data Science Workshop Session 3! For about 140 minutes, we'll work through an end-to-end example of a Python program that answers simple questions using data from the Wikipedia API, via both a lecture and hands-on exercises.

Material for the lecture

For the lecture, you will need two files. Download both of them to your computer by right-clicking (or control-clicking) each link and choosing Save as or Save link as. Keep track of where you put the files.

Overview of the day

  • Lecture
    • Our philosophy around data analysis and visualization
    • Introduce some new programming tools!
    • We're going to walk through some analysis of edits to Harry Potter in Wikipedia, start to finish
    • We'll focus on manipulating data in Python
    • Visualizing things in Google Docs
  • Lunch (Tofu Bánh mì!)
  • Project based work
    • Data.seattle.gov
    • Your own projects!
    • Review Cafe
  • Wrap-up!

Lecture outline

Step 1: Pre-Requisites

  • My philosophy about data analysis: use the tools you have
  • Four things in Python I have to teach you now (and one more thing later):
    • while loops
      • infinite loops
      • loops with a greater than or less than
    • break / continue
    • "\t".join()
    • defining your own functions with def foo(argument): and return bar
    • The .update() method on dictionaries
  • Opening and writing to a file using open() (short sketches of these tools appear just below)
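
To make the list above concrete, here is a minimal sketch of the loop tools: a while loop bounded by a less-than test, and an intentionally infinite while True loop that uses break and continue. The variable names and prompt text are made up for illustration.

  # A while loop bounded by a less-than test: runs while count < 5.
  count = 0
  while count < 5:
      print(count)
      count = count + 1

  # An infinite loop: while True repeats forever unless we break out.
  while True:
      line = input("Type a word (or 'quit' to stop): ")
      if line == "quit":
          break      # leave the loop entirely
      if line == "":
          continue   # skip the rest of this pass and prompt again
      print("You typed:", line)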
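And a sketch of the remaining tools: "\t".join(), defining a function with def and return, updating one dictionary with another, and writing a file with open(). The filename example.tsv and all of the names here are illustrative assumptions, not taken from the lecture files.

  # "\t".join() glues a list of strings together with tab characters.
  header = "\t".join(["title", "edits"])

  # Defining our own function with def, handing back a value with return.
  def describe(name, count):
      return name + " has " + str(count) + " edits"

  print(describe("Harry Potter", 3))

  # .update() copies the keys and values of one dictionary into another.
  totals = {"Harry Potter": 3}
  totals.update({"Hermione Granger": 5})

  # Opening a file for writing with open(); "w" creates or overwrites it.
  with open("example.tsv", "w", encoding="utf-8") as f:
      f.write(header + "\n")
      for title, count in totals.items():
          f.write(title + "\t" + str(count) + "\n")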

Step 2: Walking through a Program

  • Walk-through of get_hpwp_dataset.py (a rough sketch of this kind of program appears just below)
  • Look at the dataset with more and/or in a spreadsheet
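
We won't reproduce get_hpwp_dataset.py here, but as a rough, hedged sketch of what this kind of program does, the following asks the Wikipedia API for the most recent revisions of one article and writes them out as tab-separated lines. The output filename, the column names, and the choice of 50 revisions are assumptions for illustration; the real script may differ.

  import requests

  # Ask the Wikipedia API for the 50 most recent revisions of one article.
  # rvprop controls which fields come back for each revision.
  params = {"action": "query",
            "prop": "revisions",
            "titles": "Harry Potter",
            "rvprop": "timestamp|user|flags",
            "rvlimit": 50,
            "format": "json",
            "formatversion": 2}
  response = requests.get("https://en.wikipedia.org/w/api.php", params=params)
  data = response.json()

  # Walk down the JSON structure to the revision list and write a TSV.
  page = data["query"]["pages"][0]
  with open("hpwp_example.tsv", "w", encoding="utf-8") as f:
      f.write("\t".join(["title", "user", "timestamp", "minor"]) + "\n")
      for rev in page["revisions"]:
          minor = "1" if rev.get("minor") else "0"
          f.write("\t".join([page["title"], rev["user"],
                             rev["timestamp"], minor]) + "\n")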

Step 3: Loading Data Back In

  • Load data into Python
    • review of opening files
      • we can also open them for reading with open('file', 'r', encoding="utf-8")
    • csv.DictReader()
  • Basic counting: hpwp-minor.py (see the first sketch below)
    • Answer question: What proportion of edits to Wikipedia Harry Potter articles are minor?
      • Count the number of minor edits and calculate proportion
  • Looking at time series data: hpwp-trend.py (see the second sketch below)
    • "Bin" data by day to generate the trend line
  • Exporting and visualizing data
    • Export dataset on edits over time
    • Export dataset on articles over users
    • Load data into Google Docs
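
Here is a hedged sketch of the hpwp-minor.py step: reopen the dataset for reading, parse it with csv.DictReader(), and compute the proportion of minor edits. The filename hpwp_dataset.tsv, the tab delimiter, and the column name "minor" are assumptions for illustration.

  import csv

  # csv.DictReader turns each row into a dictionary keyed by the header
  # line, so we can refer to columns by name, e.g. row["minor"].
  num_edits = 0
  num_minor = 0
  with open("hpwp_dataset.tsv", "r", encoding="utf-8") as f:
      for row in csv.DictReader(f, delimiter="\t"):
          num_edits = num_edits + 1
          if row["minor"] == "1":
              num_minor = num_minor + 1

  print("proportion of minor edits:", num_minor / num_edits)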
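And a sketch of the hpwp-trend.py idea: "bin" the revisions by day by truncating each timestamp to its date, count edits per day in a dictionary, and export the counts as a TSV that Google Docs can import. The filenames and the assumption that timestamps look like 2020-02-15T18:37:00Z are, again, illustrative.

  import csv

  # Bin edits by day: the first ten characters of an ISO timestamp
  # ("2020-02-15T18:37:00Z") are the date.
  edits_by_day = {}
  with open("hpwp_dataset.tsv", "r", encoding="utf-8") as f:
      for row in csv.DictReader(f, delimiter="\t"):
          day = row["timestamp"][0:10]
          if day not in edits_by_day:
              edits_by_day[day] = 0
          edits_by_day[day] = edits_by_day[day] + 1

  # Export the trend line as a TSV we can upload to Google Docs.
  with open("hpwp_trend.tsv", "w", encoding="utf-8") as f:
      f.write("day\tedits\n")
      for day in sorted(edits_by_day.keys()):
          f.write(day + "\t" + str(edits_by_day[day]) + "\n")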

Older Resources

  • Screencast/recording of the 2015 lecture (934MB): https://communitydata.cc/~mako/cdsw-sp2016-lecture3-20160507.ogv
    The file should be viewable in Firefox and many other browsers. If you have trouble playing it, you can download the VLC media player (https://www.videolan.org/vlc/index.html), which will be able to play it on Windows, OSX, or GNU/Linux.