Community Data Science Workshops (Spring 2015)/Day 3 Lecture

Material for the lecture

For the lecture, you will need two files. Download both of them to your computer by right-clicking (or control-clicking) each link and choosing Save as or Save link as. Keep track of where you put the files.

Overview of the day

  • Lecture
    • Our philosophy around data visualization
    • Introduce some new programming tools!
    • We're going to walk through an analysis of edits to the Harry Potter articles on Wikipedia, start to finish
    • We'll focus on manipulating data in Python
    • Visualizing things in Google Docs
  • Lunch (vegetarian Greek!)
  • Project-based work
    • Continue the Harry Potter on Wikipedia project (or your own topic), doing analysis using Google Docs
    • Matplotlib
    • Civic Data - more interactive work on projects
    • Room for you to work on your projects!
  • Wrap-up!

Lecture outline

Step 1: Pre-Requisites

  • My philosophy about data analysis: use the tools you have
  • Four things in Python I have to teach you (see the sketch just after this list):
    • while loops
      • infinite loops
      • loops with a greater-than or less-than condition
    • break / continue
    • "\t".join()
    • defining your own functions with def foo(argument):
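
A minimal sketch of these four constructs, using made-up values rather than the workshop dataset:

    # a while loop runs until its condition is false;
    # "while True:" with no break would be an infinite loop
    def count_up_to(limit):
        total = 0
        n = 1
        while n <= limit:      # loop with a less-than-or-equal condition
            total = total + n
            n = n + 1
        return total

    # break stops a loop early; continue skips to the next pass
    positives = []
    for number in [3, 0, 7, -1, 5]:
        if number < 0:
            break              # stop at the first negative number
        if number == 0:
            continue           # skip zeros but keep looping
        positives.append(number)

    # "\t".join() glues a list of strings together with tabs between them
    header = "\t".join(["title", "user", "minor"])

    print(count_up_to(10))     # 55
    print(positives)           # [3, 7]
    print(header)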

Step 2: Walking through a Program

  • Walk-through of get_hpwp_dataset.py (an illustrative sketch follows this outline)
  • Look at the dataset with more and/or in a spreadsheet
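
The real get_hpwp_dataset.py is not reproduced here. As a rough illustration of the shape such a script takes, the sketch below pulls revision metadata for a single article from the Wikipedia API (using the requests library) and prints tab-separated rows; the article title, column names, and revision limit are placeholder choices, not the workshop's exact script.

    import requests

    # fetch recent revision metadata for one article from the Wikipedia API
    url = "https://en.wikipedia.org/w/api.php"
    parameters = {"action": "query",
                  "prop": "revisions",
                  "titles": "Harry Potter",
                  "rvprop": "user|timestamp|flags",
                  "rvlimit": 50,
                  "format": "json",
                  "formatversion": 2}

    response = requests.get(url, params=parameters)
    data = response.json()

    # print one tab-separated row per revision
    print("\t".join(["title", "user", "timestamp", "minor"]))
    for page in data["query"]["pages"]:
        for revision in page["revisions"]:
            minor = "1" if revision.get("minor") else "0"
            print("\t".join([page["title"], revision["user"],
                             revision["timestamp"], minor]))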

Step 3: Loading Data Back In

  • Load data into Python
    • review of opening files
      • we can also open them for reading with open('file', 'r', encoding='utf-8')
    • csv.DictReader()
  • Basic counting: hpwp-minor.py (see the counting sketch after this outline)
    • Answer the question: what proportion of edits to Wikipedia's Harry Potter articles are minor?
      • Count the number of minor edits and calculate proportion
  • Looking at time series data: hpwp-trend.py (see the trend and export sketch after this outline)
    • "Bin" data by day to generate the trend line
  • Exporting and visualizing data
    • Export dataset on edits over time
    • Export dataset on articles over users
    • Load data into Google Docs
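
As a hedged sketch of the kind of counting hpwp-minor.py does: the file name hpwp_edits.tsv and its column names (title, user, timestamp, minor) are assumptions about the dataset layout, not the workshop's exact files.

    import csv

    # read the tab-separated dataset back in; each row becomes a dictionary
    # keyed by the column names in the header line
    with open("hpwp_edits.tsv", "r", encoding="utf-8") as input_file:
        rows = list(csv.DictReader(input_file, delimiter="\t"))

    # count minor edits and compute the proportion
    total_edits = len(rows)
    minor_edits = 0
    for row in rows:
        if row["minor"] == "1":
            minor_edits = minor_edits + 1

    print(minor_edits / total_edits)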
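
And a sketch of the binning and export steps, in the spirit of hpwp-trend.py, again assuming the hypothetical hpwp_edits.tsv layout above with ISO timestamps like 2015-04-18T20:13:41Z. The output file hpwp_edits_by_day.tsv can then be uploaded to Google Docs for charting.

    import csv

    # bin edits by day: the first ten characters of an ISO timestamp
    # (YYYY-MM-DD) identify the day
    edits_by_day = {}
    with open("hpwp_edits.tsv", "r", encoding="utf-8") as input_file:
        for row in csv.DictReader(input_file, delimiter="\t"):
            day = row["timestamp"][0:10]
            edits_by_day[day] = edits_by_day.get(day, 0) + 1

    # export the daily counts as a tab-separated file for Google Docs
    with open("hpwp_edits_by_day.tsv", "w", encoding="utf-8", newline="") as output_file:
        writer = csv.writer(output_file, delimiter="\t")
        writer.writerow(["day", "edits"])
        for day in sorted(edits_by_day):
            writer.writerow([day, edits_by_day[day]])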