Community Data Science Workshops (Fall 2015)/Day 3 Lecture
Material for the lecture
For the lecture, you will need two files. Download both of these to your computer: right-click (or control-click) each link and choose "Save as" or "Save link as". Keep track of where you put the files.
- http://mako.cc/teaching/2015/cdsw-spring/harrypotter-wikipedia-cdsw.zip
- http://communitydata.cc/~mako/hp_wiki.tsv
Overview of the day
- Lecture
- Our philosophy around data visualization
- Introduce some new programming tools!
- We're going to walk through some analysis of edits to Harry Potter in Wikipedia, start to finish
- We'll focus on manipulating data in Python
- Visualizing things in Google Docs
- Lunch (vegetarian Greek!)
- Project based work
- Continue the Harry Potter on Wikipedia project (or analyze a topic of your own), doing analysis using Google Docs
- Matplotlib
- Civic Data - More interactive working on projects
- Room for you to work on your projects!
- Wrap-up!
Lecture outline
Step 1: Pre-Requisites
- My philosophy about data analysis: use the tools you have
- Four things in Python I have to teach you:
  - while loops
    - infinite loops
    - loops with a greater-than or less-than condition
  - break / continue
  - "\t".join()
  - defining your own functions with def foo(argument):
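A minimal sketch of these four constructs together (the names foo, countdown, and the toy values are just illustrative):

```python
# a while loop with a greater-than condition
countdown = 3
while countdown > 0:
    print(countdown)
    countdown = countdown - 1

# break / continue inside a loop
numbers = []
n = 0
while True:          # this would be an infinite loop without break
    n = n + 1
    if n % 2 == 0:
        continue     # skip even numbers and go back to the top
    if n > 9:
        break        # stop the otherwise-infinite loop
    numbers.append(n)
# numbers is now [1, 3, 5, 7, 9]

# "\t".join() glues a list of strings together with tab characters
row = "\t".join(["title", "timestamp", "minor"])

# defining your own function
def foo(argument):
    return argument * 2
```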
Step 2: Walking through a Program
- Walk-through of get_hpwp_dataset.py
- Look at the dataset with more and/or in a spreadsheet
Step 3: Loading Data Back In
- Load data into Python
  - review of opening files
    - we can also open them for reading with open('file', 'r', encoding="utf-8")
  - csv.DictReader()
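A sketch of the reading pattern, wrapped in a helper function so it is easy to reuse (the function name read_edits is ours, not from the workshop scripts, and the column names in hp_wiki.tsv are an assumption):

```python
import csv

def read_edits(path):
    """Read a tab-separated edit log into a list of dictionaries."""
    # encoding="utf-8" handles non-ASCII characters in article titles
    with open(path, "r", encoding="utf-8") as input_file:
        # DictReader keys each row by the file's header line; the
        # dataset is tab-separated, so we set the delimiter accordingly
        reader = csv.DictReader(input_file, delimiter="\t")
        return list(reader)
```

After downloading the dataset you could call `rows = read_edits("hp_wiki.tsv")` and loop over `rows`.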
- Basic counting: hpwp-minor.py
  - Answer the question: What proportion of edits to Wikipedia Harry Potter articles are minor?
    - Count the number of minor edits and calculate the proportion
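The counting pattern might look like this sketch, which uses a few toy rows in place of the real dataset (the "minor" field name and its "True"/"False" string values are assumptions about the file's columns):

```python
# toy stand-ins for rows read from the dataset
rows = [
    {"title": "Harry Potter", "minor": "True"},
    {"title": "Hermione Granger", "minor": "False"},
    {"title": "Hogwarts", "minor": "True"},
    {"title": "Quidditch", "minor": "False"},
]

num_edits = 0
num_minor = 0
for row in rows:
    num_edits = num_edits + 1          # count every edit
    if row["minor"] == "True":
        num_minor = num_minor + 1      # count only the minor ones

proportion = num_minor / num_edits
print(proportion)  # 0.5 for the toy rows above
```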
- Looking at time series data: hpwp-trend.py
  - "Bin" data by day to generate the trend line
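Binning by day means counting how many edits fall on each calendar date. A sketch with toy timestamps (the "2015-10-03T14:22:01" format is an assumption about how the dataset stores timestamps):

```python
# toy timestamps standing in for one per edit in the dataset
timestamps = [
    "2015-10-03T14:22:01",
    "2015-10-03T18:05:44",
    "2015-10-04T09:10:00",
]

edits_by_day = {}
for ts in timestamps:
    day = ts.split("T")[0]             # keep just the date part
    if day not in edits_by_day:
        edits_by_day[day] = 0          # first edit seen on this day
    edits_by_day[day] = edits_by_day[day] + 1

print(edits_by_day)  # {'2015-10-03': 2, '2015-10-04': 1}
```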
- Exporting and visualizing data
  - Export dataset on edits over time
  - Export dataset on articles over users
  - Load data into Google Docs
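Exporting is where "\t".join() pays off: write one tab-separated line per day, and Google Docs can import the resulting file. A sketch (the filename hp_edits_by_day.tsv and the toy counts are ours):

```python
# toy binned counts standing in for the real trend data
edits_by_day = {"2015-10-03": 2, "2015-10-04": 1}

with open("hp_edits_by_day.tsv", "w", encoding="utf-8") as output_file:
    output_file.write("day\tedits\n")  # header row
    for day in sorted(edits_by_day):
        # join the date and the count with a tab, one line per day
        output_file.write("\t".join([day, str(edits_by_day[day])]) + "\n")
```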