Harry Potter on Wikipedia


Building a Dataset using the Wikipedia API

In this project, we will build on the work from the lecture to begin analyzing data from Wikipedia. Once we've done that, we will extend this code to create our own sub-datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions!

Download and test the HPWP project

  1. Right-click the following link, choose "Save Target as..." or "Save link as...", and save the file to your Desktop directory: http://mako.cc/teaching/2015/community_data_science/harrypotter-wikipedia-cdsw.zip (original version used in class)
  2. The ".zip" extension on the above file indicates that it is a compressed Zip archive. Extract its contents (on most systems, double-clicking the file or right-clicking it and choosing an "Extract" option will do this); a Python alternative is sketched after this list.
  3. Start up your terminal and navigate to the new directory created when you unpacked the archive, called harrypotter-wikipedia-cdsw.
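
If you would rather unpack the archive programmatically, here is a minimal sketch using Python's standard zipfile module. It assumes you saved the zip file to your Desktop; adjust the paths if you saved it somewhere else.

# Minimal sketch: unpack the workshop archive with Python's standard
# library. Assumes the zip file was saved to your Desktop.
import os
import zipfile

desktop = os.path.expanduser("~/Desktop")
archive = os.path.join(desktop, "harrypotter-wikipedia-cdsw.zip")

with zipfile.ZipFile(archive) as zf:
    zf.extractall(desktop)  # creates the harrypotter-wikipedia-cdsw directory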

Download a Dataset

There are two ways to download a dataset. You can either:

  1. Run the program build_hpwp_dataset.py, which will download the data from the Wikipedia API. This will take 10 minutes or so. (A simplified sketch of the kind of API request it makes appears after this list.)
  2. Download a "pre-made" version that I generated by running the program on my computer, using the same right-click, "Save link as..." approach with this URL: http://communitydata.cc/~mako/hp_wiki.tsv (original TSV version used in class)
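
For the curious, here is a simplified sketch of the kind of request build_hpwp_dataset.py makes against the Wikipedia (MediaWiki) API, using the requests library. The article title and parameter values here are illustrative; the real script covers many Harry Potter articles and follows the API's continuation mechanism to fetch every revision.

# Simplified, illustrative API call: fetch recent revision metadata for
# one article. Not the real build_hpwp_dataset.py, which covers many
# articles and pages through all of their revisions.
import requests

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Harry Potter",  # illustrative choice of article
    "rvprop": "timestamp|user|size|flags",  # "flags" includes the minor marker
    "rvlimit": 50,
    "format": "json",
}
response = requests.get("https://en.wikipedia.org/w/api.php", params=params)

pages = response.json()["query"]["pages"]
for page in pages.values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"])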

Test Code

Once you have downloaded both the code and the dataset, you can test them by running:

python hpwp-minor.py

This should print three lines of output, one number per line.
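
Those numbers come from counting rows in the dataset. As a rough sketch of the kind of counting involved (not the script itself), the program reads the TSV and tallies minor edits. The column name "minor" and the string "True" below are assumptions; check the header row of hp_wiki.tsv and the real hpwp-minor.py for the actual names.

# Rough sketch of counting minor edits in the TSV. The column name
# "minor" and the string "True" are assumptions; verify against the file.
import csv

total = 0
minor = 0
with open("hp_wiki.tsv", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        total += 1
        if row["minor"] == "True":  # assumed encoding of the minor flag
            minor += 1

print(total)
print(minor)
print(minor / total)  # proportion of edits marked minor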

Example programs

hpwp-minor.py
This program aims to answer the question: What proportion of edits to Wikipedia Harry Potter articles are minor?
hpwp-trend.py
This program builds a time series by "binning" the edit data by day to generate a trend line (a sketch of this binning step follows).
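
Here is a sketch of what day-level "binning" can look like: group each edit by the date portion of its timestamp and count the edits per day. It assumes a "timestamp" column in MediaWiki's ISO 8601 format (e.g. 2015-01-31T12:34:56Z); the real hpwp-trend.py may name or parse things differently.

# Sketch of binning edits by day to build a trend line. Assumes an ISO
# 8601 "timestamp" column; adjust the parsing to match the real data.
import csv
from collections import defaultdict

edits_by_day = defaultdict(int)
with open("hp_wiki.tsv", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        day = row["timestamp"][:10]  # keep just the YYYY-MM-DD part
        edits_by_day[day] += 1

for day in sorted(edits_by_day):
    print(day + "\t" + str(edits_by_day[day]))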