Harry Potter on Wikipedia

From CommunityData
[[File:Wikipedia.png|right|250px]]
__NOTOC__

== Building a Dataset using the Wikipedia API ==


In this project, we will build on the work from the lecture to begin analyzing data from Wikipedia. Once we've done that, we will extend the code to create our own sub-datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions!
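The dataset-building step below talks to the Wikipedia API. As a rough sketch of what such a request looks like, here is a small function that builds a standard MediaWiki action API query URL for a page's revision history. The parameter choices here are illustrative; the real <code>build_hpwp_dataset.py</code> may structure its requests differently.

```python
from urllib.parse import urlencode

def revisions_query_url(title):
    """Build a MediaWiki API URL asking for a page's revision history.

    Uses the standard action API: action=query with prop=revisions.
    The page title passed in is just an example.
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|flags",  # who edited, when, and minor-edit flags
        "rvlimit": "500",                  # maximum revisions per request
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)

print(revisions_query_url("Harry Potter"))
```

Pasting the printed URL into a browser shows the raw JSON the API returns.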


=== Download and test the HPWP project ===

# Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory: http://mako.cc/teaching/2015/cdsw-autumn/harrypotter-wikipedia-cdsw.zip
# The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents.
# Start up your terminal, then navigate to the new directory you have unpacked, called <code>harrypotter-wikipedia-cdsw</code>.
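If you would rather do the extraction step from Python than from your file manager, the standard-library <code>zipfile</code> module can unpack the archive. This is just an alternative to the right-click approach above; the Desktop paths in the comment are the ones used in the instructions.

```python
import zipfile

def extract_archive(archive_path, dest_dir):
    """Unpack a .zip archive into dest_dir using only the standard library."""
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(dest_dir)

# Following the steps above, the archive lives on the Desktop, e.g.:
#   extract_archive(os.path.expanduser("~/Desktop/harrypotter-wikipedia-cdsw.zip"),
#                   os.path.expanduser("~/Desktop"))
```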
 
=== Download a Dataset ===
 
There are two ways to download a dataset. You can either:
 
# Run the program <code>build_hpwp_dataset.py</code>, which will download the data from the Wikipedia API. This will take 10 minutes or so.
# Download a "pre-made" version I have generated on my computer by using the right-click, "Save link as..." approach on this URL: http://communitydata.cc/~mako/hp_wiki.tsv
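Whichever way you get it, <code>hp_wiki.tsv</code> is a tab-separated file with a header row. A minimal sketch of loading it with the standard <code>csv</code> module (the filename comes from the instructions above; the column names are whatever the header row says, so the code reads them rather than assuming them):

```python
import csv

def load_tsv(path):
    """Read a tab-separated file with a header row into a list of dicts,
    one dict per edit, keyed by the column names in the header."""
    with open(path, encoding="utf-8") as f:
        return list(csv.DictReader(f, delimiter="\t"))

# rows = load_tsv("hp_wiki.tsv")
# print(len(rows), "edits; columns:", list(rows[0].keys()))
```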
 
=== Test Code ===
 
Once you have downloaded both the code and the dataset, you can test it by running:
 
 python hpwp-minor.py
 
This should output three lines and three numbers.
 
== Example programs ==
 
;<code>hpwp-minor.py</code>: This program aims to answer the question: ''What proportion of edits to Wikipedia Harry Potter articles are minor?''
 
;<code>hpwp-trend.py</code>: This program builds time series data by "binning" data by day to generate the trend line.
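The "binning" idea in <code>hpwp-trend.py</code> can be sketched like this, assuming each edit carries an ISO-style timestamp such as <code>2015-11-02T02:58:00Z</code> (the exact column layout in the real dataset may differ):

```python
from collections import defaultdict

def edits_per_day(timestamps):
    """Count edits per calendar day by truncating each timestamp
    to its YYYY-MM-DD prefix."""
    bins = defaultdict(int)
    for ts in timestamps:
        bins[ts[:10]] += 1
    return dict(bins)

print(edits_per_day(["2015-11-01T10:00:00Z",
                     "2015-11-01T12:30:00Z",
                     "2015-11-02T09:00:00Z"]))
# {'2015-11-01': 2, '2015-11-02': 1}
```

Slicing <code>ts[:7]</code> instead gives YYYY-MM keys, which is one way to approach the bin-by-month challenge below.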
 
=== Coding Challenges ===
 
 
# Who are the 5 most active editors of Harry Potter articles on Wikipedia? How many edits have they made?
# What are the most edited Harry Potter articles on Wikipedia?
# Create a graph in a spreadsheet of the trend line (i.e., edits per day over time) for the most active editor. How about one graph with the three most active editors?
# Create graphs in a spreadsheet of the trend lines (i.e., edits per day over time) for the three most popular articles.
# Instead of "binning" your dataset by day, change your code to bin it by month for each of the two previous questions.
# Pick a different topic in Wikipedia and download a new dataset. Answer the questions above for this other dataset.
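As a starting point for the first challenge, <code>collections.Counter</code> makes "count edits per user, then take the top 5" very short. The <code>"user"</code> key here is an assumption about the dataset's header row; substitute whatever column name your file actually uses.

```python
from collections import Counter

def top_editors(rows, n=5):
    """Return the n users with the most edits as (user, count) pairs.

    Assumes each row is a dict with a 'user' key, as produced by
    csv.DictReader over a file whose header includes that column.
    """
    return Counter(row["user"] for row in rows).most_common(n)

print(top_editors([{"user": "A"}, {"user": "B"}, {"user": "A"}], n=2))
# [('A', 2), ('B', 1)]
```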
 
[[Category:CDSW]]

Latest revision as of 02:58, 2 November 2015
