Wikipedia (CDSW)

[[File:Wikipedia.png|right|250px]]
__NOTOC__
== Building a Dataset using the Wikipedia API ==
In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in future sessions.
=== Goals ===
* Get set up to build datasets with the Wikipedia API
* Have fun collecting different types of data from Wikipedia
* Practice reading and extending other people's code
* Create a few collections of different types of data from Wikipedia that you can do research with in the final section
=== Download and test the Wikipedia project ===
# Right-click the following link, choose "Save Target as..." or "Save link as...", and save the file to your Desktop directory: http://mako.cc/teaching/2015/community_data_science/wikipedia-data-examples.zip
# The ".zip" extension on the file indicates that it is a compressed Zip archive. We need to "extract" its contents (double-clicking the file, or right-clicking it and choosing an "Extract" option, usually works).
# Start up your terminal, navigate to the new directory you unpacked from <code>wikipedia-data-examples.zip</code>, and then test the code by running:
python wikipedia1-1.py
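If you want a sense of what these example scripts are doing, here is a minimal sketch of the kind of request they make. This is not the actual contents of <code>wikipedia1-1.py</code>; it assumes you have the <code>requests</code> library installed.

<syntaxhighlight lang="python">
import requests

# ask the English Wikipedia API for basic information about one article
parameters = {'action': 'query',
              'titles': 'Panama Papers',
              'format': 'json'}

response = requests.get('https://en.wikipedia.org/w/api.php', params=parameters)
data = response.json()

# the reply nests pages under query -> pages -> <pageid>
for page_id, page in data['query']['pages'].items():
    print(page_id, page['title'])
</syntaxhighlight>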
=== Important Background ===
* Wikipedia runs software called MediaWiki, which also powers many other wikis.
* You can go to the [http://en.wikipedia.org/w/api.php api.php page] on any MediaWiki wiki to see autogenerated documentation, or check out the [https://www.mediawiki.org/wiki/API:Main_page main documentation for the MediaWiki API on the MediaWiki website]. The sketch below shows the same query run against two different wikis.
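Because every MediaWiki wiki exposes the same <code>api.php</code> endpoint, the same code can be pointed at a different wiki just by changing the URL. A small sketch (the two endpoints below are only examples):

<syntaxhighlight lang="python">
import requests

# the same query works on any MediaWiki wiki; only the endpoint URL changes
endpoints = ['https://en.wikipedia.org/w/api.php',    # English Wikipedia
             'https://www.mediawiki.org/w/api.php']   # MediaWiki.org itself

for endpoint in endpoints:
    response = requests.get(endpoint, params={'action': 'query',
                                              'meta': 'siteinfo',
                                              'format': 'json'})
    # 'general' holds basic information about the wiki, including its name
    site = response.json()['query']['general']
    print(site['sitename'], '-', site['base'])
</syntaxhighlight>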
<!--
=== Material We Will Cover ===
* looking at the images within a page http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
* change the city with a custom URL
* edit count http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill|Jtmorgan|Sj|Mindspillage&usprop=editcount&format=jsonfm
* get the content of the main page http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
* example programs: [http://mako.cc/teaching/2014/cdsw-autumn/wikipedia-raw1-unicode-problems-example.py wikipedia-raw1-unicode-problems-example.py] (note: this is an example of Unicode problems when running this on Windows), [http://mako.cc/teaching/2014/cdsw-autumn/wikipedia-raw2-mudslide-edit.py wikipedia-raw2-mudslide-edit.py]-->
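The commented-out URLs above can also be issued from Python by turning each URL parameter into an entry in a <code>params</code> dictionary. For example, the edit-count query might look like this (a sketch, not one of the workshop scripts):

<syntaxhighlight lang="python">
import requests

# edit counts for several users, equivalent to the ususers=... URL above
parameters = {'action': 'query',
              'list': 'users',
              'ususers': 'Benjamin_Mako_Hill|Jtmorgan|Sj|Mindspillage',
              'usprop': 'editcount',
              'format': 'json'}

response = requests.get('https://en.wikipedia.org/w/api.php', params=parameters)
for user in response.json()['query']['users']:
    print(user['name'], user['editcount'])
</syntaxhighlight>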
=== Questions to answer ===
# Warm Up
## When was the article about the Panama Papers created?
## When was the most recent edit to the Panama Papers article?
## Think of one or two articles that interest you. Which ones were created first?
## Which have been edited most recently?
# How many views did Panama_Papers have…
## the day it was created?
## the first week?
## How does this compare to the articles that interest you?
# How many edits did it get in…
## the first 24 hours?
## the first week?
## How many edits did the articles that interest you get?
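One way to approach the warm-up questions (a sketch; the workshop scripts may approach this differently) is to ask the API for the oldest revision of an article, which tells you when it was created and by whom:

<syntaxhighlight lang="python">
import requests

# ask for the oldest revision of an article to find its creation date
parameters = {'action': 'query',
              'titles': 'Panama Papers',
              'prop': 'revisions',
              'rvdir': 'newer',        # oldest revision first; drop this to get the newest instead
              'rvlimit': 1,
              'rvprop': 'timestamp|user',
              'format': 'json'}

response = requests.get('https://en.wikipedia.org/w/api.php', params=parameters)
for page in response.json()['query']['pages'].values():
    first_revision = page['revisions'][0]
    print('created on', first_revision['timestamp'], 'by', first_revision['user'])
</syntaxhighlight>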
=== More difficult questions ===
# Who made the most total edits to the article?
# What's the number of edits per day in the first two weeks of the article?
# What's the peak number of edits per hour of the article? When did it occur?
# Who were the top editors during that hour?
# What day did it have the most views, and how many views did it have?
# Who is the person named in the Panama Papers with the most views to their Wikipedia page?
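For the harder questions you will need the article's full revision history, which the API returns in batches that you page through with a continuation token. Here is a rough sketch of counting edits per editor (the parameter choices are assumptions; adjust them to your question):

<syntaxhighlight lang="python">
import requests
from collections import Counter

# page through the full revision history of an article, counting edits per editor
edits_per_user = Counter()
parameters = {'action': 'query',
              'titles': 'Panama Papers',
              'prop': 'revisions',
              'rvprop': 'timestamp|user',
              'rvlimit': 500,          # the maximum number of revisions per request for most clients
              'rvdir': 'newer',
              'continue': '',
              'format': 'json'}

while True:
    response = requests.get('https://en.wikipedia.org/w/api.php', params=parameters)
    data = response.json()
    for page in data['query']['pages'].values():
        for revision in page.get('revisions', []):
            edits_per_user[revision.get('user', '(hidden)')] += 1
    if 'continue' not in data:
        break
    parameters.update(data['continue'])   # carry the continuation token into the next request

# the five editors with the most edits to the article
for user, count in edits_per_user.most_common(5):
    print(user, count)
</syntaxhighlight>

The timestamp on each revision can be grouped by day or hour in the same loop to answer the edits-per-day and edits-per-hour questions.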
=== Resources ===
* [https://www.mediawiki.org/wiki/API:Main_page Main MediaWiki API Documentation]
* [https://en.wikipedia.org/w/api.php Autogenerated API Documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample API queries]]
