DS4UX (Spring 2016)/Panama Papers


This page is a work in progress.

In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data related to edits, and the other provides data related to pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in the final session.


Goals

  • Understand the basic anatomy of an API request
  • Have fun collecting different types of data from Wikipedia
  • Practice reading API documentation
  • Practice testing API queries in an API Sandbox
  • Practice reading and extending other people's code

Download and test the Wikipedia API project

Panama Papers data sources

Wikipedia Edit API
An API that provides data on edits to Wikipedia pages. Use the API sandbox (https://en.wikipedia.org/wiki/Special:ApiSandbox) to build and test queries with this API.
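
As a preview of how a sandbox query looks in code, here is a minimal sketch of one possible request made with Python and the requests library (the specific parameters are just an example, not a required approach):

import requests

# A minimal example query: ask the MediaWiki API for the five most recent
# revisions of the Panama Papers article.
ENDPOINT = "https://en.wikipedia.org/w/api.php"

parameters = {
    "action": "query",
    "prop": "revisions",
    "titles": "Panama Papers",
    "rvprop": "timestamp|user|comment",
    "rvlimit": 5,
    "format": "json",
}

data = requests.get(ENDPOINT, params=parameters).json()

# Revisions are nested under data["query"]["pages"], keyed by page ID.
for page in data["query"]["pages"].values():
    for revision in page["revisions"]:
        print(revision["timestamp"], revision["user"])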


Wikipedia Page View API
Use the (experimental!) Wikipedia page view API (https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data) to find out how many views a given Wikipedia article gets on a daily basis.

  • Note: This API is a little different from the Wikipedia edit API because it uses relative paths (/.../.../) instead of parameters (somekey=someval&otherkey=otherval).
  • This API is case-sensitive and space-sensitive (replace spaces in page titles with underscores). It also requires you to URL encode (http://www.w3schools.com/tags/ref_urlencode.asp) any special characters (like !, $, # and %) that might appear in a page title.
  • Example request for Panama Papers: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
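
Here is a minimal sketch of how the example request above could be made with Python and the requests library; the running total is just for illustration:

import requests

# Fetch daily page view counts for Panama Papers (April 1-20, 2016) using the
# example request above, then print each day's count and the total.
url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420")

data = requests.get(url).json()

total_views = 0
for item in data["items"]:
    print(item["timestamp"], item["views"])
    total_views += item["views"]

print("Total views:", total_views)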


Exercises

Building queries in the Wikipedia editing API Sandbox

When was the article about the Panama Papers created?


When was the most recent edit to the Panama Papers article?


How many edits has the creator of Panama Papers made to Wikipedia?


What was the text of the Panama Papers article 24 hours after it was created?


Who has edited Panama Papers?


What categories is Panama Papers in?


What other articles are in Category:Panama Papers?


What other articles does Panama Papers link to?



Building queries in the Wikipedia page view API Sandbox

How many views did Panama Papers have during its first week?
How many views did Panama Papers have during its first week from mobile devices?


Building queries with Python requests

1. How many edits did Panama Papers receive in its first 24 hours?

Completing this exercise requires you to first query the Wikipedia API to get revisions to the article that have a timestamp between 2016/4/3 17:59:05 and 2016/4/4 17:59:05, and then use Python to count the number of revisions in the resulting JSON.
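
One possible sketch of that approach with requests is below (not the official answer; an article with more than 500 revisions in the window would also require following API continuation):

import requests

# Count revisions to Panama Papers made in the 24 hours after it was created.
ENDPOINT = "https://en.wikipedia.org/w/api.php"

parameters = {
    "action": "query",
    "prop": "revisions",
    "titles": "Panama Papers",
    "rvprop": "timestamp",
    "rvdir": "newer",                   # list oldest revisions first
    "rvstart": "2016-04-03T17:59:05Z",  # article creation time
    "rvend": "2016-04-04T17:59:05Z",    # 24 hours later
    "rvlimit": 500,
    "format": "json",
}

data = requests.get(ENDPOINT, params=parameters).json()

edit_count = 0
for page in data["query"]["pages"].values():
    edit_count += len(page["revisions"])

print("Edits in the first 24 hours:", edit_count)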


2. How many views has Panama Papers received from mobile devices since it was created?

Completing this exercise requires you to perform two queries with the Wikipedia pageview API, because there are two types of mobile device counts—mobile-app and mobile-web—and you can only query them one at a time.
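
A minimal sketch of that two-query approach is below; the end date (20160501) is just a stand-in for the current date:

import requests

# Query the page view API once for each mobile access type and add the
# daily counts together.
BASE = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        "en.wikipedia/{access}/user/Panama_Papers/daily/20160403/20160501")

mobile_views = 0
for access in ["mobile-app", "mobile-web"]:
    data = requests.get(BASE.format(access=access)).json()
    for item in data["items"]:
        mobile_views += item["views"]

print("Mobile views since creation:", mobile_views)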


3. How many times was Panama Papers viewed in the first week? What proportion of those views came from mobile devices?


4. How many other articles has User:Czar edited on Wikipedia since they created Panama Papers?

The sample query above returns a list of edits by Czar between 4/03 and 5/01 that includes the title of each article that was edited. How would you iterate over this JSON data in Python to come up with a list that contains the title of every article Czar edited, with no duplicates?
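
One way to sketch this in Python, assuming a list=usercontribs query roughly like the sample (the sample query's exact parameters may differ), is to collect the titles into a set, which discards duplicates automatically:

import requests

# Fetch Czar's contributions between April 3 and May 1, 2016, and build a
# duplicate-free collection of the article titles. More than 500
# contributions would require following API continuation.
ENDPOINT = "https://en.wikipedia.org/w/api.php"

parameters = {
    "action": "query",
    "list": "usercontribs",
    "ucuser": "Czar",
    "ucstart": "2016-05-01T00:00:00Z",  # newer end of the range
    "ucend": "2016-04-03T00:00:00Z",    # older end of the range
    "uclimit": 500,
    "format": "json",
}

data = requests.get(ENDPOINT, params=parameters).json()

titles = set()
for contribution in data["query"]["usercontribs"]:
    titles.add(contribution["title"])

print(len(titles), "distinct pages edited")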

Coding challenges

Once you have worked through these exercises and feel confident that you can gather data from the Wikipedia edit and pageview APIs using Python, you can get started on this week's coding challenges!

Resources