DS4UX (Spring 2016)/Panama Papers

From CommunityData
This page is a work in progress.

In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data about edits, and the other provides data about pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we can use to ask and answer questions in the final session.

Panama Papers

Goals

  • Get set up to build datasets with Wikipedia APIs
  • Have fun collecting different types of data from Wikipedia
  • Practice reading API documentation
  • Practice testing API queries in an API Sandbox
  • Practice reading and extending other people's code

Download and test the Wikipedia API project

  1. Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory:

https://jtmorgan.net/ds4ux/week6/panama-papers.zip

  2. Find panama-papers.zip on your Desktop and double-click on it to "unzip" it. That will create a folder called panama-papers containing several files.
  3. In PowerShell or Terminal, navigate to the panama-papers directory and type:


Datasources

Wikipedia Edit API
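
As a sketch of what an edit-history query looks like, the snippet below uses the MediaWiki Action API's revisions module with the requests library. The endpoint and parameter names come from the public API; the article title and revision limit are just examples.

```python
import requests

# MediaWiki Action API endpoint for English Wikipedia
API_URL = "https://en.wikipedia.org/w/api.php"

def revision_params(title, limit=5):
    """Build query parameters for the `limit` most recent
    revisions of `title` (timestamps, users, edit comments)."""
    return {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }

if __name__ == "__main__":
    data = requests.get(API_URL, params=revision_params("Panama Papers")).json()
    # The revisions sit under a page ID we don't know ahead of time,
    # so iterate over the "pages" dictionary rather than indexing it
    for page in data["query"]["pages"].values():
        for rev in page["revisions"]:
            print(rev["timestamp"], rev["user"])
```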


Wikipedia Page View API
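
As a sketch of how the Page View API is queried: it is a REST API, so the request is a single URL rather than a set of parameters. The endpoint shape below is the public Wikimedia Pageview API; the article, date range, and User-Agent string are placeholders.

```python
import requests

def pageview_url(article, start, end):
    """Build a Pageview API URL for daily views of an English
    Wikipedia article between two YYYYMMDD dates (inclusive)."""
    return ("https://wikimedia.org/api/rest_v1/metrics/pageviews/"
            "per-article/en.wikipedia/all-access/all-agents/"
            f"{article}/daily/{start}/{end}")

if __name__ == "__main__":
    url = pageview_url("Panama_Papers", "20160403", "20160410")
    # Wikimedia asks API clients to send a descriptive User-Agent
    resp = requests.get(url, headers={"User-Agent": "ds4ux-example"})
    for item in resp.json()["items"]:
        print(item["timestamp"], item["views"])
```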

Exercises

Building queries in the API Sandbox

Using the Wikipedia edit API sandbox, answer the following questions:

  1. When was the article about the Panama Papers created?
  2. When was the most recent edit to the Panama Papers article?
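
The sandbox helps you assemble the query parameters interactively. As a sketch, a parameter set like the following (written here as a Python dict) would answer question 1 by requesting the single oldest revision; the parameter names are from the MediaWiki revisions module.

```python
# Parameters for the single oldest revision of the article:
# rvdir="newer" sorts revisions oldest-first, and rvlimit=1
# keeps only the first one -- its timestamp is the creation date.
oldest_params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Panama Papers",
    "rvprop": "timestamp|user",
    "rvdir": "newer",
    "rvlimit": 1,
    "format": "json",
}

# The default sort order is newest-first, so dropping rvdir
# returns the most recent edit instead (question 2).
newest_params = {k: v for k, v in oldest_params.items() if k != "rvdir"}
```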

How many views did Panama_Papers have…

  1. the day it was created?
  2. the first week?
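
Once you know the creation date, the Pageview API's per-day numbers can simply be summed. A minimal sketch, assuming the same per-article REST endpoint as above; the dates are placeholders to replace with the creation date you found, and the User-Agent value is made up.

```python
import requests

def total_views(items):
    """Sum the per-day view counts in a Pageview API response."""
    return sum(item["views"] for item in items)

def fetch_daily_views(article, start, end):
    """Fetch daily view items for an English Wikipedia article
    between two YYYYMMDD dates (inclusive)."""
    url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/"
           "per-article/en.wikipedia/all-access/all-agents/"
           f"{article}/daily/{start}/{end}")
    resp = requests.get(url, headers={"User-Agent": "ds4ux-example"})
    return resp.json()["items"]

if __name__ == "__main__":
    # Placeholder dates: substitute the article's actual creation date
    items = fetch_daily_views("Panama_Papers", "20160403", "20160409")
    print(total_views(items))
```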

Building queries with Python requests

How many edits did the Panama Papers article get in…

  1. the first 24 hours?
  2. the first week?
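
One way to answer these with requests is to count revisions inside a time window, following the API's continuation parameters so long histories are counted in full. A sketch, assuming the standard revisions module; the creation timestamp below is a placeholder to replace with the one you found in the sandbox exercise.

```python
import requests
from datetime import datetime, timedelta

API_URL = "https://en.wikipedia.org/w/api.php"
TS = "%Y-%m-%dT%H:%M:%SZ"

def window_end(start, hours):
    """Return the MediaWiki-style timestamp `hours` after `start`."""
    return (datetime.strptime(start, TS) + timedelta(hours=hours)).strftime(TS)

def count_edits(title, start, end):
    """Count revisions of `title` between `start` and `end`,
    following the API's `continue` parameters across batches."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp",
        "rvdir": "newer",   # oldest first, so rvstart is the earlier bound
        "rvstart": start,
        "rvend": end,
        "rvlimit": "max",
        "format": "json",
    }
    total = 0
    while True:
        data = requests.get(API_URL, params=params).json()
        for page in data["query"]["pages"].values():
            total += len(page.get("revisions", []))
        if "continue" not in data:
            return total
        params.update(data["continue"])

if __name__ == "__main__":
    creation = "2016-04-03T00:00:00Z"  # placeholder, not the real timestamp
    print(count_edits("Panama Papers", creation, window_end(creation, 24)))
```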
  • Think of one or two articles that interest you. Which of them was created first?
  • Which has been edited most recently?
  • How do these results compare to the Panama Papers article?
  • How many edits did each of your articles get?

Resources