In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data related to edits, and the other provides data related to pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we can use to ask and answer questions in the final session.
Goals
- Get set up to build datasets with Wikipedia APIs
- Have fun collecting different types of data from Wikipedia
- Practice reading API documentation
- Practice testing API queries in an API Sandbox
- Practice reading and extending other people's code
Download and test the Wikipedia API project
- Right-click the following file, choose "Save Target as..." or "Save link as...", and save it to your Desktop directory: https://jtmorgan.net/ds4ux/week6/panama-papers.zip
- Find panama-papers.zip on your Desktop and double-click it to unzip it. This will create a folder called panama-papers containing several files.
- In PowerShell or Terminal, navigate to the panama-papers directory and type:
Data sources
Wikipedia Edit API
- The API sandbox: a tool for building queries (a code sketch of an equivalent request appears below)
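Before opening the sandbox, it can help to see what one of these requests looks like in code. Below is a minimal sketch using Python's requests library; the endpoint and parameter names come from the standard MediaWiki Action API, but the particular fields and revision limit are just example choices:

```python
import requests

# Standard MediaWiki Action API endpoint for English Wikipedia
ENDPOINT = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",           # read data about pages
    "prop": "revisions",         # ask for the edit (revision) history
    "titles": "Panama Papers",   # the article we care about
    "rvprop": "ids|timestamp|user|comment",  # fields to return per revision
    "rvlimit": 5,                # example limit: the five most recent edits
    "format": "json",
}

data = requests.get(ENDPOINT, params=params).json()

# Results are keyed by an internal page id, so iterate instead of hard-coding it.
for page in data["query"]["pages"].values():
    for rev in page["revisions"]:
        print(rev["timestamp"], rev["user"], rev["comment"])
```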
Wikipedia Page View API
- Use the experimental API
- This API is a little different: it takes its arguments as segments of the URL path rather than as query parameters.
- This API is also case-sensitive, so article titles must match Wikipedia's capitalization exactly.
- Example request for the Panama Papers article (the same request in Python is sketched below): https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
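Here is the same request made with Python requests, as a minimal sketch. The User-Agent value is a placeholder: the Wikimedia REST API asks clients to identify themselves, so substitute your own contact information.

```python
import requests

# The whole query is encoded in the URL path:
# project / access method / agent type / article / granularity / start / end
url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420"
)

# Placeholder contact string; replace with your own.
headers = {"User-Agent": "ds4ux-example (your-email@example.com)"}

response = requests.get(url, headers=headers)
for item in response.json()["items"]:
    print(item["timestamp"], item["views"])
```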
Exercises
Building queries in the API Sandbox
- Using the Wikipedia edit API sandbox... (a code sketch for checking your answers follows this list)
- When was the most recent edit to the Panama Papers article?
- When was the article about the Panama Papers created?
- Using the Wikipedia page view API sandbox...
- How many views did Panama Papers have on the day it was created?
- How many views did Panama Papers have yesterday?
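If you want to check your answers to the two edit-history questions in code, the sketch below is one way to do it. It leans on the Action API's rvdir parameter, which controls the direction in which revisions are listed: "older" (the default) starts from the newest revision, while "newer" starts from the oldest, i.e. the creation edit.

```python
import requests

ENDPOINT = "https://en.wikipedia.org/w/api.php"

def edge_revision_timestamp(title, direction):
    """Return the timestamp of the first revision listed in the given direction."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp",
        "rvlimit": 1,                # we only need one revision from this end
        "rvdir": direction,          # "older" = newest first; "newer" = oldest first
        "format": "json",
    }
    data = requests.get(ENDPOINT, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page["revisions"][0]["timestamp"]

print("Most recent edit:", edge_revision_timestamp("Panama Papers", "older"))
print("Article created: ", edge_revision_timestamp("Panama Papers", "newer"))
```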
Building queries with Python requests
- How many views did the Panama Papers article have in its first week?
- How many edits did the Panama Papers article receive in its first 24 hours? (One possible approach to both questions is sketched below.)
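One way to approach the first of these, sketched below under the assumption that "first week" means the creation day plus the six days that follow: look up the creation timestamp with the edit API, then ask the pageview API for daily counts over that range and sum them. The second exercise can be tackled the same way using the edit API's rvstart, rvend, and rvdir parameters.

```python
from datetime import datetime, timedelta
import requests

HEADERS = {"User-Agent": "ds4ux-example (your-email@example.com)"}  # placeholder contact

# Step 1: find the creation timestamp via the edit API (oldest revision first).
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Panama Papers",
    "rvprop": "timestamp",
    "rvlimit": 1,
    "rvdir": "newer",   # oldest revision first = the creation edit
    "format": "json",
}
data = requests.get("https://en.wikipedia.org/w/api.php", params=params, headers=HEADERS).json()
page = next(iter(data["query"]["pages"].values()))
created = datetime.strptime(page["revisions"][0]["timestamp"], "%Y-%m-%dT%H:%M:%SZ")

# Step 2: sum daily pageviews from the creation day through six days later.
start = created.strftime("%Y%m%d")
end = (created + timedelta(days=6)).strftime("%Y%m%d")
url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    f"en.wikipedia/all-access/user/Panama_Papers/daily/{start}/{end}"
)
views = requests.get(url, headers=HEADERS).json()
print("Views in the first week:", sum(item["views"] for item in views.get("items", [])))
```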