DS4UX (Spring 2016)/Panama Papers
In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data related to edits, and the other provides data related to pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that you can use as the basis of your Final Project.
This project is adapted from material being developed for the Community Data Science Workshops by Ben Lewis and Morten Wang.
Overview
In this project we will look at the viewing and editing history of a recently created Wikipedia article about a breaking news event: Panama Papers. When events of global significance occur, Wikipedia is often among the first places that people look for information about these events. By examining both the editing and viewing history of this article, we can learn a lot about how people create and consume information on Wikipedia.
Goals
- Understand the basic anatomy of an API request
- Have fun collecting different types of data from Wikipedia
- Practice reading API documentation
- Practice testing API queries in an API Sandbox
- Practice reading and extending other people's code
Download and test the Panama Papers project
- Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory: https://jtmorgan.net/ds4ux/week6/panama-papers.zip
- Find panama-papers.zip on your Desktop and double-click on it to "unzip" it. That will create a folder called panama-papers containing several files.
- In PowerShell or Terminal, navigate to the panama-papers directory and type python wikipedia-1.py at the command prompt to execute the wikipedia-1.py Python script. Wait a little while as your computer connects to Wikipedia. You should see the result of a query to the Wikipedia API on your screen. If you don't, let Jonathan or Ray know.
About the APIs
- Wikipedia Edit API
- An API that provides data on edits to Wikipedia pages. Use the API sandbox to build and test queries with this API.
- Also known as "the MediaWiki API", because it is the standard API for all sites that use MediaWiki.
- You can use this API to edit Wikipedia, and do all sorts of other things. We will just be using the Query module of the API to gather data about individual pages and editors.
- Example request for Panama Papers (current content of the page): https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Panama_Papers&prop=revisions&rvprop=content
- Wikipedia Page View API
- Use the (experimental!) Wikipedia page view API to find out how many views a given Wikipedia article gets on a daily basis.
- This API is a little different from the Wikipedia edit API because it uses path segments (/.../.../) instead of query parameters (somekey=someval&otherkey=otherval). A side-by-side sketch of both request styles follows this list.
- This API is case-sensitive and space-sensitive (replace spaces in page titles with underscores). It also requires you to URL-encode any special characters (like !, $, # and %) that might appear in a page title.
- Example request for Panama Papers (page views 4/01 - 4/20): https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
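Both example requests above can also be made from Python. Here is a minimal sketch using the requests library (assumed to be installed); the parameter values simply mirror the two example URLs, and the variable names are only illustrative.

```python
import requests

# Edit (MediaWiki) API: options are passed as key=value query parameters.
edit_api = "https://en.wikipedia.org/w/api.php"
params = {
    "format": "json",
    "action": "query",
    "titles": "Panama_Papers",
    "prop": "revisions",
    "rvprop": "content",
}
edit_response = requests.get(edit_api, params=params)
print(edit_response.json().keys())

# Page view API: options are encoded in the URL path itself.
pageview_url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420"
)
pageview_response = requests.get(pageview_url)
print(pageview_response.json().keys())
```

Notice that the first request builds its options from a dictionary of parameters, while the second bakes everything into the path, which is why spelling, capitalization, and underscores matter for the page view API.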
Exercises
Building queries in the Wikipedia editing API Sandbox
- When was the article about the Panama Papers created?
- When was the most recent edit to the Panama Papers article?
- How many edits has the creator of Panama Papers made to Wikipedia?
- What was the text of the Panama Papers article 24 hours after it was created?
- Who has edited Panama Papers?
- What categories is Panama Papers in?
- What other articles are in the Category Panama Papers?
- What other articles does Panama Papers link to?
Building queries in the Wikipedia page view API Sandbox
- How many views did Panama Papers have during its first week?
- How many views did Panama Papers have during its first week from mobile devices?
Building queries with Python requests
Now that we're comfortable building API queries in the sandbox, we will focus on how we can access these APIs with Python. If you would like to review the steps involved in building an API query in Python, check out the resources listed below.
- Querying APIs from Python, a written lecture by Ben Lewis that walks you step-by-step through the process of building and executing an API query in Python. The 'companion script' building_a_query_code.py in the project directory executes all of the code shown in this lecture step-by-step. If you want to execute just some of the code in the lecture, comment out all the stuff below the blocks of code you want to execute before you run the script.
- wikipedia-1.py — This is the script you were asked to execute to 'test' your code when you downloaded the project. It's also a valid API request that gathers metadata about the first revision to Panama Papers, and prints it to your terminal. The JSON that this query returns can be seen here: https://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Panama_Papers&rvdir=newer&rvlimit=1&format=jsonfm
- introduce_while.py — (in project directory) this script uses a while loop to roll two 'virtual dice' until they both come up 6's. This example doesn't make API calls or use Wikipedia data—the point is to help you understand that sometimes you will need to loop through an operation (like an API request) an indeterminate number of times. In these situations, a 'while' loop is more appropriate than a 'for' loop.
- introduce_continue.py — (in project directory) this script shows you two ways to use the value of the 'continue' key that is embedded inside the JSON returned by your API request. Each API request returns a chunk of data, but there may be more data available! By passing the value of 'continue' back in subsequent requests, you can pick up where the last request left off. A minimal sketch of this pattern appears below.
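To make the 'continue' pattern concrete, here is a minimal sketch (separate from the project's introduce_continue.py, whose exact code is not reproduced here) that uses a while loop to page through all revisions of Panama Papers. The property choices and variable names are illustrative.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Panama_Papers",
    "rvprop": "timestamp|user",
    "rvlimit": 500,
    "rvdir": "newer",
    "format": "json",
}

revisions = []
while True:
    data = requests.get(API_URL, params=params).json()
    for page in data["query"]["pages"].values():
        revisions.extend(page.get("revisions", []))
    if "continue" in data:
        # Merge the continue values into the next request so it picks up
        # where this one left off.
        params.update(data["continue"])
    else:
        break

print(len(revisions), "revisions retrieved")
```

Because you can't know in advance how many chunks the API will return, a while loop that checks for the 'continue' key is the natural fit here, exactly as introduce_while.py suggests.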
- 1. How many edits did Panama Papers receive in its first 24 hours?
Completing this exercise requires you to first query the Wikipedia API to get revisions to the article that have a timestamp between 2016/4/3 17:59:05 and 2016/4/4 17:59:05, and then use Python to count the number of revisions in the resulting JSON.
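One possible approach is sketched below; the timestamps come from the exercise description above, while the rvlimit value and variable names are just illustrative choices.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

# Ask for revisions made in the article's first 24 hours, oldest first.
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Panama_Papers",
    "rvprop": "timestamp",
    "rvlimit": 500,
    "rvdir": "newer",
    "rvstart": "2016-04-03T17:59:05Z",
    "rvend": "2016-04-04T17:59:05Z",
    "format": "json",
}

data = requests.get(API_URL, params=params).json()
page = list(data["query"]["pages"].values())[0]
# If 'continue' appears in the response, there are more revisions than
# fit in one chunk and you would need to keep requesting (see the while
# loop sketch above).
print(len(page["revisions"]), "edits in the first 24 hours")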
- 2. How many views has Panama Papers received from mobile devices since it was created?
Completing this exercise requires you to perform two queries with the Wikipedia pageview API, because there are two types of mobile device counts—mobile-app and mobile-web—and you can only query them one at a time. A sketch of this approach follows.
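Here is a minimal sketch; the end date of 2016/05/01 is an assumption used only for illustration. It sums the daily counts returned for each of the two mobile access methods.

```python
import requests

BASE = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        "en.wikipedia/{access}/user/Panama_Papers/daily/20160403/20160501")

mobile_total = 0
for access in ("mobile-app", "mobile-web"):
    # One request per mobile access method, then sum the daily counts.
    data = requests.get(BASE.format(access=access)).json()
    mobile_total += sum(item["views"] for item in data["items"])

print("Mobile views since creation:", mobile_total)
```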
- 3. How many times was Panama Papers viewed in the first week? What proportion of those views came from mobile devices?
Completing this exercise also requires more than one API request: one to gather pageview data for ALL devices, and then further requests (mobile-app and mobile-web, as in the previous exercise) to gather mobile pageview data so you can compute the proportion of mobile views. A sketch follows.
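A minimal sketch, assuming the article's first week runs 2016/04/03 through 2016/04/09 (the exact date range is an assumption):

```python
import requests

URL = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/{access}/user/Panama_Papers/daily/20160403/20160409")

def total_views(access):
    # Sum the daily view counts for one access method.
    data = requests.get(URL.format(access=access)).json()
    return sum(item["views"] for item in data["items"])

all_views = total_views("all-access")
mobile_views = total_views("mobile-app") + total_views("mobile-web")

print("Views in the first week:", all_views)
print("Proportion from mobile: {:.1%}".format(mobile_views / all_views))
```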
- 4. How many other articles has User:Czar edited on Wikipedia since they created Panama Papers?
The sample query above returns a list of edits by Czar between 4/03 and 5/01 that includes the title of each article that was edited. How would you iterate over this JSON data in Python to come up with a list that contains the title of every article Czar edited, with no duplicates?
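One way to collect the unique titles is sketched below. It assumes the usercontribs module of the query API and the 4/03-5/01 date range mentioned above, and uses a Python set to drop duplicates; the parameter choices are illustrative rather than the only valid ones.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "usercontribs",
    "ucuser": "Czar",
    "ucprop": "title|timestamp",
    "ucnamespace": 0,          # articles only
    "uclimit": 500,
    "ucdir": "newer",
    "ucstart": "2016-04-03T00:00:00Z",
    "ucend": "2016-05-01T00:00:00Z",
    "format": "json",
}

titles = set()                 # a set keeps each title only once
while True:
    data = requests.get(API_URL, params=params).json()
    for contrib in data["query"]["usercontribs"]:
        titles.add(contrib["title"])
    if "continue" in data:
        params.update(data["continue"])
    else:
        break

titles.discard("Panama Papers")  # exclude the article itself
print(len(titles), "other articles edited")
```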
Coding challenges
Once you have worked through these exercises and feel confident that you can gather data from the Wikipedia edit and pageview APIs using Python, you can get started on this week's coding challenges!
Resources
- API documentation for the query module
- API Sandbox
- Sample Wikipedia API queries
- The session lecture notes (in Markdown) and Python sources.
Research using Wikipedia data
- HistoryFlow — A colorful visualization of the development of Wikipedia articles over time.
- ‘Is’ to ‘Was’: Coordination and Commemoration on Posthumous Wikipedia Biographies — an exploration of editing patterns around Wikipedia articles about people who have recently died.
- WikiWorthy: Judging a Candidate’s Notability in the Community — A study that uses the editing activity on Wikipedia articles about political candidates as a predictor of election success.
Websites that use the MediaWiki API
- Listen to Wikipedia — a dynamic, audiovisual experience based on what is being edited on Wikipedia right now.
- HistoryGraph — an interactive timeline of world events based on Wikipedia articles.
- Google's knowledge graph