DS4UX (Spring 2016)/Panama Papers
[[File:Wikipedia.png|right|250px]]


In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data related to edits, and the other provides data related to pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that you can use as the basis of your Final Project.


This project is adapted from material being developed for the [[CDSW|Community Data Science Workshops]] by Ben Lewis and Morten Wang ([https://github.com/nettrom/wikipedia-session GitHub repo]).


== Overview ==
In this project we will look at the viewing and editing history of a recently created Wikipedia article about a breaking news event—''[[w:Panama_Papers| Panama Papers]]''. When events of global significance occur, Wikipedia is often among the first places that people look for information about these events. By examining both the editing and viewing history of this article, we can learn a lot about how people create ''and'' consume information on Wikipedia.
 
The process by which 'breaking news' articles are created on Wikipedia is [http://dgergle.soc.northwestern.edu/resources/KeeganGergleContractor_StayingInTheLoop_WikiSym2012.pdf a fascinating area of research] for data scientists who study how humans work together. For more links to interesting research on Wikipedia, see the [[DS4UX_(Spring_2016)/Wikipedia_API#Research_using_Wikipedia_data|Resources section]] of this page.
 
=== Goals ===
 
* Understand the basic anatomy of an API request
* Have fun collecting different types of data from Wikipedia
* Practice reading API documentation
* Practice testing API queries in an [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* Practice reading and extending other people's code
* Create a few collections of different types of data from Wikipedia that you can do research with in the final session


=== Download and test the Panama Papers project ===
 
# Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory: https://jtmorgan.net/ds4ux/week6/panama-papers.zip
# Find <code>panama-papers.zip</code> on your Desktop and double-click on it to "unzip" it. That will create a folder called <code>panama-papers</code> containing several files.
# In PowerShell or Terminal, navigate to the <code>panama-papers</code> directory and type:


 python wikipedia-1.py


at the command prompt to execute the <code>wikipedia-1.py</code> Python script. Wait a moment while your computer connects to Wikipedia. You should see the result of a query from the Wikipedia API on your screen. If you don't, let Jonathan or Ray know.
(Estimated time: 10 minutes)


=== About the APIs ===


;Wikipedia Edit API: An API that provides data on edits to Wikipedia pages. Use the [https://en.wikipedia.org/wiki/Special:ApiSandbox API sandbox] to build and test queries with this API.
* Also known as "the MediaWiki API", because it is the standard API for all sites that use [https://en.wikipedia.org/wiki/MediaWiki MediaWiki].  
* You can use this API to edit Wikipedia, and do all sorts of other things. We will just be using [https://www.mediawiki.org/wiki/API:Query the Query module of the API] to gather data about individual pages and editors.
* Example request for Panama Papers (current content of the page): https://en.wikipedia.org/w/api.php?format=json&action=query&titles=Panama_Papers&prop=revisions&rvprop=content
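* If you want to try this request from Python rather than in your browser, here is a minimal sketch using the <code>requests</code> library. It sends the same parameters as the example URL above; nothing here is specific to Panama Papers except the title.

<syntaxhighlight lang="python">
# Minimal sketch: the same query as the example URL above, built with
# a params dictionary instead of pasting everything into the URL.
import requests

response = requests.get("https://en.wikipedia.org/w/api.php",
                        params={"format": "json",
                                "action": "query",
                                "titles": "Panama_Papers",
                                "prop": "revisions",
                                "rvprop": "content"})
data = response.json()   # parse the JSON response into a dictionary
print(data)
</syntaxhighlight>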


* The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents. To do this, click on "Start", then "Computer" on Windows or open Finder and navigate to your Desktop directory if you are a Mac. Find <code>wikipedia-cdsw.zip</code> on your Desktop and double-click on it to "unzip" it. That will create a folder called <code>wikipedia-cdsw</code> containing several files.


;Wikipedia Page View API: Use [https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data the (experimental!) Wikipedia page view API] to find out how many views a given Wikipedia article gets on a daily basis.
* This API is a little different than the Wikipedia edit API because it uses relative paths (<code>/.../.../</code>) instead of parameters (<code>somekey=someval&otherkey=otherval</code>).
* This API is case-sensitive, and space-sensitive (replace spaces in page titles with underscores). It also requires you to [http://www.w3schools.com/tags/ref_urlencode.asp URL encode] any special characters (like !, $, # and %) that might appear in a page title.
* Example request for Panama Papers (page views 4/01 - 4/20): https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
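* Because this API takes its arguments as path segments, a Python sketch builds the URL by joining strings rather than passing a <code>params</code> dictionary. The sketch below assumes the same date range as the example URL above.

<syntaxhighlight lang="python">
# Minimal sketch: page views via the REST API's relative-path style.
import requests

title = "Panama_Papers"  # spaces already replaced with underscores
url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/user/" + title + "/daily/20160401/20160420")
data = requests.get(url).json()
for day in data["items"]:            # one entry per day
    print(day["timestamp"], day["views"])
</syntaxhighlight>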


== Exercises ==
<div style="background-color:#CEE7DA; width:80%; padding:1.2em;">
=== Building queries in the Wikipedia editing API Sandbox ===


;When was the article about the Panama Papers created?


* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=revisions&list=&meta=&titles=Panama_Papers&rvprop=ids%7Ctimestamp%7Cflags%7Ccomment%7Cuser&rvlimit=1&rvdir=newer View query in sandbox]
* [https://en.wikipedia.org/w/api.php?action=query&format=json&prop=revisions&list=&meta=&titles=Panama_Papers&rvprop=ids%7Ctimestamp%7Cflags%7Ccomment%7Cuser&rvlimit=1&rvdir=newer View result in browser]
* [https://www.mediawiki.org/wiki/API:Revisions View API:Revisions documentation]




;When was the most recent edit to the Panama Papers article?


* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=revisions&list=&meta=&titles=Panama_Papers&rvprop=ids%7Ctimestamp%7Cflags%7Ccomment%7Cuser&rvlimit=1&rvdir=older View query in sandbox]
* [https://en.wikipedia.org/w/api.php?action=query&format=json&prop=revisions&list=&meta=&titles=Panama_Papers&rvprop=ids%7Ctimestamp%7Cflags%7Ccomment%7Cuser&rvlimit=1&rvdir=older View result in browser]
* [https://www.mediawiki.org/wiki/API:Revisions View API:Revisions documentation]




; How many edits has the creator of Panama Papers made to Wikipedia?


* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&list=users&usprop=editcount%7Cregistration&ususers=Czar View query in sandbox]
* [https://en.wikipedia.org/w/api.php?action=query&format=json&list=users&usprop=editcount%7Cregistration&ususers=Czar View result in browser]
* [https://www.mediawiki.org/wiki/API:Users View API:Users documentation]
 
 
;What was the text of the Panama Papers article 24 hours after it was created?
 
* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=revisions&titles=Panama+Papers&rvprop=ids%7Ctimestamp%7Ccontent&rvstart=2016-04-04T17%3A58%3A00.000Z&rvend=2016-04-04T17%3A59%3A05.000Z&rvdir=newer View query in sandbox]
* [https://en.wikipedia.org/w/api.php?action=query&format=json&prop=revisions&titles=Panama+Papers&rvprop=ids%7Ctimestamp%7Ccontent&rvstart=2016-04-04T17%3A58%3A00.000Z&rvend=2016-04-04T17%3A59%3A05.000Z&rvdir=newer View result in browser]
* [https://www.mediawiki.org/wiki/API:Revisions View API:Revisions documentation]
* [https://en.wikipedia.org/w/index.php?title=Panama_Papers&oldid=713548359 View the text of this revision on Wikipedia]
* [https://en.wikipedia.org/w/index.php?title=Panama_Papers&diff=713548359&oldid=713548357 View the "diff" version of revision] (shows what was changed between this edit and the previous one)
 
 
;Who has edited Panama Papers?
* [https://www.mediawiki.org/wiki/API:Contributors View API:Contributors documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=contributors&list=&titles=Panama+Papers&pclimit=500 View query in sandbox]
 
 
;What categories is Panama Papers in?
* [https://www.mediawiki.org/wiki/API:Categories View API:Categories documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=categories&list=&titles=Panama+Papers&clprop=timestamp&cllimit=500 View query in API sandbox]
 
 
;What other articles are in the ''Category'' Panama Papers?
* [https://www.mediawiki.org/wiki/API:Categorymembers View API:Categorymembers documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=&list=categorymembers&titles=Panama+Papers&cmtitle=Category%3APanama_Papers&cmprop=ids%7Ctitle%7Ctimestamp&cmtype=page&cmlimit=50 View query in sandbox]
 
 
;What other articles does Panama Papers link to?
* [https://www.mediawiki.org/wiki/API:Links View API:Links documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=links&list=&titles=Panama+Papers&plnamespace=0&pllimit=500 View query in API sandbox]
 
 
<br/>
 
=== Building queries in the Wikipedia page view API Sandbox ===
;How many views did Panama Papers have during its first week?
 
*[https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/all-access/all-agents/Panama_Papers/daily/20160403/20160409 View results in browser]
 
;How many views did Panama Papers have during its first week from mobile devices?


* [https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/mobile-web/all-agents/Panama_Papers/daily/20160403/20160409 View results for mobile website in browser]
* [https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia.org/mobile-app/all-agents/Panama_Papers/daily/20160403/20160409 View results for mobile app in browser]
<br/>


=== Building queries with Python <code>requests</code> ===
Now that we're comfortable building API queries in the sandbox, we will focus on how we can access these APIs with Python. If you would like to review the steps involved in building an API query in Python, check out the resources listed below.
* [https://github.com/makoshark/wikipedia-cdsw/blob/master/building-a-query.md Querying APIs from Python], a written lecture by Ben Lewis that walks you step-by-step through the process of building and executing an API query in Python. The 'companion script' <code>building_a_query_code.py</code> in the project directory executes all of the code shown in this lecture step-by-step. If you want to execute only some of the code in the lecture, comment out everything below the blocks of code you want to run before you execute the script.
* <code>wikipedia-1.py</code> — This is the script you were asked to execute to 'test' your code when you downloaded the project. It's also a valid API request that gathers metadata about the first revision to Panama Papers, and prints it to your terminal. The JSON that this query returns can be seen here: https://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Panama_Papers&rvdir=newer&rvlimit=1&format=jsonfm
* <code>introduce_while.py</code> — (in project directory) this script uses a while loop to roll two 'virtual dice' until they both come up sixes. This example doesn't make API calls or use Wikipedia data—the point is to help you understand that sometimes you will need to loop through an operation (like an API request) an indeterminate number of times. In these situations, a 'while' loop is more appropriate than a 'for' loop.
* <code>introduce_continue.py</code> — (in project directory) this script shows you two ways to use the value of the 'continue' key that is embedded inside the JSON returned by your API request. Each API request returns a chunk of data, but there may be more data available! By passing the value of 'continue' back in subsequent requests, you can pick up where the last request left off.
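To see how these pieces fit together, here is a rough sketch that combines the 'while' loop and the 'continue' key: it keeps requesting revisions of Panama Papers and merges each response's 'continue' values back into the request parameters until no more data is available. The exact keys inside 'continue' vary by module, so treat this as illustrative rather than exact.

<syntaxhighlight lang="python">
# Sketch: page through all revisions using the 'continue' key.
import requests

params = {"format": "json",
          "action": "query",
          "titles": "Panama_Papers",
          "prop": "revisions",
          "rvprop": "ids|timestamp|user",
          "rvdir": "newer",
          "rvlimit": 50}
revisions = []
while True:   # we don't know in advance how many requests we'll need
    data = requests.get("https://en.wikipedia.org/w/api.php",
                        params=params).json()
    for page in data["query"]["pages"].values():
        revisions.extend(page.get("revisions", []))
    if "continue" not in data:
        break                          # no 'continue' key: we're done
    params.update(data["continue"])    # pick up where we left off
print(len(revisions), "revisions collected")
</syntaxhighlight>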




;1. How many edits did Panama Papers receive in its first 24 hours?


Completing this exercise requires you to first query the Wikipedia API to get revisions to the article that have a timestamp between 2016/4/3 17:59:05 and 2016/4/4 17:59:05, and then use Python to count the number of revisions in the resulting JSON.
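One possible sketch of that approach, assuming the timestamps above and the 'continue' pattern described earlier:

<syntaxhighlight lang="python">
# Sketch: count revisions made during the article's first 24 hours.
import requests

params = {"format": "json",
          "action": "query",
          "titles": "Panama_Papers",
          "prop": "revisions",
          "rvprop": "ids|timestamp",
          "rvdir": "newer",
          "rvstart": "2016-04-03T17:59:05Z",
          "rvend": "2016-04-04T17:59:05Z",
          "rvlimit": 500}
count = 0
while True:
    data = requests.get("https://en.wikipedia.org/w/api.php",
                        params=params).json()
    for page in data["query"]["pages"].values():
        count += len(page.get("revisions", []))
    if "continue" not in data:
        break
    params.update(data["continue"])
print(count, "edits in the first 24 hours")
</syntaxhighlight>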




;2. How many views has Panama Papers received from mobile devices since it was created?
<ol start="1">
<li>When was the article about the Panama Papers created?</li>
<li>When was the most recent edit to the Panama Papers article?</li>
<li>Think of one or two articles that interest you. Which ones were created first?</li>
<li>Which have been edited most recently?</li>
</ol>


Completing this exercise requires you to perform two queries with the Wikipedia pageview API, because there are ''two'' types of mobile device counts—<code>mobile-app</code> and <code>mobile-web</code>—and you can only query them one at a time.
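One way to sketch this, assuming a placeholder end date (you would substitute the current date):

<syntaxhighlight lang="python">
# Sketch: sum mobile-web and mobile-app views since the article was
# created. The end date 20160420 is just a placeholder.
import requests

base = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        "en.wikipedia.org/{access}/all-agents/Panama_Papers/daily/"
        "20160403/20160420")
total_mobile = 0
for access in ["mobile-web", "mobile-app"]:
    data = requests.get(base.format(access=access)).json()
    total_mobile += sum(day["views"] for day in data["items"])
print(total_mobile, "mobile views")
</syntaxhighlight>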


<ol start="5">
<li>the day it was created?</li>
<li>the first week? </li>
<li>How does this compare to the articles that interest you? </li>
</ol>


;3. How many times was Panama Papers viewed in the first week? What proportion of those views came from mobile devices?


Completing this exercise also requires two API requests: one to gather pageview data for ALL devices, and another that only gathers data about devices that viewed the page using the [https://en.m.wikipedia.org/wiki/Main_Page Wikipedia mobile website].
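A sketch of that comparison, reusing the first-week date range from the earlier exercise:

<syntaxhighlight lang="python">
# Sketch: what share of first-week views came from the mobile website?
import requests

base = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        "en.wikipedia.org/{access}/all-agents/Panama_Papers/daily/"
        "20160403/20160409")

def weekly_views(access):
    data = requests.get(base.format(access=access)).json()
    return sum(day["views"] for day in data["items"])

all_views = weekly_views("all-access")
mobile_views = weekly_views("mobile-web")   # add mobile-app the same way
print(all_views, "total views;",
      round(100.0 * mobile_views / all_views, 1), "% from the mobile website")
</syntaxhighlight>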


 
<ol start="12">
;4. How many other articles has [https://en.wikipedia.org/wiki/User_talk:Czar User:Czar] edited on Wikipedia since they created Panama Papers?
* [https://www.mediawiki.org/wiki/API:Usercontribs View documentation for API:Usercontribs]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=&list=usercontribs&titles=&uclimit=500&ucstart=2016-04-04T17%3A59%3A00.000Z&ucend=2016-05-01T00%3A22%3A24.000Z&ucuser=Czar&ucdir=newer&ucnamespace=0&ucprop=ids%7Ctitle%7Ctimestamp%7Ccomment%7Csize View example query in API sandbox]
 
The sample query above returns a list of edits by Czar between 4/03 and 5/01 that includes the ''title'' of each article that was edited. How would you iterate over this JSON data in Python to come up with a list containing the title of every article Czar edited, with no duplicates?
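One possible answer, as a sketch: reuse the parameters from the sandbox query and collect titles in a Python set, which keeps only one copy of each title.

<syntaxhighlight lang="python">
# Sketch: build a duplicate-free collection of the articles Czar
# edited in the given window.
import requests

params = {"format": "json",
          "action": "query",
          "list": "usercontribs",
          "ucuser": "Czar",
          "ucstart": "2016-04-04T17:59:00Z",
          "ucend": "2016-05-01T00:22:24Z",
          "ucdir": "newer",
          "ucnamespace": 0,
          "ucprop": "ids|title|timestamp",
          "uclimit": 500}
titles = set()
while True:
    data = requests.get("https://en.wikipedia.org/w/api.php",
                        params=params).json()
    for contrib in data["query"]["usercontribs"]:
        titles.add(contrib["title"])   # sets silently drop duplicates
    if "continue" not in data:
        break
    params.update(data["continue"])
print(len(titles), "distinct articles:", sorted(titles))
</syntaxhighlight>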
<br/>
 
== Coding challenges ==
 
Once you have worked through these exercises and feel confident that you can gather data from the Wikipedia edit and pageview APIs using Python, you can get started on [[DS4UX_(Spring_2016)/Day_6_coding_challenge|this week's coding challenges]]!
<br/>


== Resources ==
* [https://en.wikipedia.org/w/api.php?action=help&modules=query API documentation for the query module]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample Wikipedia API queries|More sample Wikipedia API queries]]
 
 
=== Research using Wikipedia data ===
* ''[https://www.research.ibm.com/visual/projects/history_flow/ HistoryFlow]'' — A colorful visualization of the development of Wikipedia articles over time.
* [http://www.brianckeegan.com/papers/CSCW_2015.pdf ‘Is’ to ‘Was’: Coordination and Commemoration on Posthumous Wikipedia Biographies] — an exploration of editing patterns around Wikipedia articles about people who have recently died.
* [http://www.brianckeegan.com/papers/ICS_2015.pdf WikiWorthy: Judging a Candidate’s Notability in the Community] — A study that uses the editing activity on Wikipedia articles about political candidates as a predictor of election success.
 
 
=== Websites that use the MediaWiki API ===
* [http://listen.hatnote.com/ Listen to Wikipedia] — a dynamic, audiovisual experience based on what is being edited on Wikipedia right now.
* [http://histography.io/ Histography] — an interactive timeline of world events based on Wikipedia articles.
* [https://www.google.com/intl/es419/insidesearch/features/search/knowledge.html Google's Knowledge Graph]


[[Category:DS4UX (Spring 2016)]]