DS4UX (Spring 2016)/Panama Papers

''This page is a work in progress.''

In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data related to edits, and the other provides data related to pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in the final session.


''[[w:Panama_Papers| Panama Papers]]''
== Goals ==

* Get set up to build datasets with Wikipedia APIs
* Have fun collecting different types of data from Wikipedia
* Practice reading API documentation
* Practice testing API queries in an API Sandbox
* Practice reading and extending other people's code


== Download and test the Wikipedia API project ==

# Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory: https://jtmorgan.net/ds4ux/week6/panama-papers.zip
# Find panama-papers.zip on your Desktop and double-click on it to "unzip" it. That will create a folder called <code>panama-papers</code> containing several files.
# In PowerShell or Terminal, navigate to the <code>panama-papers</code> directory and type:


== Datasources ==


=== Wikipedia Edit API ===
* explain the [http://www.mediawiki.org/wiki/API:Main_page MediaWiki API], which also exists on other wikis
* navigate to the [http://en.wikipedia.org/w/api.php api page] and show the documentation, point out examples
* introduce the [https://en.wikipedia.org/wiki/Special:ApiSandbox API sandbox] as a tool for building queries
* look at the images within a page: http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
* change the city with a custom URL
* get edit counts for several users: http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill|Jtmorgan|Sj|Koavf&usprop=editcount&format=jsonfm
* get the content of the main page: http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
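The URL queries above can also be issued from Python with the <code>requests</code> library, which we use later in the exercises. Here is a minimal sketch of the edit-count query; the helper names (<code>editcount_params</code>, <code>get_edit_counts</code>) are our own, not part of the API:

```python
import requests

ENDPOINT = "https://en.wikipedia.org/w/api.php"

def editcount_params(usernames):
    """Build the parameter dict for a list=users edit-count query."""
    return {
        "action": "query",
        "list": "users",
        "ususers": "|".join(usernames),  # the API takes "|"-separated names
        "usprop": "editcount",
        "format": "json",
    }

def get_edit_counts(usernames):
    """Return {username: editcount} for the given accounts."""
    resp = requests.get(ENDPOINT, params=editcount_params(usernames))
    resp.raise_for_status()
    users = resp.json()["query"]["users"]
    return {u.get("name"): u.get("editcount") for u in users}

# Example:
# get_edit_counts(["Jtmorgan", "Koavf"])
```

Passing a dict to <code>params=</code> lets <code>requests</code> handle URL encoding for us, instead of pasting parameters into the URL by hand.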


=== Wikipedia Page View API ===
* Use [https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data the experimental API]
* This API is a little different because it uses relative paths instead of parameters.
* This API is case-sensitive.
* Request for Panama Papers: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
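Because this API uses relative paths, a query is built by filling the slots of the URL in order rather than by passing parameters. A sketch of that, with illustrative helper names of our own:

```python
import requests
from urllib.parse import quote

BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(project, article, start, end,
                  access="all-access", agent="user", granularity="daily"):
    """Build the pageview URL by filling the path slots in order.
    Titles are case-sensitive, and spaces become underscores."""
    title = quote(article.replace(" ", "_"), safe="")
    return "/".join([BASE, project, access, agent, title,
                     granularity, start, end])

def get_daily_views(project, article, start, end):
    """Fetch the JSON response and return {timestamp: views}."""
    resp = requests.get(pageviews_url(project, article, start, end))
    resp.raise_for_status()
    return {item["timestamp"]: item["views"] for item in resp.json()["items"]}

# Example:
# get_daily_views("en.wikipedia", "Panama Papers", "20160401", "20160420")
```

Note that <code>pageviews_url("en.wikipedia", "Panama Papers", "20160401", "20160420")</code> reproduces exactly the request URL shown above.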


== Exercises ==
 
=== Building queries in the API Sandbox ===
Using the Wikipedia edit API sandbox...
<ol start="1">
<li>When was the article about the Panama Papers created?</li>
<li>When was the most recent edit to the Panama Papers article?</li>
</ol>


How many views did ''Panama_Papers'' have…
<ol start="3">
<li>the day it was created?</li>
<li>the first week?</li>
</ol>


=== Building queries with Python <code>requests</code> ===
How many edits did it get in…
<ol start="9">
<li>the first 24 hours?</li>
<li>the first week?</li>
<li>Think of one or two articles that interest you. Which ones were created first?</li>
<li>Which have been edited most recently?</li>
<li>How does this compare to the articles that interest you?</li>
<li>How many edits did the articles that interest you get?</li>
</ol>
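One way to approach the "how many edits" questions with <code>requests</code>: the edit API's <code>rvstart</code>/<code>rvend</code>/<code>rvdir</code> parameters restrict revisions to a time window. A sketch; the timestamps in the example are placeholders, not the article's real creation time, so substitute the value you found in exercise 1:

```python
import requests

ENDPOINT = "https://en.wikipedia.org/w/api.php"

def revision_params(title, start, end):
    """Parameters that list revision timestamps between two ISO timestamps,
    oldest first (rvdir=newer, so rvstart is the older end of the window)."""
    return {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp",
        "rvdir": "newer",
        "rvstart": start,
        "rvend": end,
        "rvlimit": 500,
        "format": "json",
    }

def count_edits(title, start, end):
    """Count edits in the window, following API continuation so that busy
    articles with more than 500 revisions are not truncated."""
    params = revision_params(title, start, end)
    total = 0
    while True:
        data = requests.get(ENDPOINT, params=params).json()
        for page in data["query"]["pages"].values():
            total += len(page.get("revisions", []))
        if "continue" not in data:
            return total
        params.update(data["continue"])

# Example (replace the placeholder timestamps with the creation time
# you found in exercise 1):
# count_edits("Panama Papers", "2016-04-03T00:00:00Z", "2016-04-04T00:00:00Z")
```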


More difficult questions:


<ol start="12">
<li>Who has made the most edits to the article overall?</li>
<li>What’s the number of edits per day in the first two weeks of the article?</li>
<li>What’s the peak number of edits per hour of the article? When did it occur?</li>
<li>Who were the top editors during that hour?</li>
<li>What day did it have the most views, and how many views did it have?</li>
<li>How many views did it have per day?</li>
<li>How many views did it have per day on German Wikipedia?</li>
<li>Of the people named in the Panama Papers, whose Wikipedia page has the most views?</li>
</ol>
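For the view-count questions, a small helper that fetches daily views from the pageview API and picks the busiest day. This is a sketch with illustrative names (it repeats the fetch helper so the snippet stands alone):

```python
import requests

def get_daily_views(project, article, start, end):
    """Fetch daily pageview counts; returns {timestamp: views}."""
    url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
           f"{project}/all-access/user/{article}/daily/{start}/{end}")
    resp = requests.get(url)
    resp.raise_for_status()
    return {item["timestamp"]: item["views"] for item in resp.json()["items"]}

def peak_day(daily_views):
    """Return (timestamp, views) for the single busiest day."""
    return max(daily_views.items(), key=lambda kv: kv[1])

# Example:
# peak_day(get_daily_views("en.wikipedia", "Panama_Papers",
#                          "20160401", "20160430"))
```

Swapping <code>en.wikipedia</code> for <code>de.wikipedia</code> in the project slot answers the German Wikipedia variant of the question.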


== Resources ==

''Revision as of 03:16, 27 April 2016''