DS4UX (Spring 2016)/Panama Papers
Revision as of 03:25, 27 April 2016

This page is a work in progress.

In this project, we will explore a few ways to gather data using two Wikipedia APIs: one provides data related to edits, and the other provides data related to pageviews. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we can use to ask and answer questions in the final session.


Goals

  • Get set up to build datasets with Wikipedia APIs
  • Have fun collecting different types of data from Wikipedia
  • Practice reading API documentation
  • Practice testing API queries in an API Sandbox
  • Practice reading and extending other people's code

Download and test the Wikipedia API project

  1. Right-click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory:

https://jtmorgan.net/ds4ux/week6/panama-papers.zip

  2. Find panama-papers.zip on your Desktop and double-click on it to "unzip" it. That will create a folder called panama-papers containing several files.
  3. In PowerShell or Terminal, navigate to the panama-papers directory and type:


Datasources

Wikipedia Edit API


Wikipedia Page View API

Exercises

Building queries in the API Sandbox

Using the Wikipedia edit API sandbox...
  1. When was the article about the Panama Papers created?
  2. When was the most recent edit to the Panama Papers article?
  3. Think of one or two articles that interest you. Which ones were created first? Which have been edited most recently?
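Each of these sandbox queries is ultimately just a URL against the MediaWiki API at en.wikipedia.org/w/api.php. As a rough sketch of the kind of query the sandbox builds for you (the helper name revision_query is ours; the parameters are standard MediaWiki API revision parameters):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def revision_query(title, direction):
    """Build a MediaWiki API URL that fetches one revision of `title`.

    direction="newer" -> the first (creation) revision;
    direction="older" -> the most recent revision.
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user",
        "rvlimit": 1,
        "rvdir": direction,
        "format": "json",
    }
    return API + "?" + urlencode(params)

# A URL you can paste into a browser (or compare against the sandbox)
# to see when the Panama Papers article was created:
print(revision_query("Panama Papers", "newer"))
```

Switching direction to "older" gives the most recent edit instead; swapping in a different title answers the same questions for articles that interest you.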


Use the Wikipedia page view API sandbox...
  1. How many views did Panama Papers have on the day it was created?
  2. How many views did Panama Papers have yesterday?
  3. How do these numbers compare to the articles that interest you?
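The page view sandbox builds URLs against the Wikimedia Pageview REST API. A minimal sketch of its per-article URL pattern (the helper name and the example date range are ours; adjust the dates to the period you care about):

```python
def pageview_query(article, start, end, project="en.wikipedia"):
    """Build a Wikimedia Pageview API URL for daily views of `article`
    between `start` and `end` (YYYYMMDD strings).

    Article titles in this API use underscores, e.g. "Panama_Papers".
    """
    return ("https://wikimedia.org/api/rest_v1/metrics/pageviews/"
            "per-article/{}/all-access/all-agents/{}/daily/{}/{}"
            .format(project, article, start, end))

# Example: daily views over an (assumed) week in April 2016 --
# replace the dates with the article's actual creation week.
print(pageview_query("Panama_Papers", "20160403", "20160410"))
```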


Building queries with Python requests

  1. How many edits did Panama Papers get in the first 24 hours?
  2. How many views did Panama Papers have in the first week?
  3. How many edits did the articles that interest you get?
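The same queries you tested in the sandbox can be issued from Python with the requests library. A hedged sketch (the helper names are ours; rvdir, rvlimit, and rvprop are standard MediaWiki API parameters; the commented-out call at the end needs network access):

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def get_revision_timestamps(title, direction="newer", limit=50):
    """Fetch up to `limit` revision timestamps for `title`.

    direction="newer" walks forward from the article's creation;
    direction="older" walks backward from the latest edit.
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp",
        "rvlimit": limit,
        "rvdir": direction,
        "format": "json",
    }
    resp = requests.get(API, params=params)
    resp.raise_for_status()
    # The response nests pages under an ID we don't know in advance.
    page = next(iter(resp.json()["query"]["pages"].values()))
    return [rev["timestamp"] for rev in page.get("revisions", [])]

def edits_on_first_day(timestamps):
    """Count how many ISO-8601 timestamps share the first one's date
    (a calendar-day approximation of "the first 24 hours")."""
    first_day = timestamps[0][:10]  # "YYYY-MM-DD"
    return sum(t.startswith(first_day) for t in timestamps)

# Example usage (requires network):
# stamps = get_revision_timestamps("Panama Papers")
# print(edits_on_first_day(stamps), "edits on the article's first day")
```

If the article got more edits than the limit on its first day, you would need to page through results (the API returns a continuation token for this); counting first-week views works the same way against the pageview URL pattern above.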

Resources