Wikipedia (CDSW)

From CommunityData

[[File:Wikipedia.png|right|250px]]
__NOTOC__

In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in the final session.

== Goals ==

* Get set up to build datasets with the Wikipedia API
* Have fun collecting different types of data from Wikipedia
* Practice reading and extending other people's code
* Create a few collections of different types of data from Wikipedia that you can do research with in the final session
== Download and test the Wikipedia API project ==
 
If you are confused by these steps, go back and refresh your memory with the [[Community Data Science Workshops (Fall 2015)/Day 0 setup and tutorial|Day 0 setup and tutorial]] and the [[Community Data Science Workshops (Fall 2015)/Day 0 tutorial|Day 0 tutorial]].
 
(Estimated time: 10 minutes)
 
===Download the Wikipedia API project===
 
* Right-click the following link, choose "Save Target as..." or "Save link as...", and save the file to your Desktop directory:
https://github.com/CommunityDataScienceCollective/wikipedia-cdsw/archive/master.zip
 
* The ".zip" extension on the file indicates that it is a compressed Zip archive, so we need to "extract" its contents. On Windows, click "Start" and then "Computer"; on a Mac, open Finder and navigate to your Desktop. Find <code>wikipedia-cdsw-master.zip</code> on your Desktop and double-click it to "unzip" it. That will create a folder called <code>wikipedia-cdsw-master</code> containing several files.
 
===Test the Wikipedia API code===
 
Start up your Jupyter notebook [[Python_in_Jupyter|as usual]]. With Jupyter open in your browser, navigate to the <code>wikipedia-cdsw-master</code> folder that you saved on your Desktop and open <code>wikipedia-test-anon_nonanon.ipynb</code>.
 
Run all the cells in the notebook. Cell 4 should take some time and output messages like <code>pulling data iteration 0</code>. Cell 5 should output a dictionary of IP addresses and counts, and cell 6 should output a dictionary of user names and counts.

If you don't get this output, or if you see any error messages, ask a mentor for help right away.
 
== Topics to cover in the session ==
 
=== Main Wikipedia API ===
* Explain the [http://www.mediawiki.org/wiki/API:Main_page MediaWiki API], which exists on other wikis as well
* Navigate to the [http://en.wikipedia.org/w/api.php api page] and show the [https://www.mediawiki.org/wiki/API:Revisions documentation], pointing out the examples
* Introduce the [https://en.wikipedia.org/wiki/Special:ApiSandbox API sandbox] as a tool for building queries
* Look at the images within a page: http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
* Change the city with a custom URL
* Get users' edit counts: http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill|Jtmorgan|Sj|Koavf&usprop=editcount&format=jsonfm
* Get the content of the main page: http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
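The browser URLs above can also be built in Python. This is a minimal sketch using only the standard library to assemble the query string; to actually fetch the result, pass the URL to any HTTP client (for example, `requests`):

```python
from urllib.parse import urlencode

API_URL = "https://en.wikipedia.org/w/api.php"

def build_query_url(**params):
    # format=json is the machine-readable output; the jsonfm used in the
    # browser examples above is a human-friendly rendering of the same data.
    params.setdefault("format", "json")
    return API_URL + "?" + urlencode(params)

# The same "images on a page" query as above, with the city swapped out:
url = build_query_url(action="query", titles="Portland", prop="images", imlimit=20)
print(url)
```

Fetching and decoding the result is then one line with, for example, `requests.get(url).json()`.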
 
=== Page View API ===
* Use [https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data the REST API]
* Explain that this API is a little different because it uses relative paths instead of parameters.
* Also note that this API is case-sensitive.
* Request for Panama Papers: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
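Because every value is a path segment, these URLs are easy to assemble programmatically. A minimal sketch; the default values below simply mirror the example request above:

```python
REST_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(article, project="en.wikipedia", access="all-access",
                  agent="user", granularity="daily",
                  start="20160401", end="20160420"):
    # Every value is a path segment rather than a key=value parameter,
    # and the article title is case-sensitive ("Panama_Papers" works,
    # "panama_papers" does not).
    return "/".join([REST_BASE, project, access, agent,
                     article, granularity, start, end])

print(pageviews_url("Panama_Papers"))
```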
 
== Questions to answer ==

Warm up questions:

<ol start="1">
<li>When was the article about the Panama Papers created? (i.e., when was the first edit to the article made?)</li>
<li>When was the most recent edit to the Panama Papers article?</li>
<li>Think of two articles that interest you. Which ones were created first?</li>
<li>Which have been edited most recently?</li>
</ol>
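The first two warm-up questions hinge on the `rvdir` parameter of the revisions API shown earlier: `rvdir=newer` returns the oldest revisions first (so the first result is the edit that created the article), while the default, `older`, returns the newest first. Below is a sketch of the parameters and of pulling a timestamp out of a response; the sample response is invented to show the shape of the JSON, and you would fetch a real one with any HTTP client, e.g. `requests.get("https://en.wikipedia.org/w/api.php", params=...).json()`:

```python
def revision_params(title, direction):
    # direction="newer" -> oldest revision first (the article's creation);
    # direction="older" (the API default) -> most recent revision first.
    return {
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "timestamp|user",
        "rvlimit": 1,
        "rvdir": direction,
        "format": "json",
    }

def first_revision_timestamp(response):
    # The result is keyed by an internal page ID, so iterate over pages.
    for page in response["query"]["pages"].values():
        return page["revisions"][0]["timestamp"]

# Invented response in the shape api.php returns, for illustration only:
sample = {"query": {"pages": {"50034356": {"revisions": [
    {"timestamp": "2016-04-03T17:59:05Z", "user": "ExampleUser"}]}}}}
print(first_revision_timestamp(sample))
```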


Medium difficulty questions:

<ol start="5">
<li>Print the users who made revisions to an article of your choice (try an article about a current event) in the last day.</li>
<li>How many edits did user "Freeknowledgecreator" make in the last day?</li>
<li>How many total edits did user "Freeknowledgecreator" make?</li>
</ol>

How many views did Panama_Papers have…

<ol start="8">
<li>the day it was created?</li>
<li>the first week?</li>
<li>How does this compare to the articles that interest you?</li>
</ol>
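Question 7 can reuse the edit-count query from the session topics above (`list=users` with `usprop=editcount`). A sketch of the parameters and of reading the response; the sample response is invented to show the JSON shape:

```python
def editcount_params(*usernames):
    # Mirrors the edit-count example from the session topics:
    # multiple usernames are separated by "|" in a single request.
    return {
        "action": "query",
        "list": "users",
        "ususers": "|".join(usernames),
        "usprop": "editcount",
        "format": "json",
    }

def edit_counts(response):
    # Map each username to its total edit count.
    return {u["name"]: u["editcount"] for u in response["query"]["users"]}

# Invented response in the shape api.php returns, for illustration only:
sample = {"query": {"users": [{"name": "Jtmorgan", "editcount": 12345}]}}
print(edit_counts(sample))
```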
 
How many edits did it get in…

<ol start="11">
<li>the first 24 hours?</li>
<li>the first week?</li>
<li>How many edits did the articles that interest you get?</li>
</ol>
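For questions 11–13, once you have collected an article's revision timestamps (for example with `rvprop=timestamp` and a large `rvlimit`), counting edits in a time window is plain Python. A sketch with invented timestamps:

```python
from datetime import datetime, timedelta

def edits_within(timestamps, hours):
    # Given revision timestamps (ISO 8601 strings, as returned in the
    # "timestamp" field of each revision), count how many fall within
    # the first `hours` hours after the earliest one.
    parsed = sorted(datetime.strptime(t, "%Y-%m-%dT%H:%M:%SZ") for t in timestamps)
    cutoff = parsed[0] + timedelta(hours=hours)
    return sum(1 for t in parsed if t < cutoff)

# Invented timestamps for illustration:
stamps = ["2016-04-03T17:59:05Z", "2016-04-03T20:12:00Z", "2016-04-05T09:00:00Z"]
print(edits_within(stamps, 24))   # two of the three fall in the first day
```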
More difficult questions:

<ol start="14">
<li>Who made the most total edits to the article?</li>
<li>What's the number of edits per day in the first two weeks of the article?</li>
<li>What's the peak number of edits per hour of the article? When did it occur?</li>
<li>Who were the top editors during that hour?</li>
<li>What day did it have the most views, and how many views did it have?</li>
<li>How many views did it have per day?</li>
<li>How many views did it have per day on German Wikipedia?</li>
<li>Who's the person named in the Panama Papers with the most views to their Wikipedia page?</li>
</ol>
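For the per-day questions, note that the date is a fixed-width prefix of each ISO 8601 timestamp, so grouping edits by day is a one-line `Counter`. A sketch with invented timestamps:

```python
from collections import Counter

def edits_per_day(timestamps):
    # The date is the first ten characters of each ISO timestamp
    # ("2016-04-03T17:59:05Z" -> "2016-04-03").
    return Counter(t[:10] for t in timestamps)

# Invented timestamps for illustration:
stamps = ["2016-04-03T17:59:05Z", "2016-04-03T20:12:00Z", "2016-04-05T09:00:00Z"]
print(edits_per_day(stamps))
```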


== Resources ==

* [https://en.wikipedia.org/w/api.php?action=help&modules=query API documentation for the query module]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample Wikipedia API queries]]
* [https://github.com/nettrom/wikipedia-session The session lecture notes (in Markdown) and python sources]

[[Category:Spring_2016 series]]