[[File:Wikipedia.png|right|250px]]
__NOTOC__


In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in the final session.


== Goals ==


* Get set up to build datasets with the Wikipedia API
* Have fun collecting different types of data from Wikipedia
* Practice reading and extending other people's code
* Create a few collections of different types of data from Wikipedia that you can do research with in the final session


== Download and test the Wikipedia API project ==


If you are confused by these steps, go back and refresh your memory with the [[Community Data Science Workshops (Fall 2015)/Day 0 setup and tutorial|Day 0 setup and tutorial]] and [[Community Data Science Workshops (Fall 2015)/Day 0 tutorial|Day 0 tutorial]].

(Estimated time: 10 minutes)


=== Download the Wikipedia API project ===


* Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory:

https://github.com/CommunityDataScienceCollective/wikipedia-cdsw/archive/master.zip


* The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents. To do this, click on "Start", then "Computer" on Windows or open Finder and navigate to your Desktop directory if you are a Mac. Find <code>wikipedia-cdsw-master.zip</code> on your Desktop and double-click on it to "unzip" it. That will create a folder called <code>wikipedia-cdsw-master</code> containing several files.
=== Test the Wikipedia API code ===
Start up your Jupyter Notebook [[Python_in_Jupyter|as usual]]. With Jupyter open in your browser, navigate to the <code>wikipedia-cdsw-master</code> folder that you saved on your Desktop and open <code>wikipedia-test-anon_nonanon.ipynb</code>.

Run all the cells in the notebook. Cell 4 should take some time and output messages like <code>pulling data iteration 0</code>. Cell 5 should output a dictionary of IP addresses and counts, and cell 6 should output a dictionary of user names and counts.

If you don't get this output, or if you see any error messages, ask a mentor for help right away.
== Topics to cover in the session ==
=== Main Wikipedia API ===
* Explain [http://www.mediawiki.org/wiki/API:Main_page MediaWiki], the software that runs Wikipedia and many other wikis
* Navigate to the [http://en.wikipedia.org/w/api.php api page] and show the [https://www.mediawiki.org/wiki/API:Revisions revisions documentation], pointing out examples
* Introduce the [https://en.wikipedia.org/wiki/Special:ApiSandbox API sandbox] as a tool for building queries
* Look at the images within a page: http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
* Change the city with a custom URL
* Get edit counts: http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill|Jtmorgan|Sj|Koavf&usprop=editcount&format=jsonfm
* Get the content of the main page: http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
* Example programs: [http://mako.cc/teaching/2014/cdsw-autumn/wikipedia-raw1-unicode-problems-example.py wikipedia-raw1-unicode-problems-example.py] (note: this demonstrates Unicode problems when run on Windows) and [http://mako.cc/teaching/2014/cdsw-autumn/wikipedia-raw2-mudslide-edit.py wikipedia-raw2-mudslide-edit.py] (see also the Python sketch below)
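
For instance, here is a minimal Python sketch of the images query above. Two assumptions in this sketch: the <code>requests</code> library is installed, and <code>format=json</code> replaces <code>jsonfm</code> so the response comes back as machine-readable JSON rather than pretty-printed HTML.

<syntaxhighlight lang="python">
import requests

# Same query as the Seattle images URL above, built from parameters.
response = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "titles": "Seattle",
        "prop": "images",
        "imlimit": 20,
        "format": "json",
    },
)
data = response.json()

# Results are nested under query -> pages, keyed by page ID.
for page in data["query"]["pages"].values():
    for image in page.get("images", []):
        print(image["title"])
</syntaxhighlight>

Changing <code>titles</code> to another city reproduces the "custom URL" bullet above.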


=== Page View API ===
* Use [https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data the REST API]
* Explain that this API is a little different because it encodes the request in the URL path instead of query parameters (see the sketch below).
* Also note that this API is case-sensitive.
* Request for Panama Papers: https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
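
A sketch of the same request from Python, again assuming <code>requests</code>; the URL is exactly the Panama Papers example above, and the response is JSON with an <code>items</code> list.

<syntaxhighlight lang="python">
import requests

# Daily pageview counts for the Panama Papers article,
# April 1-20, 2016. The whole request lives in the URL path,
# and the article title ("Panama_Papers") is case-sensitive.
url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420"
)
data = requests.get(url).json()

for item in data["items"]:
    print(item["timestamp"], item["views"])
</syntaxhighlight>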


== Questions to answer ==

Warm-up questions:

<ol start="1">
<li>When was the article about the Panama Papers created? (i.e., when was the first edit to the article made; one approach is sketched below)</li>
<li>When was the most recent edit to the Panama Papers article?</li>
<li>Think of two articles that interest you. Which ones were created first?</li>
<li>Which have been edited most recently?</li>
</ol>
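
For question 1, one approach (a sketch, assuming <code>requests</code>) is to ask the revisions module for the article's oldest revision:

<syntaxhighlight lang="python">
import requests

# rvdir=newer sorts revisions oldest-first, and rvlimit=1 keeps
# only the very first edit to the article.
response = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "titles": "Panama Papers",
        "prop": "revisions",
        "rvprop": "timestamp|user",
        "rvdir": "newer",
        "rvlimit": 1,
        "format": "json",
    },
)
data = response.json()

for page in data["query"]["pages"].values():
    rev = page["revisions"][0]
    print(rev["timestamp"], rev["user"])
</syntaxhighlight>

Dropping <code>rvdir</code> (the API's default is newest-first) answers question 2 the same way.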


Medium difficulty questions:

<ol start="5">
<li>Print the users who made revisions to an article of your choice (try an article about a current event) in the last day.</li>
<li>How many edits did user "Freeknowledgecreator" make in the last day?</li>
<li>How many total edits did user "Freeknowledgecreator" make? (see the sketch below)</li>
</ol>
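
For question 7, a sketch using the same <code>list=users</code> query shown in the session outline above:

<syntaxhighlight lang="python">
import requests

# Total edit count for one user, via the list=users module.
response = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "users",
        "ususers": "Freeknowledgecreator",
        "usprop": "editcount",
        "format": "json",
    },
)
data = response.json()

for user in data["query"]["users"]:
    print(user["name"], user["editcount"])
</syntaxhighlight>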


How many views did Panama_Papers have…

<ol start="8">
<li>the day it was created?</li>
<li>the first week?</li>
<li>How does this compare to the articles that interest you?</li>
</ol>

How many edits did it get in…

<ol start="11">
<li>the first 24 hours? (see the sketch below)</li>
<li>the first week?</li>
<li>How many edits did the articles that interest you get?</li>
</ol>
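
For question 11, you can bound a revisions query with <code>rvstart</code> and <code>rvend</code>. The sketch below also follows the API's <code>continue</code> mechanism, since a busy article can have more than the 500 revisions a single request returns; the timestamps are placeholders to replace with the article's real creation time.

<syntaxhighlight lang="python">
import requests

# Count revisions between two timestamps.
params = {
    "action": "query",
    "titles": "Panama Papers",
    "prop": "revisions",
    "rvprop": "timestamp",
    "rvdir": "newer",                   # oldest-first
    "rvstart": "2016-04-03T00:00:00Z",  # placeholder window start
    "rvend": "2016-04-04T00:00:00Z",    # placeholder window end
    "rvlimit": 500,                     # the per-request maximum
    "format": "json",
}

total = 0
while True:
    data = requests.get("https://en.wikipedia.org/w/api.php", params=params).json()
    for page in data["query"]["pages"].values():
        total += len(page.get("revisions", []))
    if "continue" not in data:
        break
    params.update(data["continue"])  # carry the continuation token forward

print(total, "edits in the window")
</syntaxhighlight>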
More difficult questions:

<ol start="14">
<li>Who made the most edits to the article in total?</li>
<li>What’s the number of edits per day in the first two weeks of the article? (see the sketch below)</li>
<li>What’s the peak number of edits per hour of the article? When did it occur?</li>
<li>Who were the top editors during that hour?</li>
<li>What day did it have the most views, and how many views did it have?</li>
<li>How many views did it have per day?</li>
<li>How many views did it have per day on German Wikipedia?</li>
<li>Who’s the person named in the Panama Papers with the most views to their Wikipedia page?</li>
</ol>
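
For question 15, a sketch that buckets revision timestamps by day with <code>collections.Counter</code> (placeholder dates again; for a heavily edited article, follow <code>continue</code> as in the previous sketch):

<syntaxhighlight lang="python">
import requests
from collections import Counter

# Tally edits per day by bucketing revision timestamps on their
# "YYYY-MM-DD" prefix.
response = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "titles": "Panama Papers",
        "prop": "revisions",
        "rvprop": "timestamp",
        "rvdir": "newer",
        "rvstart": "2016-04-03T00:00:00Z",  # placeholder creation time
        "rvend": "2016-04-17T00:00:00Z",    # placeholder: two weeks later
        "rvlimit": 500,
        "format": "json",
    },
)
data = response.json()

edits_per_day = Counter()
for page in data["query"]["pages"].values():
    for rev in page.get("revisions", []):
        edits_per_day[rev["timestamp"][:10]] += 1

for day, count in sorted(edits_per_day.items()):
    print(day, count)
</syntaxhighlight>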


== Resources ==
* [https://en.wikipedia.org/w/api.php?action=help&modules=query API documentation for the query module]
* [https://www.mediawiki.org/wiki/API:Main_page Main MediaWiki API Documentation]
* [https://en.wikipedia.org/w/api.php Autogenerated API Documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample Wikipedia API queries]]
* [https://github.com/nettrom/wikipedia-session The session lecture notes (in Markdown) and Python sources.]
[[Category:Spring_2016 series]]
