[[File:Wikipedia.png|right|250px]]
__NOTOC__
In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we can use to ask and answer questions in the final session.
== Goals ==
* Get set up to build datasets with the Wikipedia API
* Have fun collecting different types of data from Wikipedia
* Practice reading and extending other people's code
* Create a few collections of different types of data from Wikipedia that you can do research with in the final session
== Download and test the Wikipedia API project ==
If you are confused by these steps, go back and refresh your memory with the [[Community Data Science Workshops (Fall 2015)/Day 0 setup and tutorial|Day 0 setup and tutorial]] and [[Community Data Science Workshops (Fall 2015)/Day 0 tutorial|Day 0 tutorial]].
(Estimated time: 10 minutes)
===Download the Wikipedia API project===
* Right-click the following link, click "Save Target as..." or "Save link as...", and save it to your Desktop directory:
https://github.com/nettrom/wikipedia-session/raw/master/wikipedia-data-examples.zip
* The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents. To do this, click on "Start", then "Computer" on Windows or open Finder and navigate to your Desktop directory if you are a Mac. Find <code>wikipedia-data-examples.zip</code> on your Desktop and double-click on it to "unzip" it. That will create a folder called <code>wikipedia-data-examples</code> containing several files.
===Test the Wikipedia API code===
<div style="background-color:#CEE7DA; width:80%; padding:1.2em;">
'''On Windows'''
Start up PowerShell and navigate to the <code>Desktop\wikipedia-data-examples</code> directory where the Wikipedia API code lives. For example, if the Wikipedia API project is at <code>C:\Users\'''YOURUSERNAME'''\Desktop\wikipedia-data-examples</code>, run:
 cd C:\Users\'''YOURUSERNAME'''\Desktop\wikipedia-data-examples
</div>
<div style="background-color:#D8E8FF; width:80%; padding:1.2em;">
'''On Mac'''
Open Terminal and navigate to the <code>Desktop/wikipedia-data-examples</code> directory where the Wikipedia API code lives. For example, if the Wikipedia API project is at <code>~/Desktop/wikipedia-data-examples</code>, run:
 cd ~/Desktop/wikipedia-data-examples
</div>
This will change you into the Wikipedia example code directory. Running <code>ls</code> will show you the source code files in that directory. One of the files is <code>wiki-query-1ab.py</code>, whose "<code>.py</code>" extension indicates that it is a Python script. Type:
 python wiki-query-1ab.py
at the command prompt to execute the <code>wiki-query-1ab.py</code> Python script. Wait a moment while your computer connects to Wikipedia. You should see the result of a query to the Wikipedia API on your screen. If you don't, let a staff member know.
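If the script runs, you're ready to build queries of your own. For orientation, here is a minimal sketch of the kind of request the example scripts make. This is an illustration rather than the exact contents of <code>wiki-query-1ab.py</code>, and it assumes the <code>requests</code> library is installed:
 import requests
 # Ask the Wikipedia API for basic information about the "Seattle"
 # article; the API answers in JSON, which .json() turns into
 # Python dictionaries and lists.
 response = requests.get("https://en.wikipedia.org/w/api.php",
                         params={"action": "query",
                                 "titles": "Seattle",
                                 "prop": "info",
                                 "format": "json"})
 print(response.json())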
== Example topics we might cover in the session ==
=== Main Wikipedia API ===
* explain the [http://www.mediawiki.org/wiki/API:Main_page MediaWiki API], which also exists on other wikis
* navigate to the [http://en.wikipedia.org/w/api.php API page], show the documentation, and point out examples
* introduce the [https://en.wikipedia.org/wiki/Special:ApiSandbox API sandbox] as a tool for building queries
* look at the images within a page: http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
* change the city with a custom URL
* get users' edit counts (see the Python sketch after this list): http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill|Jtmorgan|Sj|Koavf&usprop=editcount&format=jsonfm
* get the content of the main page: http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
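The queries above run in a browser, but the same requests can be made from a Python script. Here is a sketch of the edit-count query, assuming the <code>requests</code> library:
 import requests
 # The edit-count query from the list above: the pieces of the URL
 # after the "?" become entries in a Python dictionary.
 params = {"action": "query",
           "list": "users",
           "ususers": "Benjamin_Mako_Hill|Jtmorgan|Sj|Koavf",
           "usprop": "editcount",
           "format": "json"}
 response = requests.get("https://en.wikipedia.org/w/api.php", params=params)
 for user in response.json()["query"]["users"]:
     print(user["name"], user["editcount"])
Note that the script asks for <code>format=json</code> rather than <code>jsonfm</code>; <code>jsonfm</code> is the human-readable version meant for the browser.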
=== Page View API ===
* Use [https://wikimedia.org/api/rest_v1/?doc#!/Pageviews_data the experimental API]
* Explain that this API is a little different because it encodes requests in the URL path instead of query parameters.
* Also note that this API is case-sensitive.
* Request for Panama Papers (shown in Python below): https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420
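In a Python script, the same request looks like this. Again a sketch assuming the <code>requests</code> library; note that the article title, dates, and everything else live in the URL path itself:
 import requests
 # Daily page views for the "Panama Papers" article in April 2016.
 # The whole request is encoded in the URL path, and the article
 # title is case-sensitive.
 url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        "en.wikipedia/all-access/user/Panama_Papers/daily/20160401/20160420")
 response = requests.get(url)
 for item in response.json()["items"]:
     print(item["timestamp"], item["views"])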

== Important Background ==

== Questions to answer ==
Warm up questions:
# When was the article about the Panama Papers created?
# When was the most recent edit to the Panama Papers article?
# Think of one or two articles that interest you. Which ones were created first?
# Which have been edited most recently?

How many views did Panama_Papers have…
# the day it was created?
# the first week?
# How does this compare to the articles that interest you?

How many edits did it get in…
# the first 24 hours?
# the first week?
# How many edits did the articles that interest you get?
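One way to start on the first warm-up question is to ask the API for the article's oldest revision. This is a sketch of one possible approach, assuming the <code>requests</code> library:
 import requests
 # Fetch the oldest revision of the Panama Papers article:
 # rvdir=newer sorts revisions oldest-first, and rvlimit=1
 # keeps only the first one.
 params = {"action": "query",
           "titles": "Panama Papers",
           "prop": "revisions",
           "rvdir": "newer",
           "rvlimit": 1,
           "rvprop": "timestamp|user",
           "format": "json"}
 response = requests.get("https://en.wikipedia.org/w/api.php", params=params)
 for page in response.json()["query"]["pages"].values():
     print(page["title"], page["revisions"][0]["timestamp"])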

=== More difficult questions ===
# Who made the most edits to the article in total?
# What's the number of edits per day in the first two weeks of the article?
# What's the peak number of edits per hour of the article? When did it occur?
# Who were the top editors during that hour?
# What day did it have the most views, and how many views did it have?
# Of the people named in the Panama Papers, whose Wikipedia page got the most views?
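For the edits-per-day question, one possible approach is to fetch the revisions oldest-first and count them by day. A sketch assuming the <code>requests</code> library; a very active article may have more than 500 revisions, in which case you would need the API's <code>continue</code> mechanism to page through the rest:
 import requests
 from collections import Counter
 # Fetch up to 500 of the article's earliest revisions and count
 # how many fall on each calendar day.
 params = {"action": "query",
           "titles": "Panama Papers",
           "prop": "revisions",
           "rvdir": "newer",
           "rvlimit": 500,
           "rvprop": "timestamp",
           "format": "json"}
 response = requests.get("https://en.wikipedia.org/w/api.php", params=params)
 edits_per_day = Counter()
 for page in response.json()["query"]["pages"].values():
     for rev in page["revisions"]:
         edits_per_day[rev["timestamp"][:10]] += 1  # "YYYY-MM-DD"
 for day, count in sorted(edits_per_day.items()):
     print(day, count)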

== Resources ==
* [https://en.wikipedia.org/w/api.php?action=help&modules=query API documentation for the query module]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample Wikipedia API queries]]
* [https://github.com/ben-zen/wikipedia-session The session lecture notes (in Markdown) and Python sources]
* [[Sample Wikipedia API questions]]

[[Category:Spring_2016 series]]