DS4UX (Spring 2016)/Wikipedia API

[[File:Wikipedia.png|right|250px]]
__NOTOC__
 
== Getting data from Wikipedia using the MediaWiki API ==


In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in future sessions.
* Get set up to build datasets with the Wikipedia API
* Have fun collecting different types of data from Wikipedia
* Practice reading API documentation
* Practice testing API queries
* Practice reading and extending other people's code
 


=== Download and test the Wikipedia project ===


<big>'''[http://jtmorgan.net/ds4ux/week5/wikipedia-data-examples.zip Click here to download the Wikipedia API project]'''</big>


# The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents.  
# The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents.  
# Start up your terminal, navigate to the new directory you have unpacked from <code>wikipedia-data-examples.zip</code>, and then test the code by running:
 python wikipedia1-1.py


== What kind of data can you get from the MediaWiki API? ==
 
* As you probably already know, Wikipedia is a website that anyone can edit. 
* Wikipedia runs software called MediaWiki, which is also used by many other wikis.
* A wiki is a type of website, just as a "blog" is a type of website.
 
 
 
=== Wikipedia basics: edits, pages, and editors ===
 
Every time an editor (like you!) makes an edit (or ''revision'') to a page on Wikipedia, some information about that edit (called metadata) is saved, along with the text of the page after the edit was made. All of the edits ever made to any page on Wikipedia can be viewed in that page's ''revision history''. Click the 'view history' tab at the top-right of any Wikipedia article to see metadata about all of the edits made to that article, in reverse chronological order. You can also click links in this list to see what the page looked like after a given edit, to see exactly what was changed, and to find out more about the person who made the edit.
 
You can access all of this information (and much more) by ''querying'' the MediaWiki API. Go to the [http://en.wikipedia.org/w/api.php api page] on any wiki to see its autogenerated API documentation, or check out the [https://www.mediawiki.org/wiki/API:Main_page main documentation for the MediaWiki API] on the MediaWiki website.
 
In the sections below, I will describe some types of metadata that you can gather about edits, editors, and pages through the API.
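
All of the queries described below share the same basic shape: a request to a wiki's <code>api.php</code> with <code>action=query</code>, a few module-specific parameters, and <code>format=json</code>. Here is a minimal sketch of one such query in Python. It assumes you have the <code>requests</code> library installed; the example scripts you downloaded may build their requests differently.

 # A minimal example query against the English Wikipedia API.
 # Assumes the `requests` library is installed (pip install requests).
 import requests
 API_URL = "https://en.wikipedia.org/w/api.php"
 params = {
     "action": "query",     # read data from the wiki
     "titles": "Seattle",   # the page we are asking about
     "prop": "info",        # basic metadata (length, last edit, etc.)
     "format": "json",      # machine-readable output
 }
 response = requests.get(API_URL, params=params)
 data = response.json()
 print(data["query"]["pages"])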
 
=== Revision metadata ===
 
You can get metadata about a particular revision through the <code>rvprop=</code> parameter. Here are some of the things you can get ([https://www.mediawiki.org/wiki/API:Revisions click here for the full list]); a short example query follows the list below:
 
* ids:
:* revid: the unique ID of this revision of the page
:* parentid: the ID of the previous revision of the page
* timestamp: The date and time the revision was made
* user: The username of the editor who made the revision (or their IP address, if they aren't logged in)
* userid: A unique id of the editor
* comment: The edit comment the editor made when they committed the revision
* size: The size of the revision text in bytes.
* content: The revision content.
* tags: Any tags for this revision, such as whether the edit was made on a mobile device
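
For example, here is a short sketch of a revision query that uses several of the <code>rvprop</code> values above. It assumes the Python <code>requests</code> library; adjust the title and limit to taste.

 # Fetch the five most recent revisions of an article and print some metadata.
 import requests
 API_URL = "https://en.wikipedia.org/w/api.php"
 params = {
     "action": "query",
     "prop": "revisions",
     "titles": "Seattle",                          # any article title works
     "rvprop": "ids|timestamp|user|comment|size",  # fields described above
     "rvlimit": 5,                                 # five most recent revisions
     "format": "json",
 }
 data = requests.get(API_URL, params=params).json()
 # The "pages" object is keyed by page ID; each page has a list of revisions.
 for page in data["query"]["pages"].values():
     for rev in page["revisions"]:
         print(rev["revid"], rev["timestamp"], rev["user"], rev["size"])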
 
=== Editor metadata ===
 
* usercontribs: Get the list of contributions made by a user: https://www.mediawiki.org/wiki/API:Usercontribs
 
* users: Returns information about a list of users: https://www.mediawiki.org/wiki/API:Users
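
For instance, the <code>list=users</code> module can look up edit counts for several editors at once. A small sketch (again assuming <code>requests</code>; the usernames are the same ones used in the example queries later on this page):

 # Look up edit counts and registration dates for a few editors.
 import requests
 API_URL = "https://en.wikipedia.org/w/api.php"
 params = {
     "action": "query",
     "list": "users",
     "ususers": "Benjamin_Mako_Hill|Jtmorgan",   # usernames separated by "|"
     "usprop": "editcount|registration",
     "format": "json",
 }
 data = requests.get(API_URL, params=params).json()
 for user in data["query"]["users"]:
     print(user["name"], user.get("editcount"), user.get("registration"))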
 
=== Page text and metadata ===
* categories: categories a page is in: https://www.mediawiki.org/wiki/API:Categories#Parameters
* contributors: people who have edited a page: https://www.mediawiki.org/wiki/API:Contributors
* images: images (and audio + video) files on a page: https://www.mediawiki.org/wiki/API:Images
* links: the links on a page: https://www.mediawiki.org/wiki/API:Links
* categorymembers: all of the pages that are in a particular category: https://www.mediawiki.org/wiki/API:Categorymembers
* pageprops: basic information about the page: https://www.mediawiki.org/wiki/API:Pageprops
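
As one example from this list, <code>list=categorymembers</code> returns the pages in a given category. A rough sketch (assuming <code>requests</code>; swap in whatever category interests you):

 # List up to 20 pages in a category.
 import requests
 API_URL = "https://en.wikipedia.org/w/api.php"
 params = {
     "action": "query",
     "list": "categorymembers",
     "cmtitle": "Category:Seattle",   # the category to list
     "cmlimit": 20,                   # number of members per request
     "format": "json",
 }
 data = requests.get(API_URL, params=params).json()
 for member in data["query"]["categorymembers"]:
     print(member["title"])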


=== Examples of API queries ===
 
* looking at the images within a page (try this with another city by changing the value of <code>titles=</code>!): http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
:* change the city with a custom URL
* getting individual users' edit counts: http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill|Jtmorgan|Moonriddengirl|Keilana&usprop=editcount&format=jsonfm
* get the content of the main page http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
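
Each of the URLs above can also be run from a Python script: the pieces after the "?" become request parameters, and <code>format=json</code> (instead of <code>jsonfm</code>, which is just a pretty-printed version for your browser) makes the response easy to parse. Here is a rough sketch of the "images within a page" query, assuming the <code>requests</code> library:

 # The same query as the "images within a page" URL above, but from Python.
 import requests
 API_URL = "https://en.wikipedia.org/w/api.php"
 params = {
     "action": "query",
     "titles": "Seattle",   # change the city here
     "prop": "images",
     "imlimit": 20,
     "format": "json",      # jsonfm is for browsers; json is for programs
 }
 data = requests.get(API_URL, params=params).json()
 for page in data["query"]["pages"].values():
     for image in page.get("images", []):
         print(image["title"])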


There are several other interesting examples [[Sample_Wikipedia_API_queries|available on the Sample API queries page]].
 
== Resources ==
=== API ===
* [https://www.mediawiki.org/wiki/API:Main_page Main MediaWiki API Documentation]
* [https://en.wikipedia.org/w/api.php Autogenerated API Documentation]
* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample API queries]]
=== Research using Wikipedia data ===
* ''[https://www.research.ibm.com/visual/projects/history_flow/ HistoryFlow]'' — A colorful visualization of the development of Wikipedia articles over time.
* [http://www.brianckeegan.com/papers/CSCW_2015.pdf ‘Is’ to ‘Was’: Coordination and Commemoration on Posthumous Wikipedia Biographies] — an exploration of editing patterns around Wikipedia articles about people who have recently died.
* [http://www.brianckeegan.com/papers/ICS_2015.pdf WikiWorthy: Judging a Candidate’s Notability in the Community] — A study that uses the editing activity on Wikipedia articles about political candidates as a predictor of election success.
=== Websites that use the MediaWiki API ===
* [http://listen.hatnote.com/ Listen to Wikipedia] — a dynamic, audiovisual experience based on what is being edited on Wikipedia right now.
* [http://histography.io/ Histography] — an interactive timeline of world events based on Wikipedia articles.
* [https://www.google.com/intl/es419/insidesearch/features/search/knowledge.html Google's knowledge graph]


[[Category:DS4UX (Spring 2016)]]
