Community Data Science Course (Spring 2015)/Wikipedia API projects




Latest revision as of 02:30, 21 April 2015

[[File:Wikipedia.png]]

== Building a Dataset using the Wikipedia API ==

In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in future sessions.

=== Goals ===

* Get set up to build datasets with the Wikipedia API
* Have fun collecting different types of data from Wikipedia
* Practice reading and extending other people's code
* Create a few collections of different types of data from Wikipedia that you can do research with in the final section

Download and test the Wikipedia project[edit]

  1. Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory: http://mako.cc/teaching/2015/community_data_science/wikipedia-data-examples.zip
  2. The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents.
  3. Start up your terminal, navigate to the new directory you have unpacked called wikipedia-data-examples.zip, and then test the code by running:
python wikipedia1-1.py
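If you are curious what a script like this does under the hood, here is a minimal sketch of how a Wikipedia API query URL can be built in Python. The function name and the particular parameters are illustrative assumptions, not the actual contents of <code>wikipedia1-1.py</code>:

```python
import urllib.parse

# Sketch: build a MediaWiki API query URL. The parameter values below are
# illustrative examples; see http://en.wikipedia.org/w/api.php for the
# full list of modules and parameters.
def build_query_url(titles):
    params = {
        "action": "query",   # the basic "fetch data" module
        "titles": titles,    # page title(s) to look up
        "prop": "info",      # which page properties to return
        "format": "json",    # ask for machine-readable JSON output
    }
    return "http://en.wikipedia.org/w/api.php?" + urllib.parse.urlencode(params)

url = build_query_url("Seattle")
print(url)

# To actually fetch the data over the network, you could then call:
#   import urllib.request
#   raw = urllib.request.urlopen(url).read()
```

Building the URL with <code>urlencode</code> (rather than pasting strings together) takes care of escaping spaces and special characters in page titles.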

=== Important Background ===

* Wikipedia runs software called MediaWiki, which also powers many other wikis.
* You can go to the [http://en.wikipedia.org/w/api.php api page] on any wiki to show documentation, or check out the [https://www.mediawiki.org/wiki/API:Main_page main documentation for the MediaWiki API on the MediaWiki website].
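When you ask the API for <code>format=json</code>, responses come back as JSON text that Python can parse. Here is a minimal sketch of how to pick apart a <code>action=query</code> response; the sample below is abbreviated and hand-written for illustration, not real API output:

```python
import json

# An abbreviated, hand-written example of the JSON structure that api.php
# returns for action=query. Real responses contain many more fields.
sample = """
{
  "query": {
    "pages": {
      "5391": {
        "pageid": 5391,
        "title": "Seattle"
      }
    }
  }
}
"""

data = json.loads(sample)

# Pages are keyed by page ID, so loop over the values rather than
# assuming any particular ID in advance.
for page in data["query"]["pages"].values():
    print(page["title"])
```

Note that the page-ID keys (like <code>"5391"</code>) are strings in the parsed dictionary, which is why looping over <code>.values()</code> is safer than hard-coding a key.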

=== Resources ===

* [https://en.wikipedia.org/wiki/Special:ApiSandbox API Sandbox]
* [[Sample API queries]]
* Example that saves command-line output into a text file: <code>python wikipedia-raw2-mudslide-edit.py > OsoRevisionData.txt</code>
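The <code>&gt;</code> redirection trick above relies on your shell. You can also save data to a file from inside a Python program. A minimal sketch, where the filename and the list of lines are just stand-in examples:

```python
# Sketch: write collected data to a text file from inside Python,
# instead of redirecting stdout with ">" in the shell.
lines = ["revision 1", "revision 2"]  # stand-in for real API data

with open("OsoRevisionData.txt", "w") as f:
    for line in lines:
        f.write(line + "\n")
```

Writing from inside the program means the script produces the same file no matter how it is launched, which is handy once you start building larger datasets.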