Community Data Science Workshops (Spring 2015)/Day 2 Projects/Wikipedia
Building a Dataset using the Wikipedia API
In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in the final session.
Goals
- Get set up to build datasets with the Wikipedia API
- Have fun collecting different types of data from Wikipedia
- Practice reading and extending other people's code
- Create a few collections of different types of data from Wikipedia that you can do research with in the final session
Download and test the Wikipedia project
If you are confused by these steps, go back and refresh your memory with the Day 0 setup and the Day 0 tutorial
(Estimated time: 10 minutes)
Example topics to cover in Lecture
- explain the MediaWiki API; note that the same API exists on other MediaWiki wikis
- navigate to the API page and show the documentation, pointing out the examples
- introduce the API Sandbox as a tool for building queries
- look at the images within a page: http://en.wikipedia.org/w/api.php?action=query&titles=Seattle&prop=images&imlimit=20&format=jsonfm
- change the city with a custom URL
- look up user edit counts: http://en.wikipedia.org/w/api.php?action=query&list=users&ususers=Benjamin_Mako_Hill%7CJtmorgan%7CSj%7CMindspillage&usprop=editcount&format=jsonfm
- get the content of the main page: http://en.wikipedia.org/w/api.php?format=json&action=query&titles=Main%20Page&prop=revisions&rvprop=content
- example programs: wikipedia-raw1-unicode-problems-example.py (an example of the Unicode problems that come up when running this on Windows), wikipedia-raw2-mudslide-edit.py
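The API queries above can also be made from Python rather than the browser. Below is a minimal sketch using only the standard library; the function names (`build_query_url`, `get_page_images`) are illustrative and not taken from the workshop's example programs:

```python
import json
import urllib.parse
import urllib.request

API_URL = "http://en.wikipedia.org/w/api.php"

def build_query_url(params):
    """Assemble a full API URL, like the example URLs above."""
    return API_URL + "?" + urllib.parse.urlencode(params)

def get_page_images(title, limit=20):
    """Fetch the titles of images used on a page via the query module."""
    url = build_query_url({
        "action": "query",
        "titles": title,
        "prop": "images",
        "imlimit": limit,
        "format": "json",   # json, not jsonfm: jsonfm is the pretty HTML view
    })
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    # The result is keyed by page ID, so walk the pages and collect titles.
    images = []
    for page in data["query"]["pages"].values():
        for image in page.get("images", []):
            images.append(image["title"])
    return images
```

Changing the `titles` parameter to another city is the Python equivalent of the "change the city with a custom URL" step above, e.g. `get_page_images("Portland, Oregon")`.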
Resources
- API documentation for the query module
- API Sandbox
- Sample API queries
- Example that saves command-line output into a text file:
python wikipedia-raw2-mudslide-edit.py > OsoRevisionData.txt
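Instead of redirecting output in the shell, a program can write its results to a file directly. A minimal sketch (the revision data here is a stand-in for whatever the API query returns):

```python
# Stand-in for data collected from the API, e.g. revision text.
revisions = ["revision 1 text", "revision 2 text"]

# Open the output file and write one revision per line.
with open("OsoRevisionData.txt", "w", encoding="utf-8") as output_file:
    for revision in revisions:
        output_file.write(revision + "\n")
```

Passing `encoding="utf-8"` explicitly also sidesteps the kind of Windows Unicode problems noted above, since shell redirection uses the console's default encoding.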