Community Data Science Course (Spring 2016)/Wikipedia API projects

Revision as of 02:07, 20 April 2016 by Guyrt

Building a Dataset using the Wikipedia API

In this project, we will explore a few ways to gather data using the Wikipedia API. Once we've done that, we will extend this code to create our own datasets of Wikipedia edits or other data that we might be able to use to ask and answer questions in future sessions.

Goals

  • Get set up to build datasets with the Wikipedia API
  • Have fun collecting different types of data from Wikipedia
  • Practice reading and extending other people's code
  • Create a few collections of different types of data from Wikipedia that you can do research with in the final session

Download and test the Wikipedia project

  1. Right click the following file, click "Save Target as..." or "Save link as...", and save it to your Desktop directory: http://mako.cc/teaching/2015/community_data_science/wikipedia-data-examples.zip
  2. The ".zip" extension on the above file indicates that it is a compressed Zip archive. We need to "extract" its contents.
  3. Start up your terminal, navigate to the new wikipedia-data-examples directory you unpacked, and then test the code by running:
python wikipedia1-1.py
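If the script runs, you are ready to start querying the API yourself. As a minimal sketch of what a Wikipedia API request looks like, the example below builds a query URL for the MediaWiki API using only the Python standard library. The endpoint and parameter names are standard MediaWiki API conventions, but the specific query shown here (revision timestamps and usernames for the "Seattle" article) is an illustrative assumption, not necessarily what wikipedia1-1.py does.

```python
from urllib.parse import urlencode

# The English Wikipedia's MediaWiki API endpoint.
API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

# Parameters for a revision-history query. The page title and the
# fields requested here are an example, not the course script's query.
params = {
    "action": "query",           # use the core "query" module
    "titles": "Seattle",         # the page to look up
    "prop": "revisions",         # ask for revision (edit) data
    "rvprop": "timestamp|user",  # fields to return for each revision
    "format": "json",            # return machine-readable JSON
}

# urlencode() escapes the parameters (e.g. the "|" separator) so the
# result is a valid URL you can open in a browser or fetch in code.
url = API_ENDPOINT + "?" + urlencode(params)
print(url)

# To actually download the data, you could use the standard library:
#   import urllib.request, json
#   data = json.load(urllib.request.urlopen(url))
```

Pasting the printed URL into a browser is a good way to inspect the raw JSON that the API returns before writing code to parse it.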

Important Background

Resources