Community Data Science Workshops (Fall 2015)/Day 3 Lecture

Welcome to the Saturday lecture section of the Community Data Science Workshop Session 3! For about 140 minutes, we'll work through an example Python program, end to end, that answers a set of simple questions using data from the Wikipedia API, through both a lecture and hands-on exercises.

Resources

  • Lecture Recording/Screencast (http://communitydata.cc/~mako/cdsw-au2015-lecture3-20151107.ogv) — The file is in OGV/Theora format. If you have trouble playing it, you can install the free VLC media player (https://www.videolan.org/vlc/index.html), which runs on Mac OS X, Windows, and Linux and should be able to play the video. Keep in mind that there is a lot of "dead" time where folks are working on things and getting help from mentors.

Material for the lecture

For the lecture, you will need two files. Download both of them to your computer by right-clicking (or control-clicking) each link and choosing Save as or Save link as. Keep track of where you put the files.

Overview of the day

  • Lecture
    • Our philosophy around data visualization
    • Introduce some new programming tools!
    • We're going to walk through some analysis of edits to the Harry Potter articles in Wikipedia, start to finish
    • We'll focus on manipulating data in Python
    • Visualizing things in Google Docs
  • Lunch (vegetarian Greek!)
  • Project based work
  • Wrap-up!

Lecture outline

Step 1: Prerequisites

  • My philosophy about data analysis: use the tools you have
  • Four things in Python I have to teach you now (and one more thing later); see the sketch after this list:
    • while loops
      • infinite loops
      • loops with a greater-than or less-than condition
    • break / continue
    • "\t".join()
    • defining your own functions with def foo(argument): and return bar
    • the .update() method on dictionaries
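
A minimal sketch tying these constructs together; the vote-counting task and its data are invented purely for illustration:

def tally(votes):
    """Return a dictionary mapping each vote to how often it appears."""
    counts = {}
    i = 0
    while True:                  # an "infinite" loop...
        if i >= len(votes):      # ...with a greater-than-or-equal test...
            break                # ...that we exit with break
        vote = votes[i]
        i = i + 1
        if vote == "":
            continue             # skip blank entries and keep looping
        counts.update({vote: counts.get(vote, 0) + 1})
    return counts

counts = tally(["apple", "pear", "", "apple"])

# "\t".join() glues a list of strings together with tab characters,
# which is handy for writing TSV output.
for name, num in counts.items():
    print("\t".join([name, str(num)]))

Running this prints each name and its count as one tab-separated pair per line.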

Step 2: Walking through a Program

  • Walk-through of get_hpwp_dataset.py (a simplified sketch of the general approach follows below)
  • Look at the dataset with more and/or in a spreadsheet
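
We don't reproduce the workshop's get_hpwp_dataset.py here, but a script like it might take roughly this shape: ask the Wikipedia API for revision metadata, follow the API's continuation tokens, and write out a TSV. The article title, the output filename hp_wiki.tsv, and the column choices below are illustrative assumptions, not the workshop's actual code:

import requests

def get_revisions(title):
    """Return a list of revision dictionaries for one Wikipedia article."""
    url = "https://en.wikipedia.org/w/api.php"
    params = {"action": "query",
              "titles": title,
              "prop": "revisions",
              "rvprop": "timestamp|user|flags",
              "rvlimit": 500,
              "format": "json"}
    revisions = []
    while True:                              # loop until the API stops continuing
        response = requests.get(url, params=params).json()
        for page in response["query"]["pages"].values():
            revisions.extend(page.get("revisions", []))
        if "continue" not in response:
            break                            # no more pages of results
        params.update(response["continue"])  # dict.update() from Step 1
    return revisions

with open("hp_wiki.tsv", "w", encoding="utf-8") as output:
    output.write("\t".join(["title", "user", "timestamp", "minor"]) + "\n")
    for rev in get_revisions("Harry Potter"):
        minor = "1" if "minor" in rev else "0"
        output.write("\t".join(["Harry Potter", rev.get("user", ""),
                                rev["timestamp"], minor]) + "\n")

In the API's JSON output a minor edit shows up as a "minor" key on the revision, which is why the sketch tests for the key's presence.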

Step 3: Loading Data Back In (sketched after this outline)

  • Load data into Python
    • review of opening files
      • we can also open them for reading with open('file', 'r', encoding='utf-8')
    • csv.DictReader()
  • Basic counting: hpwp-minor.py
    • Answer a question: what proportion of edits to Wikipedia's Harry Potter articles are minor?
      • Count the number of minor edits and calculate proportion
  • Looking at time series data: hpwp-trend.py
    • "Bin" data by day to generate the trend line
  • Exporting and visualizing data
    • Export a dataset of edits over time
    • Export a dataset of articles over users
    • Load the data into Google Docs
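
To make the outline concrete, here is a rough sketch of the kind of code hpwp-minor.py and hpwp-trend.py might contain. The input filename and column names carry over from the fetching sketch in Step 2 and are assumptions, not the workshop's actual files:

import csv

# Load the TSV back in; csv.DictReader turns each row into a dictionary
# keyed by the column names in the header row.
with open("hp_wiki.tsv", "r", encoding="utf-8") as input_file:
    rows = list(csv.DictReader(input_file, delimiter="\t"))

# hpwp-minor.py-style question: what proportion of edits are minor?
num_minor = 0
for row in rows:
    if row["minor"] == "1":
        num_minor = num_minor + 1
print(num_minor / len(rows))

# hpwp-trend.py-style question: how many edits happened each day?
# Timestamps look like "2015-11-07T19:00:00Z", so the first ten
# characters identify the day; that is our "bin".
edits_by_day = {}
for row in rows:
    day = row["timestamp"][0:10]
    edits_by_day[day] = edits_by_day.get(day, 0) + 1

# Export the daily counts as a TSV ready to load into Google Docs.
with open("hp_edits_by_day.tsv", "w", encoding="utf-8") as output:
    output.write("day\tedits\n")
    for day in sorted(edits_by_day.keys()):
        output.write("\t".join([day, str(edits_by_day[day])]) + "\n")

The resulting hp_edits_by_day.tsv can then be imported into Google Docs to draw the trend line.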