Community Data Science Workshops (Core)/Day 3 Lecture

{{Template:CDSW Header}}
Welcome to the Saturday lecture section of the Community Data Science Workshop Session 3! For about 140 minutes, we'll work through an example of a Python program end-to-end that answers a simple question using data from the Wikipedia API, via both a lecture and hands-on exercises.
'''Resources''':
* [https://communitydata.science/~mako/cdsw-wi2020-lecture3-20200215.ogv  Screencast/recording of the Winter 2020 lecture] (859MB) — The file should be viewable in Firefox and many other browsers. If you have trouble playing it, you can download [https://www.videolan.org/vlc/index.html the VLC media player] which will be able to play it on Windows, OSX, or GNU/Linux.


== Material for the lecture ==
For the lecture, you will need two files. Download both of these to your computer by right-clicking (or control-clicking) each link and then using ''Save as'' or ''Save link as''. Keep track of where you put the files.


* https://github.com/CommunityDataScienceCollective/harrypotter-wikipedia-cdsw/archive/master.zip
* http://communitydata.science/~mako/hp_wiki.tsv


== Overview of the day ==
* Lecture
** Our philosophy around data analysis and visualization
** Introduce some new programming tools!
** We're going to walk through some analysis of edits to Harry Potter in Wikipedia, start to finish
** We'll focus on manipulating data in Python
** Visualizing things in Google Docs
* Lunch (Tofu Bánh mì!)
* Project based work
** Data.seattle.gov
** Your own projects!
** Review Cafe
** Extension of morning material
* Wrap-up!


== Lecture outline ==

'''Step 1: Pre-Requisites'''

* My philosophy about data analysis: use the tools you have
* Four things in Python I have to teach you now (and one more thing later):
** <code>while</code> loops
*** infinite loops
*** loops with a greater-than or less-than condition
** <code>break</code> / <code>continue</code>
** <code>"\t".join()</code>
** defining your own functions with <code>def foo(argument):</code> and <code>return bar</code>
** The <code>.update()</code> function that is associated with dictionaries.
* opening and writing to a file using <code>open()</code>
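The constructs above can be sketched together in one short example. Everything here — the values, the function name <code>describe</code>, and the output filename — is invented purely for illustration:

```python
# A quick sketch of the Step 1 constructs; all values and filenames
# are invented for illustration.

# while loop with a greater-than condition, plus break and continue
n = 5
printed = []
while n > 0:
    n = n - 1
    if n == 3:
        continue          # skip the rest of this pass through the loop
    if n == 1:
        break             # leave the loop early
    printed.append(n)     # collects 4, then 2

# "\t".join() glues a list of strings together with tabs between them
row = "\t".join(["Harry Potter", "2007-07-21", "minor"])

# defining your own function with def and return
def describe(num_edits):
    return "saw " + str(num_edits) + " edits"

# .update() merges one dictionary into another
counts = {"Harry Potter": 3}
counts.update({"Hermione Granger": 5})

# opening a file for writing with open(), then writing one line to it
with open("example_output.tsv", "w", encoding="utf-8") as f:
    f.write(row + "\n")
```

Note that a <code>while</code> loop whose condition never becomes false is an infinite loop — <code>break</code> is one way out of one.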


'''Step 2: Walking through a Program'''


* Walk-through of <code>build_harry_potter_dataset.ipynb</code>
* Look at dataset with Jupyter and/or in spreadsheet


'''Step 3: Loading Data Back In'''

* Load data into Python
** review of opening files
*** we can also open them for reading with <code>open('file', 'r', encoding="utf-8")</code>
** <code>csv.DictReader()</code>
* Basic counting: <code>harrypotter_anon_edits.ipynb</code>
** Answer question: ''What proportion of edits to Wikipedia Harry Potter articles are minor?''
*** Count the number of minor edits and calculate proportion
* Looking at time series data <code>harrypotter_edit_trend.ipynb</code>
** "Bin" data by day to generate the trend line
* Exporting and visualizing data
** Export dataset on edits over time
** Export dataset on articles over users
** Load data into Google Docs
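The load-and-count step can be sketched as follows. To keep the sketch self-contained it invents a tiny three-row TSV; the column names (<code>title</code>, <code>timestamp</code>, <code>minor</code>) are assumptions about the layout of the real <code>hp_wiki.tsv</code>, which you would open instead:

```python
import csv

# Invent a tiny TSV so this sketch is self-contained; with the real
# dataset you would open hp_wiki.tsv instead. The column names here
# are assumptions about that file's layout.
with open("hp_wiki_sample.tsv", "w", encoding="utf-8") as f:
    f.write("title\ttimestamp\tminor\n")
    f.write("Harry Potter\t2007-07-21T10:00:00Z\t1\n")
    f.write("Harry Potter\t2007-07-21T12:30:00Z\t0\n")
    f.write("Hermione Granger\t2007-07-22T09:15:00Z\t1\n")

# Load the data back in: csv.DictReader gives one dictionary per row,
# keyed by the column names from the header line
rows = []
with open("hp_wiki_sample.tsv", "r", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        rows.append(row)

# Basic counting: what proportion of the edits are minor?
minor_edits = 0
for row in rows:
    if row["minor"] == "1":
        minor_edits = minor_edits + 1

proportion = minor_edits / len(rows)
print(proportion)
```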
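The binning and exporting steps can be sketched the same way. The timestamps below are invented for illustration; the idea is to truncate each timestamp to its date so edits on the same day fall into the same "bin", then write a two-column TSV that can be imported into Google Docs/Sheets for charting:

```python
# "Bin" edits by day to build a trend line, then export a TSV for
# Google Docs/Sheets. The timestamps are invented for illustration.
timestamps = [
    "2007-07-21T10:00:00Z",
    "2007-07-21T12:30:00Z",
    "2007-07-22T09:15:00Z",
]

edits_by_day = {}
for ts in timestamps:
    day = ts[0:10]                      # keep just the YYYY-MM-DD part
    if day in edits_by_day:
        edits_by_day[day] = edits_by_day[day] + 1
    else:
        edits_by_day[day] = 1

# Export the binned counts as a two-column TSV
with open("hp_edits_by_day.tsv", "w", encoding="utf-8") as f:
    f.write("day\tedits\n")
    for day in sorted(edits_by_day):
        f.write(day + "\t" + str(edits_by_day[day]) + "\n")
```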
== Older Resources ==
* [https://communitydata.cc/~mako/cdsw-sp2016-lecture3-20160507.ogv Screencast/recording of the 2015 lecture] (934MB) — The file should be viewable in Firefox and many other browsers. If you have trouble playing it, you can download [https://www.videolan.org/vlc/index.html the VLC media player] which will be able to play it on Windows, OSX, or GNU/Linux.


[[Category:Shared_Content]]

Latest revision as of 21:32, 1 November 2022
