DS4UX (Spring 2016)/Day 5 lecture

From CommunityData
<div style="font-family:Rockwell,'Courier Bold',Courier,Georgia,'Times New Roman',Times,serif; min-width:10em;">
<div style="float:left; width:100%; margin-right:2%;">
{{Link/Graphic/Main/2
|highlight color= 27666b
|color=460c40
|link=
|image=
|text-align=left
|top font-size= 1.1em
|top color=FFF
|line color=FFF
|top text=This page is a work in progress.
|bottom font-size= 1em
|bottom color= FFF
|bottom text=
|line= none
}}</div></div>
<div style="clear:both;"></div>
<!--
*Database concepts (re-use slides)
*Introduction to Wikipedia data
*Introduction to MySQL and Quarry
*Querying with Socrata SOQL API
-->
[[File:Highfivekitten.jpeg|200px|thumb|In which you learn how to use Python and web APIs to meet the likes of her!]]


=== Introduction and context ===


You can manipulate data in Python now. Congratulations! Today we'll learn how to put your new skills to good use. You'll learn how to gather data from the internet, using APIs (Application Programming Interfaces) and JSON ("JavaScript Object Notation"). You can manipulate this data in Python to answer questions (just like you did with the BabyNames, Wordplay, and Seattle Traffic datasets), and then export your findings to a file for later use or additional analysis.




=== Outline ===


;Lecture 1 (Ray)
* What is an API?
* How do we use one to fetch interesting datasets?
* How do we write programs that use the internet?
;API exercise 1 (Jonathan)
* How can we use the placekitten API to fetch kitten pictures?
;Lecture 2 (Ray)
* Introduction to structured data (JSON)
* JSON and Python
;API exercise 2 (Jonathan)
* How do we build API queries to gather data about Wikipedia articles and editors?


== Lecture 1: What is an API? ==


* API: a structured way for programs to talk to each other (aka an interface for programs)
* Web APIs: like a website your programs can visit (you : a website :: your program : a web API)


=== Download the Week 5 lecture scripts and data ===


<big>'''[http://jtmorgan.net/ds4ux/week5/lecture.zip Click here to download the week 5 lecture scripts and data examples]'''</big>
 
# The ".zip" extension on the file above indicates that it is a compressed Zip archive; "extract" its contents before continuing.
# Start up your terminal and navigate to the newly extracted directory called <code>lecture</code>.
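If you would rather unpack the archive from Python than from your file browser, the standard-library <code>zipfile</code> module can do it. A minimal sketch, assuming you saved the download as <code>lecture.zip</code> in your current directory:

```python
import zipfile

def extract_archive(zip_path, destination):
    """Unpack a .zip archive into the given directory."""
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(destination)

# Assuming the download was saved as lecture.zip in the current directory:
# extract_archive("lecture.zip", ".")
```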
 
=== How do we use an API to fetch datasets? ===


Basic idea: your program sends a request, the API sends data back
* Where do you direct your request? The site's API endpoint.
* How do I write my request? Put together a URL; it will be different for different web APIs.
** Check the documentation, look for code samples
* How do you send a request?
** Python has modules you can use, like requests (they make HTTP requests)
* What do you get back?
** Structured data (usually in the JSON format)
* How do you understand (i.e. parse) the data?
** There's a module for that! (introduce requests)
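The question-and-answer list above can be sketched with the <code>requests</code> module. Here we describe a request to an endpoint (the Wikipedia API, with illustrative parameters) and use <code>prepare()</code> to inspect the final URL without touching the network; <code>requests.get()</code> would actually send it:

```python
import requests

# Describe a request: the endpoint plus a dictionary of query parameters.
req = requests.Request(
    "GET",
    "https://en.wikipedia.org/w/api.php",   # the site's API endpoint
    params={"action": "query", "titles": "Seattle", "format": "json"},
)

# prepare() assembles the full URL without sending anything.
url = req.prepare().url
print(url)

# To actually send the request and parse the structured (JSON) reply:
# response = requests.get("https://en.wikipedia.org/w/api.php", params=req.params)
# data = response.json()
```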
 
 


== Exercise 1: How do we use APIs to get kitten pictures? ==
[http://placekitten.com/ placekitten.com]
* API that takes specially crafted URLs and gives an appropriately sized picture of kittens
* Exploring placekitten in a browser:
** visit the API documentation
** kittens of different sizes
** kittens in greyscale or color
* Now we are ready to walk through a small program (<code>placekitten_input.py</code>) that grabs a random picture from placekitten.com of a user-specified height and width, and saves that image as a .jpg image on your computer.
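As a preview, here is a minimal sketch of a program in that spirit. The real <code>placekitten_input.py</code> in the lecture folder may differ in its details; the URL pattern below is placekitten's width-then-height scheme:

```python
import requests

def kitten_url(width, height):
    """Build a placekitten URL for an image of the given size."""
    return "http://placekitten.com/%s/%s" % (width, height)

def save_kitten(width, height, filename):
    """Fetch the picture and write the raw bytes out as a .jpg file."""
    response = requests.get(kitten_url(width, height))
    with open(filename, "wb") as f:   # "wb" because images are binary data
        f.write(response.content)

# width = input("Width? ")
# height = input("Height? ")
# save_kitten(width, height, "kitten.jpg")
```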
 
== Lecture 2: Using data from APIs in Python ==




=== How do we write Python programs that make web requests? ===
 
To use APIs to build a dataset we will need:
* all our tools from last session: variables, etc.
* the ability to open URLs on the web
* the ability to create custom URLs
* the ability to save to files
* the ability to understand (i.e., parse) the JSON data that APIs usually give us
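Those pieces combine into one recurring pattern: build the request, fetch, parse the JSON, save to a file. A sketch under the assumption that the API returns JSON (the commented-out example call uses illustrative parameters):

```python
import json
import requests

def fetch_json(endpoint, params):
    """Open a URL on the web and parse the JSON the API sends back."""
    response = requests.get(endpoint, params=params)
    return response.json()

def save_json(data, out_path):
    """Save parsed data to a file for later use or additional analysis."""
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(data, f)

# data = fetch_json("https://en.wikipedia.org/w/api.php",
#                   {"action": "query", "titles": "Seattle", "format": "json"})
# save_json(data, "seattle.json")
```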
 
=== Introduction to structured data (JSON, JavaScript Object Notation) ===


* what is JSON: useful for more structured data
* import json; json.load(), json.loads()
* like Python (except no single quotes)
* simple lists, dictionaries
* can reflect more complicated data structures
* You can parse data directly with <code>.json()</code> on a <code>requests</code> call
* You can parse JSON files with <code>json.load()</code>
* You can parse JSON ''strings'' with <code>json.loads()</code> (the 's' is for 'string')
* Example JSON file: <code>family.json</code>
* script to parse this file into a CSV: <code>family_json_to_csv.py</code>
* script to parse and print a ''JSON string'' version of this example file: <code>parse_json.py</code>
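The load/loads distinction in a runnable form (the sample record below is made up; <code>family.json</code> from the lecture folder parses the same way):

```python
import json

# A JSON *string* -- note the double quotes JSON requires.
text = '{"name": "Ada", "pets": ["cat", "dog"]}'

# json.loads() parses a string (the 's' is for 'string')...
person = json.loads(text)
print(person["name"])       # Ada
print(person["pets"][0])    # cat

# ...while json.load() (no 's') parses an open *file* object.
with open("person.json", "w", encoding="utf-8") as f:
    f.write(text)
with open("person.json", encoding="utf-8") as f:
    same_person = json.load(f)
print(same_person == person)   # True

# (Calling .json() on a requests response does this parse for you.)
```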


=== Using other APIs ===


* every API is different, so read the documentation!
* If the documentation isn't helpful, search online
* for popular APIs, there are Python modules that help you make requests and parse JSON

Possible issues:
* rate limiting
* authentication
* text encoding issues
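For rate limiting, the usual remedy is simply to pause between requests. A sketch with a hypothetical helper: <code>fetch</code> stands in for whatever function makes your actual API call, and the right delay depends on each API's documented limits:

```python
import time

def fetch_all(items, fetch, delay=1.0):
    """Call fetch(item) for every item, pausing between calls
    so we stay under an API's rate limit."""
    results = []
    for item in items:
        results.append(fetch(item))
        time.sleep(delay)
    return results
```

With a real API you would pass in a function that calls <code>requests.get</code>; the same loop is also a natural place to handle authentication errors when they come up.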


== Other Potential Resources ==
My friend Frances gave a version of this lecture last year and created slides. They are written for Python 2, so the code might not all work (remember: use <code>print()</code> with parentheses), but the basic ideas might be helpful:
* [http://mako.cc/teaching/2014/cdsw-autumn/lecture2-web_apis.pdf Slides (PDF)] — for viewing
* [http://mako.cc/teaching/2014/cdsw-autumn/lecture2-web_apis.odp Slides (ODP LibreOffice format)] — for editing and modification
<!-- FROM http://wiki.communitydata.cc/Community_Data_Science_Workshops_(Spring_2015)/Day_3_Lecture
== Material for the lecture ==
For the lecture, you will need two files. Download both of these to your computer by using right or control click on the link and then using ''Save as'' or ''Save link as''. Keep track of where you put the files.
* http://mako.cc/teaching/2015/cdsw-spring/harrypotter-wikipedia-cdsw.zip
* http://communitydata.cc/~mako/hp_wiki.tsv
== Overview of the day ==
* Lecture
** Our philosophy around data visualization
** Introduce some new programming tools!
** We're going to walk through some analysis of edits to Harry Potter in Wikipedia, start to finish
** We'll focus on manipulating data in Python
** Visualizing things in Google Docs
* Project based work
** More [[Harry Potter on Wikipedia]] project (or your own topic) on doing analysis using Google Docs
** [[Matplotlib]]
** Civic Data - More interactive working on projects
== Lecture outline ==
'''Step 1: Pre-Requisites'''
* My philosophy about data analysis: ''use the tools you have''
* Four things in Python I have to teach you:
** while loops
*** infinite loops
*** loops with a greater than or less than
** break / continue
** "\t".join()
** defining your own functions with <code>def foo(argument):</code>
'''Step 2: Walking through a Program'''
* Walk-through of <code>get_hpwp_dataset.py</code>
* Look at dataset with <code>more</code> and/or in spreadsheet


'''Step 3: Loading Data Back In'''
* Load data into Python
** review of opening files
*** we can also open them for reading with <code>open('file', 'r', encoding="utf-8")</code>
** csv.DictReader()
* Basic counting: <code>hpwp-minor.py</code>
** Answer question: ''What proportion of edits to Wikipedia Harry Potter articles are minor?''
*** Count the number of minor edits and calculate proportion
* Looking at time series data <code>hpwp-trend.py</code>
** "Bin" data by day to generate the trend line
* Exporting and visualizing data
** Export dataset on edits over time
** Export dataset on articles over users
** Load data into Google Docs
-->

== Week 5 project: How do we use APIs to get data from Wikipedia? ==
* Brief intro to Wikipedia
* Overview of what you can get from [[DS4UX_(Spring_2016)/Wikipedia_API|the Wikipedia API]]
* Example queries in the [https://en.wikipedia.org/wiki/Special:ApiSandbox Wikipedia API sandbox]
* Introduce [[DS4UX_(Spring_2016)/Day_5_coding_challenge|week 5 coding challenges]]
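To see the kind of URL the API sandbox builds, here is one query assembled in Python. The parameters use the MediaWiki API's <code>query</code> module to ask for recent revisions of one article; treat the specific parameter set as an illustration and try it in the sandbox first:

```python
import requests

params = {
    "action": "query",        # the MediaWiki "query" module
    "prop": "revisions",      # ask for revision (edit) information
    "titles": "Harry Potter",
    "rvlimit": 5,             # at most five revisions
    "format": "json",
}

# Assemble the full URL without sending it, so we can inspect it first.
url = requests.Request(
    "GET", "https://en.wikipedia.org/w/api.php", params=params
).prepare().url
print(url)

# To run the query for real and parse the reply:
# data = requests.get("https://en.wikipedia.org/w/api.php", params=params).json()
```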
[[Category:DS4UX (Spring 2016)]]

Latest revision as of 21:06, 25 April 2016
