DS4UX (Spring 2016)/Day 5 lecture

Introduction and context
You can manipulate data in Python now. Congratulations! Today we'll learn how to put your new skills to good use. You'll learn how to gather data from the internet, using APIs (Application Programming Interfaces) and JSON ("JavaScript Object Notation"). You can manipulate this data in Python to answer questions (just like you did with the BabyNames, Wordplay, and Seattle Traffic datasets), and then export your findings to a file for later use or additional analysis.

Outline

 * Lecture 1 (Ray)
 * What is an API?
 * How do we use one to fetch interesting datasets?
 * How do we write programs that use the internet?


 * API exercise 1 (Jonathan)
 * How can we use the placekitten API to fetch kitten pictures?


 * Lecture 2 (Ray)
 * Introduction to structured data (JSON)
 * JSON and Python


 * API exercise 2 (Jonathan)
 * How do we build API queries to gather data about Wikipedia articles and editors?

Lecture 1: What is an API?

 * API: a structured way for programs to talk to each other (aka an interface for programs)
 * Web APIs: like a website your programs can visit (you:a website::your program:a web API)

Download the Week 5 lecture scripts and data
Click here to download the week 5 lecture scripts and data examples


 * 1) The ".zip" extension on the file above indicates that it is a compressed Zip archive; we need to "extract" its contents.
 * 2) Start up your terminal and navigate to the new directory you have just unpacked.

How do we use an API to fetch datasets?
Basic idea: your program sends a request, the API sends data back
 * Where do you direct your request? The site's API endpoint.
 * For example: Wikipedia's web API endpoint is http://en.wikipedia.org/w/api.php
 * How do I write my request? Put together a URL; it will be different for different web APIs.
 * Check the documentation, look for code samples
 * How do you send a request?
 * Python has modules you can use, like requests (they make HTTP requests)
 * What do you get back?
 * Structured data (usually in the JSON format)
 * How do you understand (i.e. parse) the data?
 * There's a module for that! (introduce requests)
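The request/response cycle above can be sketched in a few lines. This is a minimal illustration: the endpoint is Wikipedia's real API endpoint from above, but the specific query parameters (a query for the "Seattle" article) are just an assumed example.

```python
import urllib.parse

# Wikipedia's web API endpoint, as mentioned above
ENDPOINT = "https://en.wikipedia.org/w/api.php"

# Parameters describing what we want back; the exact names come from
# the API's documentation (these are illustrative)
params = {"action": "query", "titles": "Seattle", "format": "json"}

# A web API request is just a URL: endpoint + "?" + encoded parameters
url = ENDPOINT + "?" + urllib.parse.urlencode(params)
print(url)
```

Sending the request is then a single call, e.g. `urllib.request.urlopen(url)` from the standard library, or `requests.get(ENDPOINT, params=params)` if the third-party requests module is installed.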

Exercise 1: How do we use APIs to get kitten pictures?
placekitten.com
 * API that takes specially crafted URLs and gives back appropriately sized pictures of kittens
 * Exploring placekitten in a browser:
 * visit the API documentation
 * kittens of different sizes
 * kittens in greyscale or color
 * Now we are ready to walk through a small program that grabs a random picture of a user-specified height and width from placekitten.com and saves it as a .jpg file on your computer.
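The heart of that program is building the right URL. A minimal sketch, assuming the URL patterns described on placekitten.com (`/width/height`, with a `/g/` prefix for greyscale):

```python
def kitten_url(width, height, greyscale=False):
    """Build a placekitten URL for a kitten image of the given size."""
    base = "http://placekitten.com"
    if greyscale:
        return "{}/g/{}/{}".format(base, width, height)
    return "{}/{}/{}".format(base, width, height)

print(kitten_url(200, 300))
print(kitten_url(200, 300, greyscale=True))

# Saving the image then takes one more call (needs a network connection):
# import urllib.request
# urllib.request.urlretrieve(kitten_url(200, 300), "kitten.jpg")
```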

How do we write Python programs that make web requests?
To use APIs to build a dataset we will need:
 * all our tools from last session: variables, functions, etc.
 * the ability to open URLs on the web
 * the ability to create custom URLs
 * the ability to save to files
 * the ability to understand (i.e., parse) JSON data that APIs usually give us

Introduction to structured data (JSON, JavaScript Object Notation)

 * what is JSON: a text format for exchanging structured data
 * import json; json.load, json.loads
 * looks like Python lists and dictionaries (except strings must use double quotes, not single quotes)
 * simple lists, dictionaries
 * can represent more complicated, nested data structures
 * You can parse data directly with .json() on a requests call
 * You can parse JSON files with json.load
 * You can parse JSON strings with json.loads (the 's' is for 'string')
 * Example JSON file:
 * script to parse this file into a CSV:
 * script to parse and print a JSON string version of this example file:
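To make the load/loads distinction concrete, here is a minimal sketch using a made-up JSON string (the field names and values are invented for illustration):

```python
import json

# A JSON string, like what an API might return (made-up example data)
text = '{"title": "Seattle", "views": [120, 98, 143], "editor": {"name": "jt", "edits": 7}}'

data = json.loads(text)  # parse a JSON *string* (the 's' is for 'string')

print(data["editor"]["name"])  # nested objects become nested Python dicts
print(sum(data["views"]))      # JSON arrays become Python lists

# For a JSON file on disk, use json.load (no 's') on an open file object:
# with open("example.json") as f:
#     data = json.load(f)
```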

Using other APIs

 * every API is different, so read the documentation!
 * If the documentation isn't helpful, search online
 * for popular APIs, there are Python modules that help you make requests and parse JSON

Possible issues:
 * rate limiting
 * authentication
 * text encoding issues
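The simplest defense against rate limiting is pausing between requests. A sketch of that idea (the function name and URLs are invented; the actual download step is omitted):

```python
import time

def polite_fetch(urls, delay=1.0):
    """Visit each URL with a pause in between, to respect rate limits.
    The real download (urllib.request.urlopen or requests.get) is
    omitted here so the sketch runs without a network connection."""
    fetched = []
    for url in urls:
        # ... download url here ...
        fetched.append(url)
        time.sleep(delay)  # be polite: wait before the next request
    return fetched

print(polite_fetch(["https://example.org/a", "https://example.org/b"], delay=0.01))
```

Authentication usually means adding an API key to your parameters or headers, and encoding issues usually mean remembering that raw responses are bytes that must be decoded (typically as UTF-8); check each API's documentation for specifics.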

Week 5 project: How do we use APIs to get data from Wikipedia?

 * Brief intro to Wikipedia
 * Overview of what you can get from the Wikipedia API
 * Example queries in the Wikipedia API sandbox
 * Introduce week 5 coding challenges
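As a preview of the kind of query you will build, here is a sketch of a Wikipedia API request URL. The parameter names (action, prop, titles, rvlimit, format) are real MediaWiki API parameters, but treat this particular query as an example to try in the API sandbox first:

```python
import urllib.parse

ENDPOINT = "https://en.wikipedia.org/w/api.php"

# Example query: ask for the last 5 revisions of the "Seattle" article
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Seattle",
    "rvlimit": 5,
    "format": "json",
}
url = ENDPOINT + "?" + urllib.parse.urlencode(params)
print(url)
```

Paste the printed URL into a browser (or the API sandbox) to inspect the JSON that comes back before writing code to parse it.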